Jul 7 00:16:42.775670 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jul 7 00:16:42.775693 kernel: Linux version 6.12.35-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Sun Jul 6 21:52:18 -00 2025
Jul 7 00:16:42.775703 kernel: KASLR enabled
Jul 7 00:16:42.775709 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Jul 7 00:16:42.775715 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Jul 7 00:16:42.775721 kernel: random: crng init done
Jul 7 00:16:42.775728 kernel: secureboot: Secure boot disabled
Jul 7 00:16:42.775734 kernel: ACPI: Early table checksum verification disabled
Jul 7 00:16:42.775740 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Jul 7 00:16:42.775747 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Jul 7 00:16:42.775755 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775761 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775766 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775773 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775780 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775788 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775794 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775801 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775808 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:16:42.775814 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Jul 7 00:16:42.775820 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jul 7 00:16:42.775827 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jul 7 00:16:42.775833 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Jul 7 00:16:42.775839 kernel: NODE_DATA(0) allocated [mem 0x13967ddc0-0x139684fff]
Jul 7 00:16:42.775845 kernel: Zone ranges:
Jul 7 00:16:42.775853 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jul 7 00:16:42.775860 kernel: DMA32 empty
Jul 7 00:16:42.775866 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Jul 7 00:16:42.775872 kernel: Device empty
Jul 7 00:16:42.775878 kernel: Movable zone start for each node
Jul 7 00:16:42.775884 kernel: Early memory node ranges
Jul 7 00:16:42.775891 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Jul 7 00:16:42.775897 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Jul 7 00:16:42.775903 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Jul 7 00:16:42.775909 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Jul 7 00:16:42.775916 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Jul 7 00:16:42.775922 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Jul 7 00:16:42.775928 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Jul 7 00:16:42.775936 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Jul 7 00:16:42.775943 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Jul 7 00:16:42.775952 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Jul 7 00:16:42.775959 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Jul 7 00:16:42.775966 kernel: psci: probing for conduit method from ACPI.
Jul 7 00:16:42.775974 kernel: psci: PSCIv1.1 detected in firmware.
Jul 7 00:16:42.775993 kernel: psci: Using standard PSCI v0.2 function IDs
Jul 7 00:16:42.776000 kernel: psci: Trusted OS migration not required
Jul 7 00:16:42.776006 kernel: psci: SMC Calling Convention v1.1
Jul 7 00:16:42.776013 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jul 7 00:16:42.776020 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jul 7 00:16:42.776027 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jul 7 00:16:42.776033 kernel: pcpu-alloc: [0] 0 [0] 1
Jul 7 00:16:42.776041 kernel: Detected PIPT I-cache on CPU0
Jul 7 00:16:42.776048 kernel: CPU features: detected: GIC system register CPU interface
Jul 7 00:16:42.776054 kernel: CPU features: detected: Spectre-v4
Jul 7 00:16:42.776063 kernel: CPU features: detected: Spectre-BHB
Jul 7 00:16:42.776070 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jul 7 00:16:42.776076 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jul 7 00:16:42.776083 kernel: CPU features: detected: ARM erratum 1418040
Jul 7 00:16:42.776090 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jul 7 00:16:42.776096 kernel: alternatives: applying boot alternatives
Jul 7 00:16:42.776104 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=dd2d39de40482a23e9bb75390ff5ca85cd9bd34d902b8049121a8373f8cb2ef2
Jul 7 00:16:42.776111 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 7 00:16:42.776118 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jul 7 00:16:42.776125 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 7 00:16:42.776133 kernel: Fallback order for Node 0: 0
Jul 7 00:16:42.776140 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Jul 7 00:16:42.776147 kernel: Policy zone: Normal
Jul 7 00:16:42.776153 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 7 00:16:42.776160 kernel: software IO TLB: area num 2.
Jul 7 00:16:42.776167 kernel: software IO TLB: mapped [mem 0x00000000f98d0000-0x00000000fd8d0000] (64MB)
Jul 7 00:16:42.776174 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jul 7 00:16:42.776181 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 7 00:16:42.776188 kernel: rcu: RCU event tracing is enabled.
Jul 7 00:16:42.776195 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jul 7 00:16:42.776203 kernel: Trampoline variant of Tasks RCU enabled.
Jul 7 00:16:42.776210 kernel: Tracing variant of Tasks RCU enabled.
Jul 7 00:16:42.776218 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 7 00:16:42.776224 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jul 7 00:16:42.776231 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 7 00:16:42.776238 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 7 00:16:42.776245 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jul 7 00:16:42.776252 kernel: GICv3: 256 SPIs implemented
Jul 7 00:16:42.776259 kernel: GICv3: 0 Extended SPIs implemented
Jul 7 00:16:42.776265 kernel: Root IRQ handler: gic_handle_irq
Jul 7 00:16:42.776272 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jul 7 00:16:42.776279 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jul 7 00:16:42.776285 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jul 7 00:16:42.776292 kernel: ITS [mem 0x08080000-0x0809ffff]
Jul 7 00:16:42.776302 kernel: ITS@0x0000000008080000: allocated 8192 Devices @101900000 (indirect, esz 8, psz 64K, shr 1)
Jul 7 00:16:42.776309 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @101910000 (flat, esz 8, psz 64K, shr 1)
Jul 7 00:16:42.776316 kernel: GICv3: using LPI property table @0x0000000101920000
Jul 7 00:16:42.776323 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000101930000
Jul 7 00:16:42.776329 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jul 7 00:16:42.776336 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jul 7 00:16:42.776343 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jul 7 00:16:42.776350 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jul 7 00:16:42.776356 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jul 7 00:16:42.776363 kernel: Console: colour dummy device 80x25
Jul 7 00:16:42.776371 kernel: ACPI: Core revision 20240827
Jul 7 00:16:42.776379 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jul 7 00:16:42.776386 kernel: pid_max: default: 32768 minimum: 301
Jul 7 00:16:42.776394 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 7 00:16:42.776401 kernel: landlock: Up and running.
Jul 7 00:16:42.776408 kernel: SELinux: Initializing.
Jul 7 00:16:42.776415 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jul 7 00:16:42.776422 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jul 7 00:16:42.776429 kernel: rcu: Hierarchical SRCU implementation.
Jul 7 00:16:42.776436 kernel: rcu: Max phase no-delay instances is 400.
Jul 7 00:16:42.776445 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jul 7 00:16:42.776453 kernel: Remapping and enabling EFI services.
Jul 7 00:16:42.776459 kernel: smp: Bringing up secondary CPUs ...
Jul 7 00:16:42.776466 kernel: Detected PIPT I-cache on CPU1
Jul 7 00:16:42.776474 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jul 7 00:16:42.776481 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000101940000
Jul 7 00:16:42.776487 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jul 7 00:16:42.776494 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jul 7 00:16:42.776524 kernel: smp: Brought up 1 node, 2 CPUs
Jul 7 00:16:42.776534 kernel: SMP: Total of 2 processors activated.
Jul 7 00:16:42.776546 kernel: CPU: All CPU(s) started at EL1
Jul 7 00:16:42.776554 kernel: CPU features: detected: 32-bit EL0 Support
Jul 7 00:16:42.776563 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jul 7 00:16:42.776570 kernel: CPU features: detected: Common not Private translations
Jul 7 00:16:42.776578 kernel: CPU features: detected: CRC32 instructions
Jul 7 00:16:42.776585 kernel: CPU features: detected: Enhanced Virtualization Traps
Jul 7 00:16:42.776592 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jul 7 00:16:42.776601 kernel: CPU features: detected: LSE atomic instructions
Jul 7 00:16:42.776608 kernel: CPU features: detected: Privileged Access Never
Jul 7 00:16:42.776615 kernel: CPU features: detected: RAS Extension Support
Jul 7 00:16:42.776623 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jul 7 00:16:42.776630 kernel: alternatives: applying system-wide alternatives
Jul 7 00:16:42.776637 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Jul 7 00:16:42.776645 kernel: Memory: 3875624K/4096000K available (11072K kernel code, 2428K rwdata, 9032K rodata, 39424K init, 1035K bss, 215284K reserved, 0K cma-reserved)
Jul 7 00:16:42.776652 kernel: devtmpfs: initialized
Jul 7 00:16:42.776660 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 7 00:16:42.776668 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jul 7 00:16:42.776676 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jul 7 00:16:42.776683 kernel: 0 pages in range for non-PLT usage
Jul 7 00:16:42.776690 kernel: 508480 pages in range for PLT usage
Jul 7 00:16:42.776697 kernel: pinctrl core: initialized pinctrl subsystem
Jul 7 00:16:42.776704 kernel: SMBIOS 3.0.0 present.
Jul 7 00:16:42.776712 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Jul 7 00:16:42.776719 kernel: DMI: Memory slots populated: 1/1
Jul 7 00:16:42.776726 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 7 00:16:42.776735 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Jul 7 00:16:42.776742 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jul 7 00:16:42.776750 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jul 7 00:16:42.776757 kernel: audit: initializing netlink subsys (disabled)
Jul 7 00:16:42.776765 kernel: audit: type=2000 audit(0.016:1): state=initialized audit_enabled=0 res=1
Jul 7 00:16:42.776772 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 7 00:16:42.776779 kernel: cpuidle: using governor menu
Jul 7 00:16:42.776786 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jul 7 00:16:42.776794 kernel: ASID allocator initialised with 32768 entries
Jul 7 00:16:42.776802 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 7 00:16:42.776810 kernel: Serial: AMBA PL011 UART driver
Jul 7 00:16:42.776817 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 7 00:16:42.776825 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jul 7 00:16:42.776832 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jul 7 00:16:42.776839 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jul 7 00:16:42.776846 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 7 00:16:42.776854 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jul 7 00:16:42.776861 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jul 7 00:16:42.776870 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jul 7 00:16:42.776877 kernel: ACPI: Added _OSI(Module Device)
Jul 7 00:16:42.776884 kernel: ACPI: Added _OSI(Processor Device)
Jul 7 00:16:42.776892 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 7 00:16:42.776899 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 7 00:16:42.776906 kernel: ACPI: Interpreter enabled
Jul 7 00:16:42.776913 kernel: ACPI: Using GIC for interrupt routing
Jul 7 00:16:42.776920 kernel: ACPI: MCFG table detected, 1 entries
Jul 7 00:16:42.776927 kernel: ACPI: CPU0 has been hot-added
Jul 7 00:16:42.776936 kernel: ACPI: CPU1 has been hot-added
Jul 7 00:16:42.776943 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jul 7 00:16:42.776951 kernel: printk: legacy console [ttyAMA0] enabled
Jul 7 00:16:42.776958 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jul 7 00:16:42.777140 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 7 00:16:42.777211 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jul 7 00:16:42.777273 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jul 7 00:16:42.777338 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jul 7 00:16:42.777397 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jul 7 00:16:42.777408 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jul 7 00:16:42.777417 kernel: PCI host bridge to bus 0000:00
Jul 7 00:16:42.777486 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jul 7 00:16:42.777566 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jul 7 00:16:42.777625 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jul 7 00:16:42.777686 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jul 7 00:16:42.777769 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jul 7 00:16:42.777842 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Jul 7 00:16:42.777907 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Jul 7 00:16:42.777971 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Jul 7 00:16:42.778060 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.778126 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Jul 7 00:16:42.778196 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 7 00:16:42.778261 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Jul 7 00:16:42.778328 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jul 7 00:16:42.778402 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.778467 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Jul 7 00:16:42.778561 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 7 00:16:42.778629 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Jul 7 00:16:42.778708 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.778774 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Jul 7 00:16:42.778839 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 7 00:16:42.778901 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Jul 7 00:16:42.778964 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jul 7 00:16:42.779072 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.779142 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Jul 7 00:16:42.779205 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 7 00:16:42.779269 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Jul 7 00:16:42.779331 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jul 7 00:16:42.779399 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.779462 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Jul 7 00:16:42.779617 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 7 00:16:42.779684 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jul 7 00:16:42.779750 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jul 7 00:16:42.779827 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.779889 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Jul 7 00:16:42.779950 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 7 00:16:42.780025 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Jul 7 00:16:42.780087 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jul 7 00:16:42.780159 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.780224 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Jul 7 00:16:42.780285 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 7 00:16:42.780346 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Jul 7 00:16:42.780407 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Jul 7 00:16:42.780475 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.780574 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Jul 7 00:16:42.780641 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 7 00:16:42.780702 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Jul 7 00:16:42.780772 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:16:42.780835 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Jul 7 00:16:42.780898 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 7 00:16:42.780958 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Jul 7 00:16:42.781065 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Jul 7 00:16:42.781137 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Jul 7 00:16:42.781212 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 7 00:16:42.781279 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Jul 7 00:16:42.781346 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jul 7 00:16:42.781409 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jul 7 00:16:42.781483 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jul 7 00:16:42.781577 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Jul 7 00:16:42.781655 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jul 7 00:16:42.781720 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Jul 7 00:16:42.781784 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jul 7 00:16:42.781856 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jul 7 00:16:42.781921 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jul 7 00:16:42.782010 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jul 7 00:16:42.782083 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]
Jul 7 00:16:42.782146 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jul 7 00:16:42.782220 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jul 7 00:16:42.782284 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Jul 7 00:16:42.782346 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jul 7 00:16:42.782417 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 7 00:16:42.782482 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Jul 7 00:16:42.783649 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Jul 7 00:16:42.783733 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jul 7 00:16:42.783802 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jul 7 00:16:42.783866 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jul 7 00:16:42.783930 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jul 7 00:16:42.784017 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jul 7 00:16:42.784086 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jul 7 00:16:42.784157 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jul 7 00:16:42.784224 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jul 7 00:16:42.784286 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jul 7 00:16:42.784347 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jul 7 00:16:42.784412 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jul 7 00:16:42.784475 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jul 7 00:16:42.784771 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jul 7 00:16:42.784852 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jul 7 00:16:42.784916 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jul 7 00:16:42.785032 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Jul 7 00:16:42.785117 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jul 7 00:16:42.785182 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jul 7 00:16:42.785243 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jul 7 00:16:42.785313 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jul 7 00:16:42.785375 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Jul 7 00:16:42.785436 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Jul 7 00:16:42.785515 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jul 7 00:16:42.785581 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jul 7 00:16:42.785644 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jul 7 00:16:42.785712 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jul 7 00:16:42.785777 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jul 7 00:16:42.785838 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jul 7 00:16:42.785901 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Jul 7 00:16:42.785961 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Jul 7 00:16:42.786036 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Jul 7 00:16:42.786099 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Jul 7 00:16:42.786166 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Jul 7 00:16:42.786230 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Jul 7 00:16:42.786294 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Jul 7 00:16:42.786356 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Jul 7 00:16:42.786418 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Jul 7 00:16:42.786482 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Jul 7 00:16:42.788687 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Jul 7 00:16:42.788772 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Jul 7 00:16:42.788841 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Jul 7 00:16:42.788909 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Jul 7 00:16:42.789013 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Jul 7 00:16:42.789091 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Jul 7 00:16:42.789163 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Jul 7 00:16:42.789227 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Jul 7 00:16:42.789300 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Jul 7 00:16:42.789365 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Jul 7 00:16:42.789429 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Jul 7 00:16:42.789498 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Jul 7 00:16:42.789593 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Jul 7 00:16:42.789656 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Jul 7 00:16:42.789722 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Jul 7 00:16:42.789788 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Jul 7 00:16:42.789852 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Jul 7 00:16:42.789924 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Jul 7 00:16:42.789999 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Jul 7 00:16:42.790066 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Jul 7 00:16:42.790129 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Jul 7 00:16:42.790191 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Jul 7 00:16:42.790253 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Jul 7 00:16:42.790317 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Jul 7 00:16:42.790378 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Jul 7 00:16:42.790439 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Jul 7 00:16:42.792137 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Jul 7 00:16:42.792262 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Jul 7 00:16:42.792334 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Jul 7 00:16:42.792407 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Jul 7 00:16:42.792473 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Jul 7 00:16:42.793604 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Jul 7 00:16:42.793685 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 7 00:16:42.793750 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jul 7 00:16:42.793814 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Jul 7 00:16:42.793886 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Jul 7 00:16:42.793963 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Jul 7 00:16:42.794075 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 7 00:16:42.794149 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jul 7 00:16:42.794211 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Jul 7 00:16:42.794283 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Jul 7 00:16:42.794355 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Jul 7 00:16:42.794419 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Jul 7 00:16:42.794484 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 7 00:16:42.796635 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jul 7 00:16:42.796721 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Jul 7 00:16:42.796784 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Jul 7 00:16:42.796858 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Jul 7 00:16:42.796924 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 7 00:16:42.797028 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jul 7 00:16:42.797100 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Jul 7 00:16:42.797163 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Jul 7 00:16:42.797238 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Jul 7 00:16:42.797304 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned
Jul 7 00:16:42.797373 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 7 00:16:42.797435 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jul 7 00:16:42.797497 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jul 7 00:16:42.797579 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Jul 7 00:16:42.797650 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Jul 7 00:16:42.797717 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Jul 7 00:16:42.797782 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 7 00:16:42.797858 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jul 7 00:16:42.797922 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Jul 7 00:16:42.797999 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jul 7 00:16:42.798071 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Jul 7 00:16:42.798140 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Jul 7 00:16:42.798207 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Jul 7 00:16:42.798274 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 7 00:16:42.798339 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jul 7 00:16:42.798401 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Jul 7 00:16:42.798463 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Jul 7 00:16:42.800617 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 7 00:16:42.800728 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jul 7 00:16:42.800792 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Jul 7 00:16:42.800856 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Jul 7 00:16:42.800923 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 7 00:16:42.801000 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Jul 7 00:16:42.801077 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Jul 7 00:16:42.801139 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Jul 7 00:16:42.801204 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Jul 7 00:16:42.801260 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Jul 7 00:16:42.801315 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Jul 7 00:16:42.801385 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jul 7 00:16:42.801449 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Jul 7 00:16:42.801530 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Jul 7 00:16:42.801600 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Jul 7 00:16:42.801657 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Jul 7 00:16:42.801713 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Jul 7 00:16:42.801787 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Jul 7 00:16:42.801844 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Jul 7 00:16:42.801902 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Jul 7 00:16:42.801967 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Jul 7 00:16:42.802067 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Jul 7 00:16:42.802127 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Jul 7 00:16:42.802202 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Jul 7 00:16:42.802262 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Jul 7 00:16:42.802318 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Jul 7 00:16:42.802387 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Jul 7 00:16:42.802444 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Jul 7 00:16:42.805558 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jul 7 00:16:42.805731 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Jul 7 00:16:42.805795 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Jul 7 00:16:42.805857 kernel: pci_bus 
0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jul 7 00:16:42.805930 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jul 7 00:16:42.806003 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jul 7 00:16:42.806063 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jul 7 00:16:42.806128 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jul 7 00:16:42.806185 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jul 7 00:16:42.806241 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jul 7 00:16:42.806252 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jul 7 00:16:42.806262 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jul 7 00:16:42.806270 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jul 7 00:16:42.806278 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jul 7 00:16:42.806286 kernel: iommu: Default domain type: Translated Jul 7 00:16:42.806294 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jul 7 00:16:42.806309 kernel: efivars: Registered efivars operations Jul 7 00:16:42.806321 kernel: vgaarb: loaded Jul 7 00:16:42.806331 kernel: clocksource: Switched to clocksource arch_sys_counter Jul 7 00:16:42.806339 kernel: VFS: Disk quotas dquot_6.6.0 Jul 7 00:16:42.806347 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 7 00:16:42.806357 kernel: pnp: PnP ACPI init Jul 7 00:16:42.806438 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jul 7 00:16:42.806450 kernel: pnp: PnP ACPI: found 1 devices Jul 7 00:16:42.806458 kernel: NET: Registered PF_INET protocol family Jul 7 00:16:42.806466 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 7 00:16:42.806474 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jul 7 00:16:42.806481 kernel: Table-perturb hash table entries: 65536 
(order: 6, 262144 bytes, linear) Jul 7 00:16:42.806489 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 7 00:16:42.806510 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jul 7 00:16:42.806518 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jul 7 00:16:42.807217 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 7 00:16:42.807227 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 7 00:16:42.807234 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 7 00:16:42.807353 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jul 7 00:16:42.807366 kernel: PCI: CLS 0 bytes, default 64 Jul 7 00:16:42.807374 kernel: kvm [1]: HYP mode not available Jul 7 00:16:42.807382 kernel: Initialise system trusted keyrings Jul 7 00:16:42.807395 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jul 7 00:16:42.807402 kernel: Key type asymmetric registered Jul 7 00:16:42.807410 kernel: Asymmetric key parser 'x509' registered Jul 7 00:16:42.807418 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jul 7 00:16:42.807426 kernel: io scheduler mq-deadline registered Jul 7 00:16:42.807434 kernel: io scheduler kyber registered Jul 7 00:16:42.807441 kernel: io scheduler bfq registered Jul 7 00:16:42.807450 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jul 7 00:16:42.807546 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jul 7 00:16:42.807617 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jul 7 00:16:42.807681 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.807747 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Jul 7 00:16:42.807811 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Jul 7 00:16:42.807874 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ 
PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.807940 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jul 7 00:16:42.808023 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jul 7 00:16:42.808089 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.808158 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jul 7 00:16:42.808224 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jul 7 00:16:42.808286 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.808352 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jul 7 00:16:42.808415 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jul 7 00:16:42.808477 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.809685 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Jul 7 00:16:42.809771 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jul 7 00:16:42.809836 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.809902 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jul 7 00:16:42.809966 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jul 7 00:16:42.810048 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.810116 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jul 7 00:16:42.810180 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jul 7 00:16:42.810242 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ 
HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.810256 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jul 7 00:16:42.810320 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jul 7 00:16:42.810383 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jul 7 00:16:42.810445 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 7 00:16:42.810456 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jul 7 00:16:42.810463 kernel: ACPI: button: Power Button [PWRB] Jul 7 00:16:42.810471 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jul 7 00:16:42.811625 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jul 7 00:16:42.811730 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jul 7 00:16:42.811742 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 7 00:16:42.811751 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jul 7 00:16:42.811817 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jul 7 00:16:42.811828 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jul 7 00:16:42.811836 kernel: thunder_xcv, ver 1.0 Jul 7 00:16:42.811844 kernel: thunder_bgx, ver 1.0 Jul 7 00:16:42.811852 kernel: nicpf, ver 1.0 Jul 7 00:16:42.811859 kernel: nicvf, ver 1.0 Jul 7 00:16:42.811937 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jul 7 00:16:42.812019 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-07-07T00:16:42 UTC (1751847402) Jul 7 00:16:42.812031 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 7 00:16:42.812039 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jul 7 00:16:42.812046 kernel: watchdog: NMI not fully supported Jul 7 00:16:42.812054 kernel: watchdog: Hard watchdog permanently disabled Jul 7 00:16:42.812062 kernel: NET: Registered PF_INET6 protocol family Jul 7 
00:16:42.812070 kernel: Segment Routing with IPv6 Jul 7 00:16:42.812080 kernel: In-situ OAM (IOAM) with IPv6 Jul 7 00:16:42.812088 kernel: NET: Registered PF_PACKET protocol family Jul 7 00:16:42.812095 kernel: Key type dns_resolver registered Jul 7 00:16:42.812103 kernel: registered taskstats version 1 Jul 7 00:16:42.812111 kernel: Loading compiled-in X.509 certificates Jul 7 00:16:42.812119 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.35-flatcar: 90fb300ebe1fa0773739bb35dad461c5679d8dfb' Jul 7 00:16:42.812127 kernel: Demotion targets for Node 0: null Jul 7 00:16:42.812135 kernel: Key type .fscrypt registered Jul 7 00:16:42.812142 kernel: Key type fscrypt-provisioning registered Jul 7 00:16:42.812151 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 7 00:16:42.812159 kernel: ima: Allocated hash algorithm: sha1 Jul 7 00:16:42.812167 kernel: ima: No architecture policies found Jul 7 00:16:42.812175 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jul 7 00:16:42.812183 kernel: clk: Disabling unused clocks Jul 7 00:16:42.812191 kernel: PM: genpd: Disabling unused power domains Jul 7 00:16:42.812199 kernel: Warning: unable to open an initial console. Jul 7 00:16:42.812207 kernel: Freeing unused kernel memory: 39424K Jul 7 00:16:42.812214 kernel: Run /init as init process Jul 7 00:16:42.812223 kernel: with arguments: Jul 7 00:16:42.812231 kernel: /init Jul 7 00:16:42.812239 kernel: with environment: Jul 7 00:16:42.812247 kernel: HOME=/ Jul 7 00:16:42.812254 kernel: TERM=linux Jul 7 00:16:42.812262 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 7 00:16:42.812270 systemd[1]: Successfully made /usr/ read-only. 
Jul 7 00:16:42.812281 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 7 00:16:42.812292 systemd[1]: Detected virtualization kvm. Jul 7 00:16:42.812300 systemd[1]: Detected architecture arm64. Jul 7 00:16:42.812309 systemd[1]: Running in initrd. Jul 7 00:16:42.812317 systemd[1]: No hostname configured, using default hostname. Jul 7 00:16:42.812326 systemd[1]: Hostname set to . Jul 7 00:16:42.812342 systemd[1]: Initializing machine ID from VM UUID. Jul 7 00:16:42.812350 systemd[1]: Queued start job for default target initrd.target. Jul 7 00:16:42.812359 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 7 00:16:42.812370 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 7 00:16:42.812378 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 7 00:16:42.812387 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 7 00:16:42.812395 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 7 00:16:42.812404 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 7 00:16:42.812414 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 7 00:16:42.812424 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 7 00:16:42.812432 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Jul 7 00:16:42.812441 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 7 00:16:42.812449 systemd[1]: Reached target paths.target - Path Units. Jul 7 00:16:42.812457 systemd[1]: Reached target slices.target - Slice Units. Jul 7 00:16:42.812465 systemd[1]: Reached target swap.target - Swaps. Jul 7 00:16:42.812474 systemd[1]: Reached target timers.target - Timer Units. Jul 7 00:16:42.812482 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 7 00:16:42.812491 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 7 00:16:42.813549 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 7 00:16:42.813570 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 7 00:16:42.813579 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 7 00:16:42.813588 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 7 00:16:42.813597 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 7 00:16:42.813605 systemd[1]: Reached target sockets.target - Socket Units. Jul 7 00:16:42.813614 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 7 00:16:42.813623 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 7 00:16:42.813632 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 7 00:16:42.813646 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 7 00:16:42.813655 systemd[1]: Starting systemd-fsck-usr.service... Jul 7 00:16:42.813663 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 7 00:16:42.813672 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Jul 7 00:16:42.813680 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:16:42.813722 systemd-journald[244]: Collecting audit messages is disabled. Jul 7 00:16:42.813746 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 7 00:16:42.813755 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 7 00:16:42.813765 systemd[1]: Finished systemd-fsck-usr.service. Jul 7 00:16:42.813774 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 7 00:16:42.813783 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 7 00:16:42.813792 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 7 00:16:42.813802 systemd-journald[244]: Journal started Jul 7 00:16:42.813821 systemd-journald[244]: Runtime Journal (/run/log/journal/4f990e9bbd754329ad87a5e17abc3d1b) is 8M, max 76.5M, 68.5M free. Jul 7 00:16:42.803300 systemd-modules-load[247]: Inserted module 'overlay' Jul 7 00:16:42.815960 systemd[1]: Started systemd-journald.service - Journal Service. Jul 7 00:16:42.821396 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 7 00:16:42.821451 kernel: Bridge firewalling registered Jul 7 00:16:42.821649 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:16:42.823198 systemd-modules-load[247]: Inserted module 'br_netfilter' Jul 7 00:16:42.824497 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 7 00:16:42.827859 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 7 00:16:42.830906 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Jul 7 00:16:42.837711 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 7 00:16:42.842956 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 7 00:16:42.848181 systemd-tmpfiles[265]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 7 00:16:42.855279 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 7 00:16:42.860106 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 7 00:16:42.862351 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 7 00:16:42.865384 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 7 00:16:42.867168 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 7 00:16:42.891810 dracut-cmdline[285]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=dd2d39de40482a23e9bb75390ff5ca85cd9bd34d902b8049121a8373f8cb2ef2 Jul 7 00:16:42.909701 systemd-resolved[283]: Positive Trust Anchors: Jul 7 00:16:42.909720 systemd-resolved[283]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 7 00:16:42.910273 systemd-resolved[283]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 7 00:16:42.915231 systemd-resolved[283]: Defaulting to hostname 'linux'. Jul 7 00:16:42.916249 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 7 00:16:42.917289 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 7 00:16:43.005561 kernel: SCSI subsystem initialized Jul 7 00:16:43.009563 kernel: Loading iSCSI transport class v2.0-870. Jul 7 00:16:43.017929 kernel: iscsi: registered transport (tcp) Jul 7 00:16:43.030607 kernel: iscsi: registered transport (qla4xxx) Jul 7 00:16:43.030713 kernel: QLogic iSCSI HBA Driver Jul 7 00:16:43.053197 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 7 00:16:43.078945 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 7 00:16:43.083392 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 7 00:16:43.139213 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 7 00:16:43.142596 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Jul 7 00:16:43.206542 kernel: raid6: neonx8 gen() 15704 MB/s Jul 7 00:16:43.223559 kernel: raid6: neonx4 gen() 15779 MB/s Jul 7 00:16:43.240545 kernel: raid6: neonx2 gen() 13174 MB/s Jul 7 00:16:43.257543 kernel: raid6: neonx1 gen() 10432 MB/s Jul 7 00:16:43.274566 kernel: raid6: int64x8 gen() 6867 MB/s Jul 7 00:16:43.291567 kernel: raid6: int64x4 gen() 7316 MB/s Jul 7 00:16:43.308558 kernel: raid6: int64x2 gen() 6082 MB/s Jul 7 00:16:43.325571 kernel: raid6: int64x1 gen() 5025 MB/s Jul 7 00:16:43.325657 kernel: raid6: using algorithm neonx4 gen() 15779 MB/s Jul 7 00:16:43.342581 kernel: raid6: .... xor() 12260 MB/s, rmw enabled Jul 7 00:16:43.342663 kernel: raid6: using neon recovery algorithm Jul 7 00:16:43.347821 kernel: xor: measuring software checksum speed Jul 7 00:16:43.347878 kernel: 8regs : 18808 MB/sec Jul 7 00:16:43.347919 kernel: 32regs : 21670 MB/sec Jul 7 00:16:43.347950 kernel: arm64_neon : 28041 MB/sec Jul 7 00:16:43.348577 kernel: xor: using function: arm64_neon (28041 MB/sec) Jul 7 00:16:43.401582 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 7 00:16:43.409687 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 7 00:16:43.413707 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 7 00:16:43.444828 systemd-udevd[493]: Using default interface naming scheme 'v255'. Jul 7 00:16:43.449165 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 7 00:16:43.452742 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 7 00:16:43.476687 dracut-pre-trigger[503]: rd.md=0: removing MD RAID activation Jul 7 00:16:43.505139 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 7 00:16:43.508376 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 7 00:16:43.562390 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Jul 7 00:16:43.565863 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 7 00:16:43.647533 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 7 00:16:43.658537 kernel: scsi host0: Virtio SCSI HBA Jul 7 00:16:43.668546 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 7 00:16:43.668632 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 7 00:16:43.678554 kernel: ACPI: bus type USB registered Jul 7 00:16:43.678604 kernel: usbcore: registered new interface driver usbfs Jul 7 00:16:43.680541 kernel: usbcore: registered new interface driver hub Jul 7 00:16:43.681550 kernel: usbcore: registered new device driver usb Jul 7 00:16:43.696708 kernel: sd 0:0:0:1: Power-on or device reset occurred Jul 7 00:16:43.696907 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 7 00:16:43.696419 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 7 00:16:43.698884 kernel: sd 0:0:0:1: [sda] Write Protect is off Jul 7 00:16:43.699226 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jul 7 00:16:43.699309 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 7 00:16:43.696554 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:16:43.700259 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:16:43.702889 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:16:43.712553 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 7 00:16:43.712656 kernel: GPT:17805311 != 80003071 Jul 7 00:16:43.712668 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 7 00:16:43.712678 kernel: GPT:17805311 != 80003071 Jul 7 00:16:43.712687 kernel: GPT: Use GNU Parted to correct GPT errors. 
Jul 7 00:16:43.712704 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:16:43.714167 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jul 7 00:16:43.720770 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 7 00:16:43.721008 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 7 00:16:43.722132 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 7 00:16:43.726397 kernel: sr 0:0:0:0: Power-on or device reset occurred Jul 7 00:16:43.726628 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 7 00:16:43.726741 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jul 7 00:16:43.726823 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 7 00:16:43.726901 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 7 00:16:43.727851 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 7 00:16:43.731514 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jul 7 00:16:43.732533 kernel: hub 1-0:1.0: USB hub found Jul 7 00:16:43.733554 kernel: hub 1-0:1.0: 4 ports detected Jul 7 00:16:43.734526 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 7 00:16:43.736831 kernel: hub 2-0:1.0: USB hub found Jul 7 00:16:43.737069 kernel: hub 2-0:1.0: 4 ports detected Jul 7 00:16:43.738776 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:16:43.801666 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 7 00:16:43.811081 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 7 00:16:43.821033 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jul 7 00:16:43.822586 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Jul 7 00:16:43.833701 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 7 00:16:43.840185 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 7 00:16:43.853307 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 7 00:16:43.855419 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 7 00:16:43.858788 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 7 00:16:43.859364 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 7 00:16:43.862880 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 7 00:16:43.866223 disk-uuid[602]: Primary Header is updated. Jul 7 00:16:43.866223 disk-uuid[602]: Secondary Entries is updated. Jul 7 00:16:43.866223 disk-uuid[602]: Secondary Header is updated. Jul 7 00:16:43.879698 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:16:43.885768 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jul 7 00:16:43.970563 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 7 00:16:44.105709 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jul 7 00:16:44.105781 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 7 00:16:44.106695 kernel: usbcore: registered new interface driver usbhid Jul 7 00:16:44.106722 kernel: usbhid: USB HID core driver Jul 7 00:16:44.207620 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jul 7 00:16:44.334560 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jul 7 00:16:44.387558 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jul 7 00:16:44.899806 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:16:44.902131 disk-uuid[604]: The operation has completed successfully. Jul 7 00:16:44.976230 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 7 00:16:44.976351 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 7 00:16:44.998122 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 7 00:16:45.012536 sh[627]: Success Jul 7 00:16:45.031797 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 7 00:16:45.031862 kernel: device-mapper: uevent: version 1.0.3 Jul 7 00:16:45.032522 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 7 00:16:45.043553 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jul 7 00:16:45.098552 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 7 00:16:45.102087 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Jul 7 00:16:45.121432 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 7 00:16:45.132548 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 7 00:16:45.134537 kernel: BTRFS: device fsid aa7ffdf7-f152-4ceb-bd0e-b3b3f8f8b296 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (639) Jul 7 00:16:45.135738 kernel: BTRFS info (device dm-0): first mount of filesystem aa7ffdf7-f152-4ceb-bd0e-b3b3f8f8b296 Jul 7 00:16:45.135791 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jul 7 00:16:45.135813 kernel: BTRFS info (device dm-0): using free-space-tree Jul 7 00:16:45.145470 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 7 00:16:45.148001 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 7 00:16:45.149606 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 7 00:16:45.150784 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 7 00:16:45.154722 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 7 00:16:45.188576 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (671) Jul 7 00:16:45.190766 kernel: BTRFS info (device sda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 7 00:16:45.190825 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 7 00:16:45.190839 kernel: BTRFS info (device sda6): using free-space-tree Jul 7 00:16:45.202568 kernel: BTRFS info (device sda6): last unmount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 7 00:16:45.203122 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 7 00:16:45.209685 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Jul 7 00:16:45.334227 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 7 00:16:45.339135 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 7 00:16:45.361473 ignition[719]: Ignition 2.21.0 Jul 7 00:16:45.362074 ignition[719]: Stage: fetch-offline Jul 7 00:16:45.362122 ignition[719]: no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:45.362130 ignition[719]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:45.362680 ignition[719]: parsed url from cmdline: "" Jul 7 00:16:45.362688 ignition[719]: no config URL provided Jul 7 00:16:45.362703 ignition[719]: reading system config file "/usr/lib/ignition/user.ign" Jul 7 00:16:45.362712 ignition[719]: no config at "/usr/lib/ignition/user.ign" Jul 7 00:16:45.362717 ignition[719]: failed to fetch config: resource requires networking Jul 7 00:16:45.362906 ignition[719]: Ignition finished successfully Jul 7 00:16:45.367634 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jul 7 00:16:45.376757 systemd-networkd[814]: lo: Link UP Jul 7 00:16:45.376772 systemd-networkd[814]: lo: Gained carrier Jul 7 00:16:45.378832 systemd-networkd[814]: Enumeration completed Jul 7 00:16:45.378944 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 7 00:16:45.379595 systemd[1]: Reached target network.target - Network. Jul 7 00:16:45.381027 systemd-networkd[814]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:45.381031 systemd-networkd[814]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:16:45.381418 systemd-networkd[814]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:45.381422 systemd-networkd[814]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jul 7 00:16:45.381736 systemd-networkd[814]: eth0: Link UP Jul 7 00:16:45.381739 systemd-networkd[814]: eth0: Gained carrier Jul 7 00:16:45.381746 systemd-networkd[814]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:45.383671 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jul 7 00:16:45.385912 systemd-networkd[814]: eth1: Link UP Jul 7 00:16:45.385916 systemd-networkd[814]: eth1: Gained carrier Jul 7 00:16:45.385929 systemd-networkd[814]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:45.412653 systemd-networkd[814]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 7 00:16:45.419302 ignition[818]: Ignition 2.21.0 Jul 7 00:16:45.419319 ignition[818]: Stage: fetch Jul 7 00:16:45.419476 ignition[818]: no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:45.419486 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:45.419595 ignition[818]: parsed url from cmdline: "" Jul 7 00:16:45.419599 ignition[818]: no config URL provided Jul 7 00:16:45.419603 ignition[818]: reading system config file "/usr/lib/ignition/user.ign" Jul 7 00:16:45.419611 ignition[818]: no config at "/usr/lib/ignition/user.ign" Jul 7 00:16:45.419888 ignition[818]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jul 7 00:16:45.423234 ignition[818]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Jul 7 00:16:45.435666 systemd-networkd[814]: eth0: DHCPv4 address 91.107.203.174/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 7 00:16:45.623443 ignition[818]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Jul 7 00:16:45.629769 ignition[818]: GET result: OK Jul 7 00:16:45.630055 ignition[818]: parsing config with SHA512: 
cfcf58f5ffec1d04e45626b85d07c65d1936b12768ba3f84803b83ff8aaee92698322fd0f45a8175123c1ab89f57f12ebebb501720c95c8fb7fa4d5e9c2eb14c Jul 7 00:16:45.636910 unknown[818]: fetched base config from "system" Jul 7 00:16:45.636930 unknown[818]: fetched base config from "system" Jul 7 00:16:45.637275 ignition[818]: fetch: fetch complete Jul 7 00:16:45.636935 unknown[818]: fetched user config from "hetzner" Jul 7 00:16:45.637280 ignition[818]: fetch: fetch passed Jul 7 00:16:45.637328 ignition[818]: Ignition finished successfully Jul 7 00:16:45.641315 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jul 7 00:16:45.644676 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 7 00:16:45.690871 ignition[825]: Ignition 2.21.0 Jul 7 00:16:45.690884 ignition[825]: Stage: kargs Jul 7 00:16:45.691095 ignition[825]: no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:45.691106 ignition[825]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:45.695728 ignition[825]: kargs: kargs passed Jul 7 00:16:45.695802 ignition[825]: Ignition finished successfully Jul 7 00:16:45.698410 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 7 00:16:45.702154 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jul 7 00:16:45.724197 ignition[831]: Ignition 2.21.0 Jul 7 00:16:45.724217 ignition[831]: Stage: disks Jul 7 00:16:45.724450 ignition[831]: no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:45.724468 ignition[831]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:45.727387 ignition[831]: disks: disks passed Jul 7 00:16:45.731317 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jul 7 00:16:45.729303 ignition[831]: Ignition finished successfully Jul 7 00:16:45.732706 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 7 00:16:45.733295 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Jul 7 00:16:45.733946 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 7 00:16:45.735112 systemd[1]: Reached target sysinit.target - System Initialization. Jul 7 00:16:45.735954 systemd[1]: Reached target basic.target - Basic System. Jul 7 00:16:45.738096 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 7 00:16:45.772640 systemd-fsck[839]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jul 7 00:16:45.777032 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 7 00:16:45.781525 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 7 00:16:45.869549 kernel: EXT4-fs (sda9): mounted filesystem a6b10247-fbe6-4a25-95d9-ddd4b58604ec r/w with ordered data mode. Quota mode: none. Jul 7 00:16:45.871185 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 7 00:16:45.872936 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 7 00:16:45.875606 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 7 00:16:45.877663 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jul 7 00:16:45.892810 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jul 7 00:16:45.897884 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 7 00:16:45.899648 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 7 00:16:45.902847 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Jul 7 00:16:45.904058 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (847) Jul 7 00:16:45.904089 kernel: BTRFS info (device sda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 7 00:16:45.904142 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 7 00:16:45.904154 kernel: BTRFS info (device sda6): using free-space-tree Jul 7 00:16:45.907091 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 7 00:16:45.918833 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 7 00:16:45.971997 initrd-setup-root[874]: cut: /sysroot/etc/passwd: No such file or directory Jul 7 00:16:45.976973 coreos-metadata[849]: Jul 07 00:16:45.976 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jul 7 00:16:45.978866 coreos-metadata[849]: Jul 07 00:16:45.978 INFO Fetch successful Jul 7 00:16:45.978866 coreos-metadata[849]: Jul 07 00:16:45.978 INFO wrote hostname ci-4344-1-1-1-1232b7205a to /sysroot/etc/hostname Jul 7 00:16:45.981425 initrd-setup-root[881]: cut: /sysroot/etc/group: No such file or directory Jul 7 00:16:45.984532 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 7 00:16:45.986792 initrd-setup-root[889]: cut: /sysroot/etc/shadow: No such file or directory Jul 7 00:16:45.991667 initrd-setup-root[896]: cut: /sysroot/etc/gshadow: No such file or directory Jul 7 00:16:46.099302 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jul 7 00:16:46.100930 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jul 7 00:16:46.102685 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 7 00:16:46.117617 kernel: BTRFS info (device sda6): last unmount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 7 00:16:46.133249 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 7 00:16:46.142563 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jul 7 00:16:46.150047 ignition[964]: INFO : Ignition 2.21.0 Jul 7 00:16:46.150047 ignition[964]: INFO : Stage: mount Jul 7 00:16:46.152694 ignition[964]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:46.152694 ignition[964]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:46.152694 ignition[964]: INFO : mount: mount passed Jul 7 00:16:46.152694 ignition[964]: INFO : Ignition finished successfully Jul 7 00:16:46.154319 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 7 00:16:46.157952 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 7 00:16:46.183420 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 7 00:16:46.211563 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (976) Jul 7 00:16:46.213711 kernel: BTRFS info (device sda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 7 00:16:46.213776 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 7 00:16:46.213803 kernel: BTRFS info (device sda6): using free-space-tree Jul 7 00:16:46.221334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jul 7 00:16:46.254527 ignition[993]: INFO : Ignition 2.21.0 Jul 7 00:16:46.254527 ignition[993]: INFO : Stage: files Jul 7 00:16:46.255807 ignition[993]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:46.255807 ignition[993]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:46.255807 ignition[993]: DEBUG : files: compiled without relabeling support, skipping Jul 7 00:16:46.260077 ignition[993]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 7 00:16:46.260077 ignition[993]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 7 00:16:46.260077 ignition[993]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 7 00:16:46.260077 ignition[993]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 7 00:16:46.264008 ignition[993]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 7 00:16:46.264008 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Jul 7 00:16:46.264008 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Jul 7 00:16:46.260859 unknown[993]: wrote ssh authorized keys file for user: core Jul 7 00:16:46.372936 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 7 00:16:46.528567 
ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 7 00:16:46.528567 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 7 00:16:46.539317 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 7 00:16:46.539317 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 7 00:16:46.539317 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 7 00:16:46.539317 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 7 00:16:46.539317 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 7 00:16:46.539317 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt 
#1 Jul 7 00:16:46.598780 systemd-networkd[814]: eth1: Gained IPv6LL Jul 7 00:16:46.726799 systemd-networkd[814]: eth0: Gained IPv6LL Jul 7 00:16:47.193039 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 7 00:16:47.414642 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 7 00:16:47.414642 ignition[993]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jul 7 00:16:47.417881 ignition[993]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 7 00:16:47.422242 ignition[993]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 7 00:16:47.422242 ignition[993]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jul 7 00:16:47.422242 ignition[993]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jul 7 00:16:47.422242 ignition[993]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 7 00:16:47.429918 ignition[993]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 7 00:16:47.429918 ignition[993]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jul 7 00:16:47.429918 ignition[993]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jul 7 00:16:47.429918 ignition[993]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jul 7 00:16:47.429918 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [started] writing file 
"/sysroot/etc/.ignition-result.json" Jul 7 00:16:47.429918 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 7 00:16:47.429918 ignition[993]: INFO : files: files passed Jul 7 00:16:47.429918 ignition[993]: INFO : Ignition finished successfully Jul 7 00:16:47.430057 systemd[1]: Finished ignition-files.service - Ignition (files). Jul 7 00:16:47.434735 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 7 00:16:47.439556 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 7 00:16:47.458349 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 7 00:16:47.458477 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 7 00:16:47.465590 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 7 00:16:47.465590 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 7 00:16:47.468196 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 7 00:16:47.470263 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 7 00:16:47.471211 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 7 00:16:47.473348 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 7 00:16:47.532689 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 7 00:16:47.532873 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 7 00:16:47.535408 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jul 7 00:16:47.536852 systemd[1]: Reached target initrd.target - Initrd Default Target. 
Jul 7 00:16:47.538483 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 7 00:16:47.539485 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 7 00:16:47.583579 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 7 00:16:47.587806 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 7 00:16:47.610546 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 7 00:16:47.612042 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 7 00:16:47.612765 systemd[1]: Stopped target timers.target - Timer Units. Jul 7 00:16:47.613758 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 7 00:16:47.613936 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 7 00:16:47.615752 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 7 00:16:47.616603 systemd[1]: Stopped target basic.target - Basic System. Jul 7 00:16:47.617594 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 7 00:16:47.618556 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jul 7 00:16:47.619541 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 7 00:16:47.620585 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jul 7 00:16:47.621541 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 7 00:16:47.622448 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 7 00:16:47.623457 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 7 00:16:47.624409 systemd[1]: Stopped target local-fs.target - Local File Systems. Jul 7 00:16:47.625274 systemd[1]: Stopped target swap.target - Swaps. 
Jul 7 00:16:47.626023 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 7 00:16:47.626210 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 7 00:16:47.627298 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 7 00:16:47.628362 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 7 00:16:47.629278 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 7 00:16:47.629386 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 7 00:16:47.630433 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 7 00:16:47.630623 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 7 00:16:47.631930 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 7 00:16:47.632108 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 7 00:16:47.633106 systemd[1]: ignition-files.service: Deactivated successfully. Jul 7 00:16:47.633254 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 7 00:16:47.634016 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jul 7 00:16:47.634159 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 7 00:16:47.637736 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jul 7 00:16:47.638266 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 7 00:16:47.638453 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 7 00:16:47.642700 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 7 00:16:47.643199 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 7 00:16:47.643368 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jul 7 00:16:47.644835 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Jul 7 00:16:47.644986 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jul 7 00:16:47.651312 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 7 00:16:47.652453 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 7 00:16:47.663534 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 7 00:16:47.673538 ignition[1046]: INFO : Ignition 2.21.0 Jul 7 00:16:47.675363 ignition[1046]: INFO : Stage: umount Jul 7 00:16:47.675363 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 7 00:16:47.675363 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:16:47.675363 ignition[1046]: INFO : umount: umount passed Jul 7 00:16:47.675363 ignition[1046]: INFO : Ignition finished successfully Jul 7 00:16:47.678349 systemd[1]: ignition-mount.service: Deactivated successfully. Jul 7 00:16:47.678822 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 7 00:16:47.684860 systemd[1]: ignition-disks.service: Deactivated successfully. Jul 7 00:16:47.684927 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 7 00:16:47.685616 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 7 00:16:47.685659 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 7 00:16:47.686686 systemd[1]: ignition-fetch.service: Deactivated successfully. Jul 7 00:16:47.686723 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jul 7 00:16:47.691646 systemd[1]: Stopped target network.target - Network. Jul 7 00:16:47.692777 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 7 00:16:47.692879 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 7 00:16:47.694104 systemd[1]: Stopped target paths.target - Path Units. Jul 7 00:16:47.695651 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
Jul 7 00:16:47.695736 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 7 00:16:47.697075 systemd[1]: Stopped target slices.target - Slice Units. Jul 7 00:16:47.698016 systemd[1]: Stopped target sockets.target - Socket Units. Jul 7 00:16:47.699008 systemd[1]: iscsid.socket: Deactivated successfully. Jul 7 00:16:47.699050 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 7 00:16:47.700902 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 7 00:16:47.700938 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 7 00:16:47.707624 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 7 00:16:47.707695 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 7 00:16:47.708880 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 7 00:16:47.708925 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 7 00:16:47.709831 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 7 00:16:47.711691 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jul 7 00:16:47.713519 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 7 00:16:47.713755 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 7 00:16:47.715137 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 7 00:16:47.715231 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 7 00:16:47.720555 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 7 00:16:47.720666 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 7 00:16:47.724927 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 7 00:16:47.725220 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 7 00:16:47.726311 systemd[1]: Stopped systemd-networkd.service - Network Configuration. 
Jul 7 00:16:47.728555 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jul 7 00:16:47.729184 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 7 00:16:47.729987 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jul 7 00:16:47.730043 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jul 7 00:16:47.732154 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 7 00:16:47.734659 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jul 7 00:16:47.734741 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 7 00:16:47.735844 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 7 00:16:47.735885 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 7 00:16:47.737145 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 7 00:16:47.737184 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 7 00:16:47.737764 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 7 00:16:47.737800 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 7 00:16:47.739040 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 7 00:16:47.743476 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 7 00:16:47.743578 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 7 00:16:47.774053 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 7 00:16:47.774328 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 7 00:16:47.776166 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 7 00:16:47.776233 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Jul 7 00:16:47.777630 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jul 7 00:16:47.777771 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 7 00:16:47.779759 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 7 00:16:47.779821 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 7 00:16:47.780913 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 7 00:16:47.780968 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 7 00:16:47.782405 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 7 00:16:47.782458 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 7 00:16:47.784724 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 7 00:16:47.787053 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 7 00:16:47.787125 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jul 7 00:16:47.789463 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 7 00:16:47.790185 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 7 00:16:47.791887 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 7 00:16:47.792468 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:16:47.795201 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Jul 7 00:16:47.795271 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Jul 7 00:16:47.795306 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 7 00:16:47.796480 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 7 00:16:47.796635 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Jul 7 00:16:47.803146 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jul 7 00:16:47.803259 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 7 00:16:47.804842 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 7 00:16:47.808523 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 7 00:16:47.831285 systemd[1]: Switching root. Jul 7 00:16:47.880488 systemd-journald[244]: Journal stopped Jul 7 00:16:48.858579 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). Jul 7 00:16:48.858650 kernel: SELinux: policy capability network_peer_controls=1 Jul 7 00:16:48.858673 kernel: SELinux: policy capability open_perms=1 Jul 7 00:16:48.858685 kernel: SELinux: policy capability extended_socket_class=1 Jul 7 00:16:48.858699 kernel: SELinux: policy capability always_check_network=0 Jul 7 00:16:48.858709 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 7 00:16:48.858726 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 7 00:16:48.858737 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 7 00:16:48.858747 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 7 00:16:48.858757 kernel: SELinux: policy capability userspace_initial_context=0 Jul 7 00:16:48.858775 kernel: audit: type=1403 audit(1751847408.005:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 7 00:16:48.858786 systemd[1]: Successfully loaded SELinux policy in 37.900ms. Jul 7 00:16:48.858805 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.914ms. 
Jul 7 00:16:48.858817 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 7 00:16:48.858829 systemd[1]: Detected virtualization kvm. Jul 7 00:16:48.858839 systemd[1]: Detected architecture arm64. Jul 7 00:16:48.858849 systemd[1]: Detected first boot. Jul 7 00:16:48.858859 systemd[1]: Hostname set to . Jul 7 00:16:48.858870 systemd[1]: Initializing machine ID from VM UUID. Jul 7 00:16:48.858880 zram_generator::config[1090]: No configuration found. Jul 7 00:16:48.858895 kernel: NET: Registered PF_VSOCK protocol family Jul 7 00:16:48.858905 systemd[1]: Populated /etc with preset unit settings. Jul 7 00:16:48.858916 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 7 00:16:48.858928 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 7 00:16:48.858965 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 7 00:16:48.858979 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 7 00:16:48.858990 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 7 00:16:48.859000 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 7 00:16:48.859010 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 7 00:16:48.859025 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jul 7 00:16:48.859037 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jul 7 00:16:48.859047 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Jul 7 00:16:48.859058 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 7 00:16:48.859068 systemd[1]: Created slice user.slice - User and Session Slice. Jul 7 00:16:48.859078 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 7 00:16:48.859089 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 7 00:16:48.859103 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jul 7 00:16:48.859115 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jul 7 00:16:48.859125 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jul 7 00:16:48.859136 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 7 00:16:48.859150 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jul 7 00:16:48.859161 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 7 00:16:48.859171 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 7 00:16:48.859183 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jul 7 00:16:48.859194 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jul 7 00:16:48.859204 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jul 7 00:16:48.859214 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jul 7 00:16:48.859225 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 7 00:16:48.859235 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 7 00:16:48.859246 systemd[1]: Reached target slices.target - Slice Units. Jul 7 00:16:48.859283 systemd[1]: Reached target swap.target - Swaps. 
Jul 7 00:16:48.859294 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jul 7 00:16:48.860071 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jul 7 00:16:48.860102 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jul 7 00:16:48.860113 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 7 00:16:48.860124 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 7 00:16:48.860135 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 7 00:16:48.860145 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jul 7 00:16:48.860157 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jul 7 00:16:48.860168 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jul 7 00:16:48.860179 systemd[1]: Mounting media.mount - External Media Directory... Jul 7 00:16:48.860197 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jul 7 00:16:48.860208 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jul 7 00:16:48.860222 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jul 7 00:16:48.860240 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jul 7 00:16:48.860251 systemd[1]: Reached target machines.target - Containers. Jul 7 00:16:48.860261 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jul 7 00:16:48.860272 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 7 00:16:48.860283 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... 
Jul 7 00:16:48.860293 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jul 7 00:16:48.860306 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 7 00:16:48.860316 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 7 00:16:48.860327 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 7 00:16:48.860337 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jul 7 00:16:48.860348 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 7 00:16:48.860359 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jul 7 00:16:48.860370 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jul 7 00:16:48.860380 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jul 7 00:16:48.860393 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jul 7 00:16:48.860404 systemd[1]: Stopped systemd-fsck-usr.service. Jul 7 00:16:48.860415 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 7 00:16:48.860426 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 7 00:16:48.860437 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 7 00:16:48.860450 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 7 00:16:48.860463 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jul 7 00:16:48.860474 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... 
Jul 7 00:16:48.860489 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 7 00:16:48.860523 systemd[1]: verity-setup.service: Deactivated successfully. Jul 7 00:16:48.860536 systemd[1]: Stopped verity-setup.service. Jul 7 00:16:48.860547 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jul 7 00:16:48.860556 kernel: loop: module loaded Jul 7 00:16:48.860567 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jul 7 00:16:48.860578 systemd[1]: Mounted media.mount - External Media Directory. Jul 7 00:16:48.860588 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jul 7 00:16:48.860598 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jul 7 00:16:48.860609 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jul 7 00:16:48.860620 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 7 00:16:48.860631 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 7 00:16:48.860641 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 7 00:16:48.860652 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 7 00:16:48.860663 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 7 00:16:48.860673 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 7 00:16:48.860684 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 7 00:16:48.860695 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 7 00:16:48.860706 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jul 7 00:16:48.860720 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jul 7 00:16:48.860731 systemd[1]: Reached target local-fs.target - Local File Systems. 
Jul 7 00:16:48.860741 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jul 7 00:16:48.860752 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jul 7 00:16:48.864065 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 7 00:16:48.864117 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jul 7 00:16:48.864129 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 7 00:16:48.864140 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jul 7 00:16:48.864155 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 7 00:16:48.864197 systemd-journald[1154]: Collecting audit messages is disabled. Jul 7 00:16:48.864221 kernel: fuse: init (API version 7.41) Jul 7 00:16:48.864233 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 7 00:16:48.864245 systemd-journald[1154]: Journal started Jul 7 00:16:48.864270 systemd-journald[1154]: Runtime Journal (/run/log/journal/4f990e9bbd754329ad87a5e17abc3d1b) is 8M, max 76.5M, 68.5M free. Jul 7 00:16:48.568654 systemd[1]: Queued start job for default target multi-user.target. Jul 7 00:16:48.595722 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jul 7 00:16:48.596248 systemd[1]: systemd-journald.service: Deactivated successfully. Jul 7 00:16:48.875154 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jul 7 00:16:48.875207 systemd[1]: Started systemd-journald.service - Journal Service. Jul 7 00:16:48.872246 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jul 7 00:16:48.872429 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Jul 7 00:16:48.875018 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jul 7 00:16:48.879863 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jul 7 00:16:48.881194 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 7 00:16:48.882697 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jul 7 00:16:48.914534 kernel: ACPI: bus type drm_connector registered Jul 7 00:16:48.904037 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 7 00:16:48.910706 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jul 7 00:16:48.918731 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jul 7 00:16:48.926688 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jul 7 00:16:48.930212 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 7 00:16:48.937493 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 7 00:16:48.949561 kernel: loop0: detected capacity change from 0 to 138376 Jul 7 00:16:48.941004 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jul 7 00:16:48.952737 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jul 7 00:16:48.954664 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jul 7 00:16:48.961173 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jul 7 00:16:48.967482 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jul 7 00:16:48.988852 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jul 7 00:16:48.999792 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jul 7 00:16:49.005163 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Jul 7 00:16:49.014518 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jul 7 00:16:49.015563 systemd-journald[1154]: Time spent on flushing to /var/log/journal/4f990e9bbd754329ad87a5e17abc3d1b is 71.104ms for 1172 entries. Jul 7 00:16:49.015563 systemd-journald[1154]: System Journal (/var/log/journal/4f990e9bbd754329ad87a5e17abc3d1b) is 8M, max 584.8M, 576.8M free. Jul 7 00:16:49.095656 systemd-journald[1154]: Received client request to flush runtime journal. Jul 7 00:16:49.095704 kernel: loop1: detected capacity change from 0 to 107312 Jul 7 00:16:49.095726 kernel: loop2: detected capacity change from 0 to 8 Jul 7 00:16:49.094309 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 7 00:16:49.100172 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jul 7 00:16:49.107429 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jul 7 00:16:49.109200 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jul 7 00:16:49.113844 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 7 00:16:49.129598 kernel: loop3: detected capacity change from 0 to 203944 Jul 7 00:16:49.162609 systemd-tmpfiles[1228]: ACLs are not supported, ignoring. Jul 7 00:16:49.163572 systemd-tmpfiles[1228]: ACLs are not supported, ignoring. Jul 7 00:16:49.170553 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 7 00:16:49.176348 kernel: loop4: detected capacity change from 0 to 138376 Jul 7 00:16:49.204534 kernel: loop5: detected capacity change from 0 to 107312 Jul 7 00:16:49.219566 kernel: loop6: detected capacity change from 0 to 8 Jul 7 00:16:49.221614 kernel: loop7: detected capacity change from 0 to 203944 Jul 7 00:16:49.240120 (sd-merge)[1232]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. 
Jul 7 00:16:49.240651 (sd-merge)[1232]: Merged extensions into '/usr'. Jul 7 00:16:49.248461 systemd[1]: Reload requested from client PID 1181 ('systemd-sysext') (unit systemd-sysext.service)... Jul 7 00:16:49.248480 systemd[1]: Reloading... Jul 7 00:16:49.359556 zram_generator::config[1261]: No configuration found. Jul 7 00:16:49.446902 ldconfig[1170]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jul 7 00:16:49.493816 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 7 00:16:49.569104 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jul 7 00:16:49.569484 systemd[1]: Reloading finished in 320 ms. Jul 7 00:16:49.589727 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jul 7 00:16:49.591839 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jul 7 00:16:49.612861 systemd[1]: Starting ensure-sysext.service... Jul 7 00:16:49.619203 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 7 00:16:49.640690 systemd[1]: Reload requested from client PID 1295 ('systemctl') (unit ensure-sysext.service)... Jul 7 00:16:49.640706 systemd[1]: Reloading... Jul 7 00:16:49.664617 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 7 00:16:49.664642 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 7 00:16:49.664906 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 7 00:16:49.665123 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. 
Jul 7 00:16:49.665746 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 7 00:16:49.665978 systemd-tmpfiles[1296]: ACLs are not supported, ignoring. Jul 7 00:16:49.666031 systemd-tmpfiles[1296]: ACLs are not supported, ignoring. Jul 7 00:16:49.677619 systemd-tmpfiles[1296]: Detected autofs mount point /boot during canonicalization of boot. Jul 7 00:16:49.677631 systemd-tmpfiles[1296]: Skipping /boot Jul 7 00:16:49.695834 systemd-tmpfiles[1296]: Detected autofs mount point /boot during canonicalization of boot. Jul 7 00:16:49.697542 systemd-tmpfiles[1296]: Skipping /boot Jul 7 00:16:49.725530 zram_generator::config[1323]: No configuration found. Jul 7 00:16:49.823396 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 7 00:16:49.897455 systemd[1]: Reloading finished in 256 ms. Jul 7 00:16:49.921032 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jul 7 00:16:49.949606 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 7 00:16:49.956922 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 7 00:16:49.961691 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 7 00:16:49.965882 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 7 00:16:49.973122 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 7 00:16:49.975632 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 7 00:16:49.979781 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
Jul 7 00:16:49.985212 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 7 00:16:49.989772 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 7 00:16:49.998813 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 7 00:16:50.007821 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 7 00:16:50.008611 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 7 00:16:50.008738 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 7 00:16:50.022669 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jul 7 00:16:50.025400 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 7 00:16:50.028092 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 7 00:16:50.028579 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 7 00:16:50.031229 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 7 00:16:50.031371 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 7 00:16:50.036840 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 7 00:16:50.038556 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 7 00:16:50.045767 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 7 00:16:50.047193 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 7 00:16:50.050514 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Jul 7 00:16:50.059904 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 7 00:16:50.061637 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 7 00:16:50.061823 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 7 00:16:50.063152 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 7 00:16:50.069690 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 7 00:16:50.071055 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 7 00:16:50.071695 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 7 00:16:50.071802 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 7 00:16:50.073557 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 7 00:16:50.081668 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 7 00:16:50.087308 systemd-udevd[1367]: Using default interface naming scheme 'v255'. Jul 7 00:16:50.087320 systemd[1]: Finished ensure-sysext.service. Jul 7 00:16:50.101745 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 7 00:16:50.102545 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
Jul 7 00:16:50.102893 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 7 00:16:50.104609 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 7 00:16:50.109527 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 7 00:16:50.111566 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 7 00:16:50.113039 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 7 00:16:50.113211 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 7 00:16:50.117293 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 7 00:16:50.117868 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 7 00:16:50.119674 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 7 00:16:50.119752 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 7 00:16:50.124556 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 7 00:16:50.126398 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jul 7 00:16:50.132674 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 7 00:16:50.173562 augenrules[1421]: No rules Jul 7 00:16:50.174448 systemd[1]: audit-rules.service: Deactivated successfully. Jul 7 00:16:50.174912 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 7 00:16:50.192270 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 7 00:16:50.302751 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. 
Jul 7 00:16:50.430655 systemd-networkd[1414]: lo: Link UP Jul 7 00:16:50.430667 systemd-networkd[1414]: lo: Gained carrier Jul 7 00:16:50.432016 systemd-networkd[1414]: Enumeration completed Jul 7 00:16:50.432133 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 7 00:16:50.434895 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 7 00:16:50.438724 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 7 00:16:50.470124 systemd-networkd[1414]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:50.470139 systemd-networkd[1414]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:16:50.471698 systemd-networkd[1414]: eth1: Link UP Jul 7 00:16:50.471821 systemd-networkd[1414]: eth1: Gained carrier Jul 7 00:16:50.471846 systemd-networkd[1414]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:50.482816 kernel: mousedev: PS/2 mouse device common for all mice Jul 7 00:16:50.482574 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 7 00:16:50.483647 systemd[1]: Reached target time-set.target - System Time Set. Jul 7 00:16:50.489351 systemd-networkd[1414]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:16:50.489366 systemd-networkd[1414]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:16:50.494613 systemd-networkd[1414]: eth0: Link UP Jul 7 00:16:50.495029 systemd-networkd[1414]: eth0: Gained carrier Jul 7 00:16:50.495058 systemd-networkd[1414]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Jul 7 00:16:50.514998 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 7 00:16:50.517783 systemd-resolved[1365]: Positive Trust Anchors: Jul 7 00:16:50.517797 systemd-resolved[1365]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 7 00:16:50.517829 systemd-resolved[1365]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 7 00:16:50.518762 systemd-networkd[1414]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 7 00:16:50.519478 systemd-timesyncd[1402]: Network configuration changed, trying to establish connection. Jul 7 00:16:50.523586 systemd-resolved[1365]: Using system hostname 'ci-4344-1-1-1-1232b7205a'. Jul 7 00:16:50.525782 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 7 00:16:50.527228 systemd[1]: Reached target network.target - Network. Jul 7 00:16:50.528046 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 7 00:16:50.528839 systemd[1]: Reached target sysinit.target - System Initialization. Jul 7 00:16:50.529715 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 7 00:16:50.530390 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 7 00:16:50.531253 systemd[1]: Started logrotate.timer - Daily rotation of log files. 
Jul 7 00:16:50.532035 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 7 00:16:50.532670 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 7 00:16:50.533302 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 7 00:16:50.533337 systemd[1]: Reached target paths.target - Path Units. Jul 7 00:16:50.534560 systemd[1]: Reached target timers.target - Timer Units. Jul 7 00:16:50.536960 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jul 7 00:16:50.540105 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 7 00:16:50.545634 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jul 7 00:16:50.546697 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 7 00:16:50.547866 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 7 00:16:50.563615 systemd-networkd[1414]: eth0: DHCPv4 address 91.107.203.174/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 7 00:16:50.565167 systemd-timesyncd[1402]: Network configuration changed, trying to establish connection. Jul 7 00:16:50.565346 systemd-timesyncd[1402]: Network configuration changed, trying to establish connection. Jul 7 00:16:50.573779 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 7 00:16:50.575875 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 7 00:16:50.577820 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jul 7 00:16:50.582310 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jul 7 00:16:50.585204 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Jul 7 00:16:50.586024 systemd[1]: Reached target sockets.target - Socket Units. Jul 7 00:16:50.586531 systemd[1]: Reached target basic.target - Basic System. Jul 7 00:16:50.587054 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 7 00:16:50.587086 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 7 00:16:50.588138 systemd[1]: Starting containerd.service - containerd container runtime... Jul 7 00:16:50.590711 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jul 7 00:16:50.593122 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jul 7 00:16:50.595340 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 7 00:16:50.601612 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 7 00:16:50.603736 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 7 00:16:50.605596 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 7 00:16:50.622658 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jul 7 00:16:50.623542 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 7 00:16:50.623587 kernel: [drm] features: -context_init Jul 7 00:16:50.627873 kernel: [drm] number of scanouts: 1 Jul 7 00:16:50.627944 kernel: [drm] number of cap sets: 0 Jul 7 00:16:50.627990 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jul 7 00:16:50.629780 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 7 00:16:50.632310 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 7 00:16:50.635801 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Jul 7 00:16:50.639557 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 7 00:16:50.642530 jq[1484]: false Jul 7 00:16:50.644773 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 7 00:16:50.650770 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 7 00:16:50.659966 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 7 00:16:50.662516 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 7 00:16:50.663089 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 7 00:16:50.667744 systemd[1]: Starting update-engine.service - Update Engine... Jul 7 00:16:50.672232 extend-filesystems[1485]: Found /dev/sda6 Jul 7 00:16:50.673805 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 7 00:16:50.678572 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 7 00:16:50.679760 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 7 00:16:50.679982 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 7 00:16:50.691793 extend-filesystems[1485]: Found /dev/sda9 Jul 7 00:16:50.697088 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 7 00:16:50.697332 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 7 00:16:50.704613 extend-filesystems[1485]: Checking size of /dev/sda9 Jul 7 00:16:50.720383 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:16:50.726081 jq[1503]: true Jul 7 00:16:50.736570 systemd[1]: motdgen.service: Deactivated successfully. 
Jul 7 00:16:50.738557 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 7 00:16:50.761772 tar[1506]: linux-arm64/helm Jul 7 00:16:50.777407 extend-filesystems[1485]: Resized partition /dev/sda9 Jul 7 00:16:50.780265 extend-filesystems[1537]: resize2fs 1.47.2 (1-Jan-2025) Jul 7 00:16:50.792991 update_engine[1501]: I20250707 00:16:50.792814 1501 main.cc:92] Flatcar Update Engine starting Jul 7 00:16:50.795526 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jul 7 00:16:50.795609 jq[1529]: true Jul 7 00:16:50.798651 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 7 00:16:50.881099 dbus-daemon[1482]: [system] SELinux support is enabled Jul 7 00:16:50.882722 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 7 00:16:50.887109 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 7 00:16:50.887188 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 7 00:16:50.888484 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 7 00:16:50.888564 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 7 00:16:50.920684 systemd[1]: Started update-engine.service - Update Engine. Jul 7 00:16:50.921337 update_engine[1501]: I20250707 00:16:50.920814 1501 update_check_scheduler.cc:74] Next update check in 10m50s Jul 7 00:16:50.932834 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Jul 7 00:16:50.945710 (ntainerd)[1548]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 7 00:16:50.958559 coreos-metadata[1481]: Jul 07 00:16:50.958 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jul 7 00:16:50.961555 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:16:50.967050 coreos-metadata[1481]: Jul 07 00:16:50.964 INFO Fetch successful Jul 7 00:16:50.967050 coreos-metadata[1481]: Jul 07 00:16:50.965 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 7 00:16:50.974445 coreos-metadata[1481]: Jul 07 00:16:50.971 INFO Fetch successful Jul 7 00:16:51.000130 bash[1568]: Updated "/home/core/.ssh/authorized_keys" Jul 7 00:16:51.005199 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 7 00:16:51.009086 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 7 00:16:51.015004 systemd[1]: Starting sshkeys.service... Jul 7 00:16:51.024382 extend-filesystems[1537]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 7 00:16:51.024382 extend-filesystems[1537]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 7 00:16:51.024382 extend-filesystems[1537]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 7 00:16:51.035397 extend-filesystems[1485]: Resized filesystem in /dev/sda9 Jul 7 00:16:51.025578 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 7 00:16:51.025775 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 7 00:16:51.088729 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 7 00:16:51.091244 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jul 7 00:16:51.191322 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. 
Jul 7 00:16:51.192799 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 7 00:16:51.222991 coreos-metadata[1578]: Jul 07 00:16:51.222 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 7 00:16:51.224413 systemd-logind[1498]: New seat seat0. Jul 7 00:16:51.225641 systemd-logind[1498]: Watching system buttons on /dev/input/event0 (Power Button) Jul 7 00:16:51.225657 systemd-logind[1498]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jul 7 00:16:51.225892 systemd[1]: Started systemd-logind.service - User Login Management. Jul 7 00:16:51.225984 locksmithd[1555]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 7 00:16:51.229397 coreos-metadata[1578]: Jul 07 00:16:51.228 INFO Fetch successful Jul 7 00:16:51.233494 unknown[1578]: wrote ssh authorized keys file for user: core Jul 7 00:16:51.272415 update-ssh-keys[1597]: Updated "/home/core/.ssh/authorized_keys" Jul 7 00:16:51.274055 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 7 00:16:51.278618 systemd[1]: Finished sshkeys.service. 
Jul 7 00:16:51.404856 containerd[1548]: time="2025-07-07T00:16:51Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 7 00:16:51.408523 containerd[1548]: time="2025-07-07T00:16:51.408465960Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.428850640Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.88µs" Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.428898080Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.428917880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.429112360Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.429129320Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.429158880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.429221200Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 7 00:16:51.430892 containerd[1548]: time="2025-07-07T00:16:51.429232600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 7 00:16:51.433102 
containerd[1548]: time="2025-07-07T00:16:51.429473560Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 7 00:16:51.433233 containerd[1548]: time="2025-07-07T00:16:51.433207440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 7 00:16:51.433337 containerd[1548]: time="2025-07-07T00:16:51.433320760Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 7 00:16:51.433579 containerd[1548]: time="2025-07-07T00:16:51.433564480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 7 00:16:51.433783 containerd[1548]: time="2025-07-07T00:16:51.433763600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 7 00:16:51.434311 containerd[1548]: time="2025-07-07T00:16:51.434286160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 7 00:16:51.434891 containerd[1548]: time="2025-07-07T00:16:51.434869480Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 7 00:16:51.435205 containerd[1548]: time="2025-07-07T00:16:51.435186440Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 7 00:16:51.435302 containerd[1548]: time="2025-07-07T00:16:51.435287320Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 7 00:16:51.435853 containerd[1548]: 
time="2025-07-07T00:16:51.435830080Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 7 00:16:51.436998 containerd[1548]: time="2025-07-07T00:16:51.436975280Z" level=info msg="metadata content store policy set" policy=shared Jul 7 00:16:51.441849 containerd[1548]: time="2025-07-07T00:16:51.441805840Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 7 00:16:51.442049 containerd[1548]: time="2025-07-07T00:16:51.442031520Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 7 00:16:51.442227 containerd[1548]: time="2025-07-07T00:16:51.442212880Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 7 00:16:51.442282 containerd[1548]: time="2025-07-07T00:16:51.442269960Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 7 00:16:51.442373 containerd[1548]: time="2025-07-07T00:16:51.442357800Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 7 00:16:51.442692 containerd[1548]: time="2025-07-07T00:16:51.442676640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 7 00:16:51.442763 containerd[1548]: time="2025-07-07T00:16:51.442750560Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 7 00:16:51.442817 containerd[1548]: time="2025-07-07T00:16:51.442805520Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 7 00:16:51.443186 containerd[1548]: time="2025-07-07T00:16:51.443165280Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 7 00:16:51.443461 containerd[1548]: time="2025-07-07T00:16:51.443446560Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 7 00:16:51.443532 containerd[1548]: time="2025-07-07T00:16:51.443518760Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 7 00:16:51.443584 containerd[1548]: time="2025-07-07T00:16:51.443572880Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443814480Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443843840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443862520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443874360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443886640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443898200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443909760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443920960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443961880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 7 
00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443974080Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.443986240Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.444185320Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.444199320Z" level=info msg="Start snapshots syncer" Jul 7 00:16:51.445788 containerd[1548]: time="2025-07-07T00:16:51.444231120Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 7 00:16:51.446097 containerd[1548]: time="2025-07-07T00:16:51.444571640Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":fals
e,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 7 00:16:51.446097 containerd[1548]: time="2025-07-07T00:16:51.444626480Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444701240Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444810280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444832640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444843440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444858480Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444871640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: 
time="2025-07-07T00:16:51.444882280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444893440Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444966920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444987160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.444998880Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.445032360Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.445051160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 7 00:16:51.446209 containerd[1548]: time="2025-07-07T00:16:51.445062000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445074280Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445082200Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445091720Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445101960Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445179160Z" level=info msg="runtime interface created" Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445184240Z" level=info msg="created NRI interface" Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445192240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445203320Z" level=info msg="Connect containerd service" Jul 7 00:16:51.446436 containerd[1548]: time="2025-07-07T00:16:51.445229880Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 7 00:16:51.453148 containerd[1548]: time="2025-07-07T00:16:51.453103440Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 7 00:16:51.656973 tar[1506]: linux-arm64/LICENSE Jul 7 00:16:51.657328 tar[1506]: linux-arm64/README.md Jul 7 00:16:51.666002 containerd[1548]: time="2025-07-07T00:16:51.665952320Z" level=info msg="Start subscribing containerd event" Jul 7 00:16:51.666278 containerd[1548]: time="2025-07-07T00:16:51.666262840Z" level=info msg="Start recovering state" Jul 7 00:16:51.666623 containerd[1548]: time="2025-07-07T00:16:51.666607520Z" level=info msg="Start event monitor" Jul 7 00:16:51.666700 containerd[1548]: time="2025-07-07T00:16:51.666688680Z" level=info msg="Start cni network conf syncer for default" Jul 7 00:16:51.666754 containerd[1548]: time="2025-07-07T00:16:51.666737240Z" level=info msg="Start streaming server" Jul 7 00:16:51.666802 
containerd[1548]: time="2025-07-07T00:16:51.666792400Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 7 00:16:51.666976 containerd[1548]: time="2025-07-07T00:16:51.666846280Z" level=info msg="runtime interface starting up..." Jul 7 00:16:51.667167 containerd[1548]: time="2025-07-07T00:16:51.667149760Z" level=info msg="starting plugins..." Jul 7 00:16:51.667350 containerd[1548]: time="2025-07-07T00:16:51.667333920Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 7 00:16:51.667702 containerd[1548]: time="2025-07-07T00:16:51.667679400Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 7 00:16:51.667905 containerd[1548]: time="2025-07-07T00:16:51.667841480Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 7 00:16:51.668157 systemd[1]: Started containerd.service - containerd container runtime. Jul 7 00:16:51.670410 containerd[1548]: time="2025-07-07T00:16:51.670386560Z" level=info msg="containerd successfully booted in 0.265929s" Jul 7 00:16:51.684550 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 7 00:16:51.846717 systemd-networkd[1414]: eth1: Gained IPv6LL Jul 7 00:16:51.849665 systemd-timesyncd[1402]: Network configuration changed, trying to establish connection. Jul 7 00:16:51.853253 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 7 00:16:51.854995 systemd[1]: Reached target network-online.target - Network is Online. Jul 7 00:16:51.859824 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:16:51.864220 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 7 00:16:51.901757 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 7 00:16:52.427178 sshd_keygen[1528]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 7 00:16:52.455191 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Jul 7 00:16:52.461257 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 7 00:16:52.484483 systemd[1]: issuegen.service: Deactivated successfully. Jul 7 00:16:52.485188 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 7 00:16:52.490476 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 7 00:16:52.511404 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 7 00:16:52.515993 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 7 00:16:52.518399 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jul 7 00:16:52.521980 systemd[1]: Reached target getty.target - Login Prompts. Jul 7 00:16:52.550790 systemd-networkd[1414]: eth0: Gained IPv6LL Jul 7 00:16:52.551878 systemd-timesyncd[1402]: Network configuration changed, trying to establish connection. Jul 7 00:16:52.787857 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:16:52.789970 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 7 00:16:52.795599 systemd[1]: Startup finished in 2.328s (kernel) + 5.407s (initrd) + 4.828s (userspace) = 12.564s. Jul 7 00:16:52.803078 (kubelet)[1650]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:16:53.405498 kubelet[1650]: E0707 00:16:53.405422 1650 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:16:53.408147 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:16:53.408411 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jul 7 00:16:53.409106 systemd[1]: kubelet.service: Consumed 1.008s CPU time, 255.6M memory peak. Jul 7 00:16:59.537613 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 7 00:16:59.540311 systemd[1]: Started sshd@0-91.107.203.174:22-20.244.106.173:35990.service - OpenSSH per-connection server daemon (20.244.106.173:35990). Jul 7 00:16:59.583390 sshd[1662]: Connection closed by 20.244.106.173 port 35990 Jul 7 00:16:59.585774 systemd[1]: sshd@0-91.107.203.174:22-20.244.106.173:35990.service: Deactivated successfully. Jul 7 00:17:03.659310 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 7 00:17:03.662361 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:17:03.828298 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:17:03.838459 (kubelet)[1673]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:17:03.886288 kubelet[1673]: E0707 00:17:03.886208 1673 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:17:03.891883 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:17:03.892536 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:17:03.895373 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.9M memory peak. Jul 7 00:17:14.073021 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 7 00:17:14.077730 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:17:14.246630 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:17:14.259115 (kubelet)[1687]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:17:14.307535 kubelet[1687]: E0707 00:17:14.307463 1687 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:17:14.311678 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:17:14.311980 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:17:14.313674 systemd[1]: kubelet.service: Consumed 168ms CPU time, 105.4M memory peak. Jul 7 00:17:21.231052 systemd[1]: Started sshd@1-91.107.203.174:22-80.94.95.115:38754.service - OpenSSH per-connection server daemon (80.94.95.115:38754). Jul 7 00:17:22.737323 systemd-timesyncd[1402]: Contacted time server 91.132.146.190:123 (2.flatcar.pool.ntp.org). Jul 7 00:17:22.737432 systemd-timesyncd[1402]: Initial clock synchronization to Mon 2025-07-07 00:17:22.786613 UTC. Jul 7 00:17:24.323187 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 7 00:17:24.325829 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:17:24.497065 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:17:24.508280 (kubelet)[1705]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:17:24.564317 kubelet[1705]: E0707 00:17:24.564248 1705 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:17:24.567175 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:17:24.567310 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:17:24.567927 systemd[1]: kubelet.service: Consumed 176ms CPU time, 105.9M memory peak. Jul 7 00:17:24.687018 sshd[1695]: Invalid user user from 80.94.95.115 port 38754 Jul 7 00:17:24.783720 sshd[1695]: Connection closed by invalid user user 80.94.95.115 port 38754 [preauth] Jul 7 00:17:24.787477 systemd[1]: sshd@1-91.107.203.174:22-80.94.95.115:38754.service: Deactivated successfully. Jul 7 00:17:34.573316 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jul 7 00:17:34.576314 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:17:34.742843 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:17:34.754146 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:17:34.806970 kubelet[1723]: E0707 00:17:34.806904 1723 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:17:34.811329 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:17:34.811748 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:17:34.812366 systemd[1]: kubelet.service: Consumed 175ms CPU time, 104.9M memory peak. Jul 7 00:17:36.264307 update_engine[1501]: I20250707 00:17:36.263028 1501 update_attempter.cc:509] Updating boot flags... Jul 7 00:17:44.823110 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jul 7 00:17:44.825675 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:17:44.995101 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:17:45.009214 (kubelet)[1758]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:17:45.059419 kubelet[1758]: E0707 00:17:45.059326 1758 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:17:45.062638 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:17:45.062826 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jul 7 00:17:45.063647 systemd[1]: kubelet.service: Consumed 178ms CPU time, 107.1M memory peak. Jul 7 00:17:55.072625 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jul 7 00:17:55.075752 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:17:55.240844 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:17:55.255012 (kubelet)[1773]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:17:55.301415 kubelet[1773]: E0707 00:17:55.301364 1773 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:17:55.304046 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:17:55.304284 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:17:55.304634 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.9M memory peak. Jul 7 00:18:05.323269 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Jul 7 00:18:05.325941 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:18:05.476018 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:18:05.485083 (kubelet)[1788]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:18:05.531315 kubelet[1788]: E0707 00:18:05.531268 1788 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:18:05.534749 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:18:05.534930 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:18:05.535632 systemd[1]: kubelet.service: Consumed 160ms CPU time, 106.9M memory peak. Jul 7 00:18:15.572777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Jul 7 00:18:15.576314 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:18:15.754118 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:18:15.765462 (kubelet)[1803]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:18:15.809736 kubelet[1803]: E0707 00:18:15.809690 1803 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:18:15.813148 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:18:15.813434 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:18:15.814674 systemd[1]: kubelet.service: Consumed 172ms CPU time, 107M memory peak. 
Jul 7 00:18:25.823359 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Jul 7 00:18:25.826630 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:18:25.995218 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:18:26.007107 (kubelet)[1818]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:18:26.058542 kubelet[1818]: E0707 00:18:26.058440 1818 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:18:26.064235 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:18:26.064539 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:18:26.066619 systemd[1]: kubelet.service: Consumed 179ms CPU time, 107.1M memory peak. Jul 7 00:18:36.073062 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Jul 7 00:18:36.075559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:18:36.257430 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:18:36.269074 (kubelet)[1833]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:18:36.306130 kubelet[1833]: E0707 00:18:36.306058 1833 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:18:36.309716 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:18:36.309990 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:18:36.310719 systemd[1]: kubelet.service: Consumed 163ms CPU time, 107.1M memory peak. Jul 7 00:18:39.180931 systemd[1]: Started sshd@2-91.107.203.174:22-139.178.89.65:44378.service - OpenSSH per-connection server daemon (139.178.89.65:44378). Jul 7 00:18:40.295810 sshd[1841]: Accepted publickey for core from 139.178.89.65 port 44378 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:40.299800 sshd-session[1841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:40.308795 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 7 00:18:40.310456 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 7 00:18:40.322088 systemd-logind[1498]: New session 1 of user core. Jul 7 00:18:40.339569 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 7 00:18:40.343771 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 7 00:18:40.364298 (systemd)[1845]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 7 00:18:40.370066 systemd-logind[1498]: New session c1 of user core. 
Jul 7 00:18:40.519403 systemd[1845]: Queued start job for default target default.target. Jul 7 00:18:40.527615 systemd[1845]: Created slice app.slice - User Application Slice. Jul 7 00:18:40.527677 systemd[1845]: Reached target paths.target - Paths. Jul 7 00:18:40.527755 systemd[1845]: Reached target timers.target - Timers. Jul 7 00:18:40.530400 systemd[1845]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 7 00:18:40.556932 systemd[1845]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 7 00:18:40.557085 systemd[1845]: Reached target sockets.target - Sockets. Jul 7 00:18:40.557147 systemd[1845]: Reached target basic.target - Basic System. Jul 7 00:18:40.557190 systemd[1845]: Reached target default.target - Main User Target. Jul 7 00:18:40.557299 systemd[1845]: Startup finished in 179ms. Jul 7 00:18:40.557625 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 7 00:18:40.572898 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 7 00:18:41.355211 systemd[1]: Started sshd@3-91.107.203.174:22-139.178.89.65:44190.service - OpenSSH per-connection server daemon (139.178.89.65:44190). Jul 7 00:18:42.471055 sshd[1856]: Accepted publickey for core from 139.178.89.65 port 44190 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:42.473220 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:42.478246 systemd-logind[1498]: New session 2 of user core. Jul 7 00:18:42.484792 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 7 00:18:43.228597 sshd[1858]: Connection closed by 139.178.89.65 port 44190 Jul 7 00:18:43.229407 sshd-session[1856]: pam_unix(sshd:session): session closed for user core Jul 7 00:18:43.235007 systemd-logind[1498]: Session 2 logged out. Waiting for processes to exit. Jul 7 00:18:43.235921 systemd[1]: sshd@3-91.107.203.174:22-139.178.89.65:44190.service: Deactivated successfully. 
Jul 7 00:18:43.237898 systemd[1]: session-2.scope: Deactivated successfully. Jul 7 00:18:43.241784 systemd-logind[1498]: Removed session 2. Jul 7 00:18:43.418229 systemd[1]: Started sshd@4-91.107.203.174:22-139.178.89.65:44198.service - OpenSSH per-connection server daemon (139.178.89.65:44198). Jul 7 00:18:44.518559 sshd[1864]: Accepted publickey for core from 139.178.89.65 port 44198 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:44.520868 sshd-session[1864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:44.527954 systemd-logind[1498]: New session 3 of user core. Jul 7 00:18:44.536825 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 7 00:18:45.251271 sshd[1866]: Connection closed by 139.178.89.65 port 44198 Jul 7 00:18:45.250586 sshd-session[1864]: pam_unix(sshd:session): session closed for user core Jul 7 00:18:45.255739 systemd-logind[1498]: Session 3 logged out. Waiting for processes to exit. Jul 7 00:18:45.255908 systemd[1]: sshd@4-91.107.203.174:22-139.178.89.65:44198.service: Deactivated successfully. Jul 7 00:18:45.257766 systemd[1]: session-3.scope: Deactivated successfully. Jul 7 00:18:45.260114 systemd-logind[1498]: Removed session 3. Jul 7 00:18:45.442780 systemd[1]: Started sshd@5-91.107.203.174:22-139.178.89.65:44204.service - OpenSSH per-connection server daemon (139.178.89.65:44204). Jul 7 00:18:46.323129 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Jul 7 00:18:46.326699 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:18:46.525854 sshd[1872]: Accepted publickey for core from 139.178.89.65 port 44204 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:46.529443 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:46.530670 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:18:46.538144 systemd-logind[1498]: New session 4 of user core. Jul 7 00:18:46.541347 (kubelet)[1881]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:18:46.541920 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 7 00:18:46.594996 kubelet[1881]: E0707 00:18:46.594836 1881 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:18:46.598919 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:18:46.599249 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:18:46.600127 systemd[1]: kubelet.service: Consumed 186ms CPU time, 105.2M memory peak. Jul 7 00:18:47.262074 sshd[1887]: Connection closed by 139.178.89.65 port 44204 Jul 7 00:18:47.262885 sshd-session[1872]: pam_unix(sshd:session): session closed for user core Jul 7 00:18:47.267925 systemd-logind[1498]: Session 4 logged out. Waiting for processes to exit. Jul 7 00:18:47.269689 systemd[1]: sshd@5-91.107.203.174:22-139.178.89.65:44204.service: Deactivated successfully. Jul 7 00:18:47.273710 systemd[1]: session-4.scope: Deactivated successfully. Jul 7 00:18:47.276248 systemd-logind[1498]: Removed session 4. Jul 7 00:18:47.459180 systemd[1]: Started sshd@6-91.107.203.174:22-139.178.89.65:44208.service - OpenSSH per-connection server daemon (139.178.89.65:44208). 
Jul 7 00:18:48.585841 sshd[1895]: Accepted publickey for core from 139.178.89.65 port 44208 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:48.588140 sshd-session[1895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:48.594258 systemd-logind[1498]: New session 5 of user core. Jul 7 00:18:48.599820 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 7 00:18:49.170580 sudo[1898]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 7 00:18:49.170866 sudo[1898]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:18:49.185871 sudo[1898]: pam_unix(sudo:session): session closed for user root Jul 7 00:18:49.364735 sshd[1897]: Connection closed by 139.178.89.65 port 44208 Jul 7 00:18:49.365561 sshd-session[1895]: pam_unix(sshd:session): session closed for user core Jul 7 00:18:49.370296 systemd[1]: sshd@6-91.107.203.174:22-139.178.89.65:44208.service: Deactivated successfully. Jul 7 00:18:49.372709 systemd[1]: session-5.scope: Deactivated successfully. Jul 7 00:18:49.374013 systemd-logind[1498]: Session 5 logged out. Waiting for processes to exit. Jul 7 00:18:49.376163 systemd-logind[1498]: Removed session 5. Jul 7 00:18:49.559279 systemd[1]: Started sshd@7-91.107.203.174:22-139.178.89.65:44220.service - OpenSSH per-connection server daemon (139.178.89.65:44220). Jul 7 00:18:50.695865 sshd[1904]: Accepted publickey for core from 139.178.89.65 port 44220 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:50.697885 sshd-session[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:50.704444 systemd-logind[1498]: New session 6 of user core. Jul 7 00:18:50.709905 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jul 7 00:18:51.276019 sudo[1908]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 7 00:18:51.276321 sudo[1908]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:18:51.283480 sudo[1908]: pam_unix(sudo:session): session closed for user root Jul 7 00:18:51.291645 sudo[1907]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 7 00:18:51.291915 sudo[1907]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:18:51.302427 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 7 00:18:51.344527 augenrules[1930]: No rules Jul 7 00:18:51.346574 systemd[1]: audit-rules.service: Deactivated successfully. Jul 7 00:18:51.346868 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 7 00:18:51.347937 sudo[1907]: pam_unix(sudo:session): session closed for user root Jul 7 00:18:51.525787 sshd[1906]: Connection closed by 139.178.89.65 port 44220 Jul 7 00:18:51.526766 sshd-session[1904]: pam_unix(sshd:session): session closed for user core Jul 7 00:18:51.532697 systemd-logind[1498]: Session 6 logged out. Waiting for processes to exit. Jul 7 00:18:51.533182 systemd[1]: sshd@7-91.107.203.174:22-139.178.89.65:44220.service: Deactivated successfully. Jul 7 00:18:51.536244 systemd[1]: session-6.scope: Deactivated successfully. Jul 7 00:18:51.539016 systemd-logind[1498]: Removed session 6. Jul 7 00:18:51.712861 systemd[1]: Started sshd@8-91.107.203.174:22-139.178.89.65:48688.service - OpenSSH per-connection server daemon (139.178.89.65:48688). 
Jul 7 00:18:52.813845 sshd[1939]: Accepted publickey for core from 139.178.89.65 port 48688 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8 Jul 7 00:18:52.815820 sshd-session[1939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:18:52.821307 systemd-logind[1498]: New session 7 of user core. Jul 7 00:18:52.830232 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 7 00:18:53.387005 sudo[1942]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 7 00:18:53.387286 sudo[1942]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:18:53.751115 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 7 00:18:53.765465 (dockerd)[1960]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 7 00:18:54.009598 dockerd[1960]: time="2025-07-07T00:18:54.009155765Z" level=info msg="Starting up" Jul 7 00:18:54.015373 dockerd[1960]: time="2025-07-07T00:18:54.015323920Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 7 00:18:54.074747 dockerd[1960]: time="2025-07-07T00:18:54.074544067Z" level=info msg="Loading containers: start." Jul 7 00:18:54.086728 kernel: Initializing XFRM netlink socket Jul 7 00:18:54.367688 systemd-networkd[1414]: docker0: Link UP Jul 7 00:18:54.374744 dockerd[1960]: time="2025-07-07T00:18:54.374610196Z" level=info msg="Loading containers: done." 
Jul 7 00:18:54.393645 dockerd[1960]: time="2025-07-07T00:18:54.393593071Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 7 00:18:54.393815 dockerd[1960]: time="2025-07-07T00:18:54.393706433Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jul 7 00:18:54.393850 dockerd[1960]: time="2025-07-07T00:18:54.393835035Z" level=info msg="Initializing buildkit" Jul 7 00:18:54.393907 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2473376749-merged.mount: Deactivated successfully. Jul 7 00:18:54.423745 dockerd[1960]: time="2025-07-07T00:18:54.423646032Z" level=info msg="Completed buildkit initialization" Jul 7 00:18:54.432160 dockerd[1960]: time="2025-07-07T00:18:54.432071470Z" level=info msg="Daemon has completed initialization" Jul 7 00:18:54.432543 dockerd[1960]: time="2025-07-07T00:18:54.432334475Z" level=info msg="API listen on /run/docker.sock" Jul 7 00:18:54.433319 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 7 00:18:55.542092 containerd[1548]: time="2025-07-07T00:18:55.542034026Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\"" Jul 7 00:18:56.162376 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1378807950.mount: Deactivated successfully. Jul 7 00:18:56.822497 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Jul 7 00:18:56.825859 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:18:56.978554 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:18:56.992411 (kubelet)[2224]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:18:57.039289 kubelet[2224]: E0707 00:18:57.039170 2224 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:18:57.042027 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:18:57.042175 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:18:57.042480 systemd[1]: kubelet.service: Consumed 161ms CPU time, 106M memory peak. Jul 7 00:18:57.245070 containerd[1548]: time="2025-07-07T00:18:57.244722560Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:57.247271 containerd[1548]: time="2025-07-07T00:18:57.247216643Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.10: active requests=0, bytes read=25651885" Jul 7 00:18:57.248661 containerd[1548]: time="2025-07-07T00:18:57.248587147Z" level=info msg="ImageCreate event name:\"sha256:8907c2d36348551c1038e24ef688f6830681069380376707e55518007a20a86c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:57.256530 containerd[1548]: time="2025-07-07T00:18:57.255975715Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:57.256739 containerd[1548]: time="2025-07-07T00:18:57.256710088Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.10\" with image id 
\"sha256:8907c2d36348551c1038e24ef688f6830681069380376707e55518007a20a86c\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\", size \"25648593\" in 1.714103692s" Jul 7 00:18:57.256813 containerd[1548]: time="2025-07-07T00:18:57.256800450Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\" returns image reference \"sha256:8907c2d36348551c1038e24ef688f6830681069380376707e55518007a20a86c\"" Jul 7 00:18:57.259232 containerd[1548]: time="2025-07-07T00:18:57.259199812Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\"" Jul 7 00:18:58.438816 containerd[1548]: time="2025-07-07T00:18:58.438762133Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:58.441177 containerd[1548]: time="2025-07-07T00:18:58.441127653Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.10: active requests=0, bytes read=22459697" Jul 7 00:18:58.442401 containerd[1548]: time="2025-07-07T00:18:58.442306033Z" level=info msg="ImageCreate event name:\"sha256:0f640d6889416d515a0ac4de1c26f4d80134c47641ff464abc831560a951175f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:58.449582 containerd[1548]: time="2025-07-07T00:18:58.449483355Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:58.450588 containerd[1548]: time="2025-07-07T00:18:58.450552773Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.10\" with image id \"sha256:0f640d6889416d515a0ac4de1c26f4d80134c47641ff464abc831560a951175f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.10\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\", size \"23995467\" in 1.191152438s" Jul 7 00:18:58.450692 containerd[1548]: time="2025-07-07T00:18:58.450677535Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\" returns image reference \"sha256:0f640d6889416d515a0ac4de1c26f4d80134c47641ff464abc831560a951175f\"" Jul 7 00:18:58.451434 containerd[1548]: time="2025-07-07T00:18:58.451167784Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\"" Jul 7 00:18:59.392850 containerd[1548]: time="2025-07-07T00:18:59.392782088Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:59.394664 containerd[1548]: time="2025-07-07T00:18:59.394614159Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.10: active requests=0, bytes read=17125086" Jul 7 00:18:59.394781 containerd[1548]: time="2025-07-07T00:18:59.394737201Z" level=info msg="ImageCreate event name:\"sha256:23d79b83d912e2633bcb4f9f7b8b46024893e11d492a4249d8f1f8c9a26b7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:59.398541 containerd[1548]: time="2025-07-07T00:18:59.398280220Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:18:59.399454 containerd[1548]: time="2025-07-07T00:18:59.399321957Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.10\" with image id \"sha256:23d79b83d912e2633bcb4f9f7b8b46024893e11d492a4249d8f1f8c9a26b7b2c\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\", size \"18660874\" in 948.122293ms" Jul 7 00:18:59.399454 
containerd[1548]: time="2025-07-07T00:18:59.399360277Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\" returns image reference \"sha256:23d79b83d912e2633bcb4f9f7b8b46024893e11d492a4249d8f1f8c9a26b7b2c\"" Jul 7 00:18:59.400710 containerd[1548]: time="2025-07-07T00:18:59.400685579Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\"" Jul 7 00:19:00.444459 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3090169871.mount: Deactivated successfully. Jul 7 00:19:00.827299 containerd[1548]: time="2025-07-07T00:19:00.827213463Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:00.828333 containerd[1548]: time="2025-07-07T00:19:00.828233719Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.10: active requests=0, bytes read=26915983" Jul 7 00:19:00.829363 containerd[1548]: time="2025-07-07T00:19:00.829312777Z" level=info msg="ImageCreate event name:\"sha256:dde5ff0da443b455e81aefc7bf6a216fdd659d1cbe13b8e8ac8129c3ecd27f89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:00.834055 containerd[1548]: time="2025-07-07T00:19:00.833987373Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:00.835766 containerd[1548]: time="2025-07-07T00:19:00.835550998Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.10\" with image id \"sha256:dde5ff0da443b455e81aefc7bf6a216fdd659d1cbe13b8e8ac8129c3ecd27f89\", repo tag \"registry.k8s.io/kube-proxy:v1.31.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\", size \"26914976\" in 1.434721056s" Jul 7 00:19:00.835766 containerd[1548]: time="2025-07-07T00:19:00.835606159Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.31.10\" returns image reference \"sha256:dde5ff0da443b455e81aefc7bf6a216fdd659d1cbe13b8e8ac8129c3ecd27f89\"" Jul 7 00:19:00.836868 containerd[1548]: time="2025-07-07T00:19:00.836677096Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 7 00:19:01.458375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3917320288.mount: Deactivated successfully. Jul 7 00:19:02.217543 containerd[1548]: time="2025-07-07T00:19:02.216157216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:02.218074 containerd[1548]: time="2025-07-07T00:19:02.217715680Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714" Jul 7 00:19:02.218761 containerd[1548]: time="2025-07-07T00:19:02.218698695Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:02.222704 containerd[1548]: time="2025-07-07T00:19:02.222654797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:02.224200 containerd[1548]: time="2025-07-07T00:19:02.224162420Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.387170679s" Jul 7 00:19:02.224326 containerd[1548]: time="2025-07-07T00:19:02.224308742Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference 
\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jul 7 00:19:02.224994 containerd[1548]: time="2025-07-07T00:19:02.224855831Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 7 00:19:02.751465 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount695433955.mount: Deactivated successfully. Jul 7 00:19:02.756223 containerd[1548]: time="2025-07-07T00:19:02.756157641Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 7 00:19:02.757728 containerd[1548]: time="2025-07-07T00:19:02.757649064Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Jul 7 00:19:02.758880 containerd[1548]: time="2025-07-07T00:19:02.758809322Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 7 00:19:02.763990 containerd[1548]: time="2025-07-07T00:19:02.762671422Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 7 00:19:02.763990 containerd[1548]: time="2025-07-07T00:19:02.762925106Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 537.708629ms" Jul 7 00:19:02.763990 containerd[1548]: time="2025-07-07T00:19:02.762952186Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image 
reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jul 7 00:19:02.763990 containerd[1548]: time="2025-07-07T00:19:02.763968962Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jul 7 00:19:03.333523 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3682094928.mount: Deactivated successfully. Jul 7 00:19:04.863825 containerd[1548]: time="2025-07-07T00:19:04.863696306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:04.866080 containerd[1548]: time="2025-07-07T00:19:04.866018940Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406533" Jul 7 00:19:04.868178 containerd[1548]: time="2025-07-07T00:19:04.868123811Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:04.876243 containerd[1548]: time="2025-07-07T00:19:04.874827990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:04.876989 containerd[1548]: time="2025-07-07T00:19:04.876940941Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.112938179s" Jul 7 00:19:04.877138 containerd[1548]: time="2025-07-07T00:19:04.877114864Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Jul 7 00:19:07.073020 systemd[1]: kubelet.service: Scheduled 
restart job, restart counter is at 13. Jul 7 00:19:07.077838 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:19:07.234688 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:19:07.240951 (kubelet)[2387]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:19:07.299586 kubelet[2387]: E0707 00:19:07.299534 2387 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:19:07.304334 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:19:07.304475 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:19:07.306037 systemd[1]: kubelet.service: Consumed 168ms CPU time, 105M memory peak. Jul 7 00:19:08.972733 systemd[1]: Started sshd@9-91.107.203.174:22-103.23.198.220:35778.service - OpenSSH per-connection server daemon (103.23.198.220:35778). Jul 7 00:19:10.119950 sshd[2395]: Invalid user john from 103.23.198.220 port 35778 Jul 7 00:19:10.339127 sshd[2395]: Received disconnect from 103.23.198.220 port 35778:11: Bye Bye [preauth] Jul 7 00:19:10.339127 sshd[2395]: Disconnected from invalid user john 103.23.198.220 port 35778 [preauth] Jul 7 00:19:10.343122 systemd[1]: sshd@9-91.107.203.174:22-103.23.198.220:35778.service: Deactivated successfully. Jul 7 00:19:10.420408 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:19:10.421091 systemd[1]: kubelet.service: Consumed 168ms CPU time, 105M memory peak. Jul 7 00:19:10.423336 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jul 7 00:19:10.456491 systemd[1]: Reload requested from client PID 2406 ('systemctl') (unit session-7.scope)...
Jul 7 00:19:10.456591 systemd[1]: Reloading...
Jul 7 00:19:10.586527 zram_generator::config[2450]: No configuration found.
Jul 7 00:19:10.668252 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 7 00:19:10.774209 systemd[1]: Reloading finished in 317 ms.
Jul 7 00:19:10.843404 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Jul 7 00:19:10.843644 systemd[1]: kubelet.service: Failed with result 'signal'.
Jul 7 00:19:10.844310 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 7 00:19:10.844397 systemd[1]: kubelet.service: Consumed 104ms CPU time, 95M memory peak.
Jul 7 00:19:10.849860 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 7 00:19:11.009884 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 7 00:19:11.024167 (kubelet)[2498]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jul 7 00:19:11.078000 kubelet[2498]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 7 00:19:11.078000 kubelet[2498]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jul 7 00:19:11.078000 kubelet[2498]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 7 00:19:11.079999 kubelet[2498]: I0707 00:19:11.078079 2498 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jul 7 00:19:11.691727 kubelet[2498]: I0707 00:19:11.691686 2498 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Jul 7 00:19:11.691932 kubelet[2498]: I0707 00:19:11.691919 2498 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jul 7 00:19:11.692333 kubelet[2498]: I0707 00:19:11.692316 2498 server.go:934] "Client rotation is on, will bootstrap in background"
Jul 7 00:19:11.725985 kubelet[2498]: E0707 00:19:11.725930 2498 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://91.107.203.174:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.107.203.174:6443: connect: connection refused" logger="UnhandledError"
Jul 7 00:19:11.729247 kubelet[2498]: I0707 00:19:11.729188 2498 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 7 00:19:11.739903 kubelet[2498]: I0707 00:19:11.739873 2498 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 7 00:19:11.744243 kubelet[2498]: I0707 00:19:11.744209 2498 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 7 00:19:11.745581 kubelet[2498]: I0707 00:19:11.745496 2498 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 7 00:19:11.745958 kubelet[2498]: I0707 00:19:11.745918 2498 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 7 00:19:11.746216 kubelet[2498]: I0707 00:19:11.746034 2498 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-1-1-1-1232b7205a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 7 00:19:11.746490 kubelet[2498]: I0707 00:19:11.746476 2498 topology_manager.go:138] "Creating topology manager with none policy"
Jul 7 00:19:11.746602 kubelet[2498]: I0707 00:19:11.746592 2498 container_manager_linux.go:300] "Creating device plugin manager"
Jul 7 00:19:11.746947 kubelet[2498]: I0707 00:19:11.746930 2498 state_mem.go:36] "Initialized new in-memory state store"
Jul 7 00:19:11.750059 kubelet[2498]: I0707 00:19:11.750028 2498 kubelet.go:408] "Attempting to sync node with API server"
Jul 7 00:19:11.750570 kubelet[2498]: I0707 00:19:11.750555 2498 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 7 00:19:11.750693 kubelet[2498]: I0707 00:19:11.750678 2498 kubelet.go:314] "Adding apiserver pod source"
Jul 7 00:19:11.750910 kubelet[2498]: I0707 00:19:11.750888 2498 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 7 00:19:11.753887 kubelet[2498]: W0707 00:19:11.753671 2498 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.107.203.174:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-1-1-1-1232b7205a&limit=500&resourceVersion=0": dial tcp 91.107.203.174:6443: connect: connection refused
Jul 7 00:19:11.753887 kubelet[2498]: E0707 00:19:11.753741 2498 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://91.107.203.174:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-1-1-1-1232b7205a&limit=500&resourceVersion=0\": dial tcp 91.107.203.174:6443: connect: connection refused" logger="UnhandledError"
Jul 7 00:19:11.757253 kubelet[2498]: W0707 00:19:11.757185 2498 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.107.203.174:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.107.203.174:6443: connect: connection refused
Jul 7 00:19:11.757545 kubelet[2498]: E0707 00:19:11.757387 2498 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.107.203.174:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.107.203.174:6443: connect: connection refused" logger="UnhandledError"
Jul 7 00:19:11.757795 kubelet[2498]: I0707 00:19:11.757779 2498 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Jul 7 00:19:11.759540 kubelet[2498]: I0707 00:19:11.758992 2498 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 7 00:19:11.759540 kubelet[2498]: W0707 00:19:11.759202 2498 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jul 7 00:19:11.761955 kubelet[2498]: I0707 00:19:11.761930 2498 server.go:1274] "Started kubelet"
Jul 7 00:19:11.771053 kubelet[2498]: I0707 00:19:11.770994 2498 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 7 00:19:11.772367 kubelet[2498]: E0707 00:19:11.770941 2498 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.107.203.174:6443/api/v1/namespaces/default/events\": dial tcp 91.107.203.174:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344-1-1-1-1232b7205a.184fd0110f67eae3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344-1-1-1-1232b7205a,UID:ci-4344-1-1-1-1232b7205a,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344-1-1-1-1232b7205a,},FirstTimestamp:2025-07-07 00:19:11.761902307 +0000 UTC m=+0.733103889,LastTimestamp:2025-07-07 00:19:11.761902307 +0000 UTC m=+0.733103889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-1-1-1-1232b7205a,}"
Jul 7 00:19:11.774756 kubelet[2498]: I0707 00:19:11.774709 2498 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 7 00:19:11.775998 kubelet[2498]: I0707 00:19:11.775978 2498 server.go:449] "Adding debug handlers to kubelet server"
Jul 7 00:19:11.777144 kubelet[2498]: I0707 00:19:11.777044 2498 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jul 7 00:19:11.777269 kubelet[2498]: I0707 00:19:11.777222 2498 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 7 00:19:11.777377 kubelet[2498]: E0707 00:19:11.777348 2498 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4344-1-1-1-1232b7205a\" not found"
Jul 7 00:19:11.777616 kubelet[2498]: I0707 00:19:11.777603 2498 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 7 00:19:11.778000 kubelet[2498]: I0707 00:19:11.777978 2498 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 7 00:19:11.778846 kubelet[2498]: E0707 00:19:11.778801 2498 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.107.203.174:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-1-1232b7205a?timeout=10s\": dial tcp 91.107.203.174:6443: connect: connection refused" interval="200ms"
Jul 7 00:19:11.778979 kubelet[2498]: I0707 00:19:11.778969 2498 reconciler.go:26] "Reconciler: start to sync state"
Jul 7 00:19:11.779073 kubelet[2498]: I0707 00:19:11.779063 2498 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Jul 7 00:19:11.779561 kubelet[2498]: W0707 00:19:11.779495 2498 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.107.203.174:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.107.203.174:6443: connect: connection refused
Jul 7 00:19:11.779676 kubelet[2498]: E0707 00:19:11.779658 2498 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.107.203.174:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.107.203.174:6443: connect: connection refused" logger="UnhandledError"
Jul 7 00:19:11.779920 kubelet[2498]: I0707 00:19:11.779900 2498 factory.go:221] Registration of the systemd container factory successfully
Jul 7 00:19:11.780084 kubelet[2498]: I0707 00:19:11.780048 2498 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jul 7 00:19:11.781527 kubelet[2498]: E0707 00:19:11.781482 2498 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jul 7 00:19:11.782052 kubelet[2498]: I0707 00:19:11.782030 2498 factory.go:221] Registration of the containerd container factory successfully
Jul 7 00:19:11.789997 kubelet[2498]: I0707 00:19:11.789944 2498 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jul 7 00:19:11.791589 kubelet[2498]: I0707 00:19:11.791200 2498 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jul 7 00:19:11.791589 kubelet[2498]: I0707 00:19:11.791235 2498 status_manager.go:217] "Starting to sync pod status with apiserver"
Jul 7 00:19:11.791589 kubelet[2498]: I0707 00:19:11.791258 2498 kubelet.go:2321] "Starting kubelet main sync loop"
Jul 7 00:19:11.791589 kubelet[2498]: E0707 00:19:11.791302 2498 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jul 7 00:19:11.799363 kubelet[2498]: W0707 00:19:11.799287 2498 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.107.203.174:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.107.203.174:6443: connect: connection refused
Jul 7 00:19:11.799595 kubelet[2498]: E0707 00:19:11.799559 2498 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.107.203.174:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.107.203.174:6443: connect: connection refused" logger="UnhandledError"
Jul 7 00:19:11.817182 kubelet[2498]: I0707 00:19:11.817157 2498 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jul 7 00:19:11.817368 kubelet[2498]: I0707 00:19:11.817355 2498 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jul 7 00:19:11.817521 kubelet[2498]: I0707 00:19:11.817449 2498 state_mem.go:36] "Initialized new in-memory state store"
Jul 7 00:19:11.820573 kubelet[2498]: I0707 00:19:11.820542 2498 policy_none.go:49] "None policy: Start"
Jul 7 00:19:11.822572 kubelet[2498]: I0707 00:19:11.822544 2498 memory_manager.go:170] "Starting memorymanager" policy="None"
Jul 7 00:19:11.822572 kubelet[2498]: I0707 00:19:11.822661 2498 state_mem.go:35] "Initializing new in-memory state store"
Jul 7 00:19:11.830842 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Jul 7 00:19:11.850143 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Jul 7 00:19:11.854630 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Jul 7 00:19:11.863757 kubelet[2498]: I0707 00:19:11.863650 2498 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jul 7 00:19:11.864377 kubelet[2498]: I0707 00:19:11.864309 2498 eviction_manager.go:189] "Eviction manager: starting control loop"
Jul 7 00:19:11.864377 kubelet[2498]: I0707 00:19:11.864328 2498 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jul 7 00:19:11.866979 kubelet[2498]: I0707 00:19:11.865745 2498 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jul 7 00:19:11.869056 kubelet[2498]: E0707 00:19:11.869012 2498 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344-1-1-1-1232b7205a\" not found"
Jul 7 00:19:11.908092 systemd[1]: Created slice kubepods-burstable-podde85395b64c2d4ba6a1bb8b18ffbb2c4.slice - libcontainer container kubepods-burstable-podde85395b64c2d4ba6a1bb8b18ffbb2c4.slice.
Jul 7 00:19:11.926718 systemd[1]: Created slice kubepods-burstable-pod9025b911180694d43a681cfb2352862e.slice - libcontainer container kubepods-burstable-pod9025b911180694d43a681cfb2352862e.slice.
Jul 7 00:19:11.937976 systemd[1]: Created slice kubepods-burstable-podfdf14af2b0b9088f0c14528332e0a414.slice - libcontainer container kubepods-burstable-podfdf14af2b0b9088f0c14528332e0a414.slice.
Jul 7 00:19:11.968364 kubelet[2498]: I0707 00:19:11.968101 2498 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.969356 kubelet[2498]: E0707 00:19:11.969304 2498 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.107.203.174:6443/api/v1/nodes\": dial tcp 91.107.203.174:6443: connect: connection refused" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.979773 kubelet[2498]: E0707 00:19:11.979697 2498 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.107.203.174:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-1-1232b7205a?timeout=10s\": dial tcp 91.107.203.174:6443: connect: connection refused" interval="400ms"
Jul 7 00:19:11.980032 kubelet[2498]: I0707 00:19:11.979859 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fdf14af2b0b9088f0c14528332e0a414-kubeconfig\") pod \"kube-scheduler-ci-4344-1-1-1-1232b7205a\" (UID: \"fdf14af2b0b9088f0c14528332e0a414\") " pod="kube-system/kube-scheduler-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980032 kubelet[2498]: I0707 00:19:11.979906 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/de85395b64c2d4ba6a1bb8b18ffbb2c4-ca-certs\") pod \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" (UID: \"de85395b64c2d4ba6a1bb8b18ffbb2c4\") " pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980290 kubelet[2498]: I0707 00:19:11.980158 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/de85395b64c2d4ba6a1bb8b18ffbb2c4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" (UID: \"de85395b64c2d4ba6a1bb8b18ffbb2c4\") " pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980290 kubelet[2498]: I0707 00:19:11.980192 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-ca-certs\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980290 kubelet[2498]: I0707 00:19:11.980262 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980497 kubelet[2498]: I0707 00:19:11.980447 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-k8s-certs\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980619 kubelet[2498]: I0707 00:19:11.980600 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-kubeconfig\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980803 kubelet[2498]: I0707 00:19:11.980757 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/de85395b64c2d4ba6a1bb8b18ffbb2c4-k8s-certs\") pod \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" (UID: \"de85395b64c2d4ba6a1bb8b18ffbb2c4\") " pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:11.980902 kubelet[2498]: I0707 00:19:11.980789 2498 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:12.173615 kubelet[2498]: I0707 00:19:12.173248 2498 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:12.174114 kubelet[2498]: E0707 00:19:12.174078 2498 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.107.203.174:6443/api/v1/nodes\": dial tcp 91.107.203.174:6443: connect: connection refused" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:12.226031 containerd[1548]: time="2025-07-07T00:19:12.225805092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-1-1-1-1232b7205a,Uid:de85395b64c2d4ba6a1bb8b18ffbb2c4,Namespace:kube-system,Attempt:0,}"
Jul 7 00:19:12.237067 containerd[1548]: time="2025-07-07T00:19:12.236870069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-1-1-1-1232b7205a,Uid:9025b911180694d43a681cfb2352862e,Namespace:kube-system,Attempt:0,}"
Jul 7 00:19:12.243634 containerd[1548]: time="2025-07-07T00:19:12.243370109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-1-1-1-1232b7205a,Uid:fdf14af2b0b9088f0c14528332e0a414,Namespace:kube-system,Attempt:0,}"
Jul 7 00:19:12.280656 containerd[1548]: time="2025-07-07T00:19:12.280603851Z" level=info msg="connecting to shim ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be" address="unix:///run/containerd/s/43fb172a97e8f19d2c06ba9c0117d302a2d80ca211ce2a0a04a0cc95865f8a30" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:19:12.282478 containerd[1548]: time="2025-07-07T00:19:12.282440794Z" level=info msg="connecting to shim a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604" address="unix:///run/containerd/s/8d27ff861e9834afb6212e1d5f2c85e399f65591fa9ee8b076efa4a0ed42ec9e" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:19:12.322367 containerd[1548]: time="2025-07-07T00:19:12.322252728Z" level=info msg="connecting to shim d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c" address="unix:///run/containerd/s/65a519d422198a37ac4c901aa92c313ee61b5c8496982607525ad21de5b8bf2d" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:19:12.324786 systemd[1]: Started cri-containerd-ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be.scope - libcontainer container ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be.
Jul 7 00:19:12.342835 systemd[1]: Started cri-containerd-a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604.scope - libcontainer container a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604.
Jul 7 00:19:12.365889 systemd[1]: Started cri-containerd-d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c.scope - libcontainer container d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c.
Jul 7 00:19:12.380921 kubelet[2498]: E0707 00:19:12.380336 2498 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.107.203.174:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-1-1232b7205a?timeout=10s\": dial tcp 91.107.203.174:6443: connect: connection refused" interval="800ms"
Jul 7 00:19:12.409804 containerd[1548]: time="2025-07-07T00:19:12.409591412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-1-1-1-1232b7205a,Uid:9025b911180694d43a681cfb2352862e,Namespace:kube-system,Attempt:0,} returns sandbox id \"ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be\""
Jul 7 00:19:12.418803 containerd[1548]: time="2025-07-07T00:19:12.418759045Z" level=info msg="CreateContainer within sandbox \"ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jul 7 00:19:12.427615 containerd[1548]: time="2025-07-07T00:19:12.426913627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-1-1-1-1232b7205a,Uid:de85395b64c2d4ba6a1bb8b18ffbb2c4,Namespace:kube-system,Attempt:0,} returns sandbox id \"a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604\""
Jul 7 00:19:12.433103 containerd[1548]: time="2025-07-07T00:19:12.432312894Z" level=info msg="CreateContainer within sandbox \"a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jul 7 00:19:12.436032 containerd[1548]: time="2025-07-07T00:19:12.435999179Z" level=info msg="Container 026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:19:12.440023 containerd[1548]: time="2025-07-07T00:19:12.439983309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-1-1-1-1232b7205a,Uid:fdf14af2b0b9088f0c14528332e0a414,Namespace:kube-system,Attempt:0,} returns sandbox id \"d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c\""
Jul 7 00:19:12.444125 containerd[1548]: time="2025-07-07T00:19:12.444092200Z" level=info msg="CreateContainer within sandbox \"d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jul 7 00:19:12.444448 containerd[1548]: time="2025-07-07T00:19:12.444427684Z" level=info msg="Container 9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:19:12.448353 containerd[1548]: time="2025-07-07T00:19:12.448307612Z" level=info msg="CreateContainer within sandbox \"ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c\""
Jul 7 00:19:12.449899 containerd[1548]: time="2025-07-07T00:19:12.449863831Z" level=info msg="StartContainer for \"026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c\""
Jul 7 00:19:12.451712 containerd[1548]: time="2025-07-07T00:19:12.451683654Z" level=info msg="connecting to shim 026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c" address="unix:///run/containerd/s/43fb172a97e8f19d2c06ba9c0117d302a2d80ca211ce2a0a04a0cc95865f8a30" protocol=ttrpc version=3
Jul 7 00:19:12.455762 containerd[1548]: time="2025-07-07T00:19:12.455718864Z" level=info msg="CreateContainer within sandbox \"a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8\""
Jul 7 00:19:12.456697 containerd[1548]: time="2025-07-07T00:19:12.456651996Z" level=info msg="StartContainer for \"9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8\""
Jul 7 00:19:12.460624 containerd[1548]: time="2025-07-07T00:19:12.460474643Z" level=info msg="connecting to shim 9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8" address="unix:///run/containerd/s/8d27ff861e9834afb6212e1d5f2c85e399f65591fa9ee8b076efa4a0ed42ec9e" protocol=ttrpc version=3
Jul 7 00:19:12.463625 containerd[1548]: time="2025-07-07T00:19:12.463370239Z" level=info msg="Container e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:19:12.479913 systemd[1]: Started cri-containerd-026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c.scope - libcontainer container 026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c.
Jul 7 00:19:12.481888 containerd[1548]: time="2025-07-07T00:19:12.481712427Z" level=info msg="CreateContainer within sandbox \"d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd\""
Jul 7 00:19:12.482594 containerd[1548]: time="2025-07-07T00:19:12.482409235Z" level=info msg="StartContainer for \"e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd\""
Jul 7 00:19:12.489361 containerd[1548]: time="2025-07-07T00:19:12.488807875Z" level=info msg="connecting to shim e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd" address="unix:///run/containerd/s/65a519d422198a37ac4c901aa92c313ee61b5c8496982607525ad21de5b8bf2d" protocol=ttrpc version=3
Jul 7 00:19:12.491680 systemd[1]: Started cri-containerd-9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8.scope - libcontainer container 9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8.
Jul 7 00:19:12.526580 systemd[1]: Started cri-containerd-e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd.scope - libcontainer container e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd.
Jul 7 00:19:12.552343 containerd[1548]: time="2025-07-07T00:19:12.551774736Z" level=info msg="StartContainer for \"026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c\" returns successfully"
Jul 7 00:19:12.576247 containerd[1548]: time="2025-07-07T00:19:12.575960116Z" level=info msg="StartContainer for \"9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8\" returns successfully"
Jul 7 00:19:12.578457 kubelet[2498]: I0707 00:19:12.578409 2498 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:12.580053 kubelet[2498]: E0707 00:19:12.580021 2498 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.107.203.174:6443/api/v1/nodes\": dial tcp 91.107.203.174:6443: connect: connection refused" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:12.625737 containerd[1548]: time="2025-07-07T00:19:12.625694573Z" level=info msg="StartContainer for \"e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd\" returns successfully"
Jul 7 00:19:13.384931 kubelet[2498]: I0707 00:19:13.384242 2498 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:14.891601 kubelet[2498]: E0707 00:19:14.891555 2498 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344-1-1-1-1232b7205a\" not found" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:14.986593 kubelet[2498]: I0707 00:19:14.986281 2498 kubelet_node_status.go:75] "Successfully registered node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:15.759012 kubelet[2498]: I0707 00:19:15.758935 2498 apiserver.go:52] "Watching apiserver"
Jul 7 00:19:15.779630 kubelet[2498]: I0707 00:19:15.779562 2498 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Jul 7 00:19:16.983136 systemd[1]: Reload requested from client PID 2769 ('systemctl') (unit session-7.scope)...
Jul 7 00:19:16.983625 systemd[1]: Reloading...
Jul 7 00:19:17.113540 zram_generator::config[2811]: No configuration found.
Jul 7 00:19:17.213077 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 7 00:19:17.331438 systemd[1]: Reloading finished in 347 ms.
Jul 7 00:19:17.369519 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 7 00:19:17.370227 kubelet[2498]: I0707 00:19:17.369893 2498 dynamic_cafile_content.go:174] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 7 00:19:17.384729 systemd[1]: kubelet.service: Deactivated successfully.
Jul 7 00:19:17.386961 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 7 00:19:17.387062 systemd[1]: kubelet.service: Consumed 1.191s CPU time, 125M memory peak.
Jul 7 00:19:17.392087 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 7 00:19:17.576457 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 7 00:19:17.588000 (kubelet)[2859]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jul 7 00:19:17.657114 kubelet[2859]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 7 00:19:17.658641 kubelet[2859]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jul 7 00:19:17.658641 kubelet[2859]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 7 00:19:17.658641 kubelet[2859]: I0707 00:19:17.658116 2859 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jul 7 00:19:17.667677 kubelet[2859]: I0707 00:19:17.667618 2859 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Jul 7 00:19:17.667677 kubelet[2859]: I0707 00:19:17.667651 2859 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jul 7 00:19:17.668029 kubelet[2859]: I0707 00:19:17.667926 2859 server.go:934] "Client rotation is on, will bootstrap in background"
Jul 7 00:19:17.669825 kubelet[2859]: I0707 00:19:17.669780 2859 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jul 7 00:19:17.673116 kubelet[2859]: I0707 00:19:17.673060 2859 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 7 00:19:17.682550 kubelet[2859]: I0707 00:19:17.682488 2859 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 7 00:19:17.687965 kubelet[2859]: I0707 00:19:17.687899 2859 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 7 00:19:17.688316 kubelet[2859]: I0707 00:19:17.688302 2859 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 7 00:19:17.688614 kubelet[2859]: I0707 00:19:17.688569 2859 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 7 00:19:17.688979 kubelet[2859]: I0707 00:19:17.688707 2859 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-1-1-1-1232b7205a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 7 00:19:17.689199 kubelet[2859]: I0707 00:19:17.689108 2859 topology_manager.go:138] "Creating topology manager with none policy"
Jul 7 00:19:17.689199 kubelet[2859]: I0707 00:19:17.689124 2859 container_manager_linux.go:300] "Creating device plugin manager"
Jul 7 00:19:17.689199 kubelet[2859]: I0707 00:19:17.689166 2859 state_mem.go:36] "Initialized new in-memory state store"
Jul 7 00:19:17.689521 kubelet[2859]: I0707 00:19:17.689467 2859 kubelet.go:408] "Attempting to sync node with API server"
Jul 7 00:19:17.689521 kubelet[2859]: I0707 00:19:17.689485 2859 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 7 00:19:17.689720 kubelet[2859]: I0707 00:19:17.689708 2859 kubelet.go:314] "Adding apiserver pod source"
Jul 7 00:19:17.689802 kubelet[2859]: I0707 00:19:17.689793 2859 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 7 00:19:17.692401 kubelet[2859]: I0707 00:19:17.692380 2859 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Jul 7 00:19:17.693757 kubelet[2859]: I0707 00:19:17.693649 2859 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 7 00:19:17.695043 kubelet[2859]: I0707 00:19:17.695024 2859 server.go:1274] "Started kubelet"
Jul 7 00:19:17.699375 kubelet[2859]: I0707 00:19:17.699347 2859 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 7 00:19:17.706748 kubelet[2859]: I0707 00:19:17.706696 2859 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 7 00:19:17.708180 kubelet[2859]: I0707 00:19:17.708151 2859 server.go:449] "Adding debug handlers to kubelet server"
Jul 7 00:19:17.711207 kubelet[2859]: I0707 00:19:17.711154 2859 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 7 00:19:17.711571 kubelet[2859]: I0707 00:19:17.711553 2859 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 7 00:19:17.711942 kubelet[2859]: I0707 00:19:17.711903 2859 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jul 7 00:19:17.712253 kubelet[2859]: E0707 00:19:17.712223 2859 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4344-1-1-1-1232b7205a\" not found"
Jul 7 00:19:17.713796 kubelet[2859]: I0707 00:19:17.711907 2859 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 7 00:19:17.715649 kubelet[2859]: I0707 00:19:17.714770 2859 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Jul 7 00:19:17.716664 kubelet[2859]: I0707 00:19:17.716639 2859 factory.go:221] Registration of the systemd container factory successfully
Jul 7 00:19:17.716933 kubelet[2859]: I0707 00:19:17.716905 2859 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jul 7 00:19:17.717317 kubelet[2859]: E0707 00:19:17.717297 2859 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jul 7 00:19:17.719270 kubelet[2859]: I0707 00:19:17.719249 2859 factory.go:221] Registration of the containerd container factory successfully
Jul 7 00:19:17.719428 kubelet[2859]: I0707 00:19:17.719392 2859 reconciler.go:26] "Reconciler: start to sync state"
Jul 7 00:19:17.723470 kubelet[2859]: I0707 00:19:17.723411 2859 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jul 7 00:19:17.739038 kubelet[2859]: I0707 00:19:17.738989 2859 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jul 7 00:19:17.739219 kubelet[2859]: I0707 00:19:17.739208 2859 status_manager.go:217] "Starting to sync pod status with apiserver"
Jul 7 00:19:17.739470 kubelet[2859]: I0707 00:19:17.739453 2859 kubelet.go:2321] "Starting kubelet main sync loop"
Jul 7 00:19:17.739783 kubelet[2859]: E0707 00:19:17.739698 2859 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jul 7 00:19:17.803484 kubelet[2859]: I0707 00:19:17.803436 2859 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jul 7 00:19:17.803484 kubelet[2859]: I0707 00:19:17.803457 2859 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jul 7 00:19:17.803484 kubelet[2859]: I0707 00:19:17.803479 2859 state_mem.go:36] "Initialized new in-memory state store"
Jul 7 00:19:17.803789 kubelet[2859]: I0707 00:19:17.803666 2859 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jul 7 00:19:17.803789 kubelet[2859]: I0707 00:19:17.803678 2859 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jul 7 00:19:17.803789 kubelet[2859]: I0707 00:19:17.803697 2859 policy_none.go:49] "None policy: Start"
Jul 7 00:19:17.805067 kubelet[2859]: I0707 00:19:17.805020 2859 memory_manager.go:170] "Starting memorymanager" policy="None"
Jul 7 00:19:17.805067 kubelet[2859]: I0707 00:19:17.805065 2859 state_mem.go:35] "Initializing new in-memory state store"
Jul 7 00:19:17.805256 kubelet[2859]: I0707 00:19:17.805230 2859 state_mem.go:75] "Updated machine memory state"
Jul 7 00:19:17.812320 kubelet[2859]: I0707 00:19:17.811664 2859 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jul 7 00:19:17.812320 kubelet[2859]: I0707 00:19:17.811880 2859 eviction_manager.go:189] "Eviction manager: starting control loop"
Jul 7 00:19:17.812320 kubelet[2859]: I0707 00:19:17.811893 2859 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jul 7 00:19:17.812320 kubelet[2859]: I0707 00:19:17.812142 2859 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jul 7 00:19:17.917793 kubelet[2859]: I0707 00:19:17.917692 2859 kubelet_node_status.go:72] "Attempting to register node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.919940 kubelet[2859]: I0707 00:19:17.919894 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/de85395b64c2d4ba6a1bb8b18ffbb2c4-ca-certs\") pod \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" (UID: \"de85395b64c2d4ba6a1bb8b18ffbb2c4\") " pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.920126 kubelet[2859]: I0707 00:19:17.920107 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.920620 kubelet[2859]: I0707 00:19:17.920579 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.920751 kubelet[2859]: I0707 00:19:17.920735 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/de85395b64c2d4ba6a1bb8b18ffbb2c4-k8s-certs\") pod \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" (UID: \"de85395b64c2d4ba6a1bb8b18ffbb2c4\") " pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.921133 kubelet[2859]: I0707 00:19:17.921115 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/de85395b64c2d4ba6a1bb8b18ffbb2c4-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" (UID: \"de85395b64c2d4ba6a1bb8b18ffbb2c4\") " pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.921250 kubelet[2859]: I0707 00:19:17.921235 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-ca-certs\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.921341 kubelet[2859]: I0707 00:19:17.921328 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-k8s-certs\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.922032 kubelet[2859]: I0707 00:19:17.921433 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9025b911180694d43a681cfb2352862e-kubeconfig\") pod \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" (UID: \"9025b911180694d43a681cfb2352862e\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.922187 kubelet[2859]: I0707 00:19:17.921458 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fdf14af2b0b9088f0c14528332e0a414-kubeconfig\") pod \"kube-scheduler-ci-4344-1-1-1-1232b7205a\" (UID: \"fdf14af2b0b9088f0c14528332e0a414\") " pod="kube-system/kube-scheduler-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.929841 kubelet[2859]: I0707 00:19:17.929803 2859 kubelet_node_status.go:111] "Node was previously registered" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:17.932585 kubelet[2859]: I0707 00:19:17.929906 2859 kubelet_node_status.go:75] "Successfully registered node" node="ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:18.702170 kubelet[2859]: I0707 00:19:18.701775 2859 apiserver.go:52] "Watching apiserver"
Jul 7 00:19:18.715941 kubelet[2859]: I0707 00:19:18.715889 2859 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Jul 7 00:19:18.799216 kubelet[2859]: E0707 00:19:18.799169 2859 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4344-1-1-1-1232b7205a\" already exists" pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:18.803213 kubelet[2859]: E0707 00:19:18.803161 2859 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4344-1-1-1-1232b7205a\" already exists" pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a"
Jul 7 00:19:18.831498 kubelet[2859]: I0707 00:19:18.831149 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344-1-1-1-1232b7205a" podStartSLOduration=1.831110242 podStartE2EDuration="1.831110242s" podCreationTimestamp="2025-07-07 00:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:19:18.830114112 +0000 UTC m=+1.236290760" watchObservedRunningTime="2025-07-07 00:19:18.831110242 +0000 UTC m=+1.237286850"
Jul 7 00:19:18.842479 kubelet[2859]: I0707 00:19:18.841901 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344-1-1-1-1232b7205a" podStartSLOduration=1.841877921 podStartE2EDuration="1.841877921s" podCreationTimestamp="2025-07-07 00:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:19:18.841115352 +0000 UTC m=+1.247291960" watchObservedRunningTime="2025-07-07 00:19:18.841877921 +0000 UTC m=+1.248054569"
Jul 7 00:19:18.877538 kubelet[2859]: I0707 00:19:18.877392 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344-1-1-1-1232b7205a" podStartSLOduration=1.877374551 podStartE2EDuration="1.877374551s" podCreationTimestamp="2025-07-07 00:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:19:18.85815642 +0000 UTC m=+1.264333068" watchObservedRunningTime="2025-07-07 00:19:18.877374551 +0000 UTC m=+1.283551159"
Jul 7 00:19:22.028462 kubelet[2859]: I0707 00:19:22.028407 2859 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jul 7 00:19:22.029259 containerd[1548]: time="2025-07-07T00:19:22.029210099Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jul 7 00:19:22.029720 kubelet[2859]: I0707 00:19:22.029542 2859 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jul 7 00:19:22.904417 systemd[1]: Created slice kubepods-besteffort-pod67fca619_885f_4881_af95_cd55dab4460f.slice - libcontainer container kubepods-besteffort-pod67fca619_885f_4881_af95_cd55dab4460f.slice.
Jul 7 00:19:22.953050 kubelet[2859]: I0707 00:19:22.952958 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/67fca619-885f-4881-af95-cd55dab4460f-kube-proxy\") pod \"kube-proxy-64sx6\" (UID: \"67fca619-885f-4881-af95-cd55dab4460f\") " pod="kube-system/kube-proxy-64sx6"
Jul 7 00:19:22.953533 kubelet[2859]: I0707 00:19:22.953422 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/67fca619-885f-4881-af95-cd55dab4460f-xtables-lock\") pod \"kube-proxy-64sx6\" (UID: \"67fca619-885f-4881-af95-cd55dab4460f\") " pod="kube-system/kube-proxy-64sx6"
Jul 7 00:19:22.953706 kubelet[2859]: I0707 00:19:22.953682 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/67fca619-885f-4881-af95-cd55dab4460f-lib-modules\") pod \"kube-proxy-64sx6\" (UID: \"67fca619-885f-4881-af95-cd55dab4460f\") " pod="kube-system/kube-proxy-64sx6"
Jul 7 00:19:22.954022 kubelet[2859]: I0707 00:19:22.953967 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hsnt\" (UniqueName: \"kubernetes.io/projected/67fca619-885f-4881-af95-cd55dab4460f-kube-api-access-6hsnt\") pod \"kube-proxy-64sx6\" (UID: \"67fca619-885f-4881-af95-cd55dab4460f\") " pod="kube-system/kube-proxy-64sx6"
Jul 7 00:19:23.043680 systemd[1]: Created slice kubepods-besteffort-pod03c81a6a_e716_4f95_9626_58946f83779c.slice - libcontainer container kubepods-besteffort-pod03c81a6a_e716_4f95_9626_58946f83779c.slice.
Jul 7 00:19:23.054578 kubelet[2859]: I0707 00:19:23.054127 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/03c81a6a-e716-4f95-9626-58946f83779c-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-qnq2v\" (UID: \"03c81a6a-e716-4f95-9626-58946f83779c\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-qnq2v"
Jul 7 00:19:23.054578 kubelet[2859]: I0707 00:19:23.054189 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjtxd\" (UniqueName: \"kubernetes.io/projected/03c81a6a-e716-4f95-9626-58946f83779c-kube-api-access-vjtxd\") pod \"tigera-operator-5bf8dfcb4-qnq2v\" (UID: \"03c81a6a-e716-4f95-9626-58946f83779c\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-qnq2v"
Jul 7 00:19:23.216634 containerd[1548]: time="2025-07-07T00:19:23.216330754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-64sx6,Uid:67fca619-885f-4881-af95-cd55dab4460f,Namespace:kube-system,Attempt:0,}"
Jul 7 00:19:23.241403 containerd[1548]: time="2025-07-07T00:19:23.241006681Z" level=info msg="connecting to shim d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5" address="unix:///run/containerd/s/399f01020822e272441a1208e11bc58bc97f85d206671bb06e2ec3b99873d7df" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:19:23.272768 systemd[1]: Started cri-containerd-d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5.scope - libcontainer container d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5.
Jul 7 00:19:23.304310 containerd[1548]: time="2025-07-07T00:19:23.304181553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-64sx6,Uid:67fca619-885f-4881-af95-cd55dab4460f,Namespace:kube-system,Attempt:0,} returns sandbox id \"d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5\""
Jul 7 00:19:23.309580 containerd[1548]: time="2025-07-07T00:19:23.309494046Z" level=info msg="CreateContainer within sandbox \"d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jul 7 00:19:23.324730 containerd[1548]: time="2025-07-07T00:19:23.323388905Z" level=info msg="Container a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:19:23.335448 containerd[1548]: time="2025-07-07T00:19:23.335382025Z" level=info msg="CreateContainer within sandbox \"d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c\""
Jul 7 00:19:23.336582 containerd[1548]: time="2025-07-07T00:19:23.336473436Z" level=info msg="StartContainer for \"a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c\""
Jul 7 00:19:23.338248 containerd[1548]: time="2025-07-07T00:19:23.338207413Z" level=info msg="connecting to shim a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c" address="unix:///run/containerd/s/399f01020822e272441a1208e11bc58bc97f85d206671bb06e2ec3b99873d7df" protocol=ttrpc version=3
Jul 7 00:19:23.348621 containerd[1548]: time="2025-07-07T00:19:23.348568837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-qnq2v,Uid:03c81a6a-e716-4f95-9626-58946f83779c,Namespace:tigera-operator,Attempt:0,}"
Jul 7 00:19:23.361745 systemd[1]: Started cri-containerd-a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c.scope - libcontainer container a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c.
Jul 7 00:19:23.375231 containerd[1548]: time="2025-07-07T00:19:23.374717739Z" level=info msg="connecting to shim 36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a" address="unix:///run/containerd/s/198060c2a201520b7cc42f21b2d859c91821eec2a4b2e44c4a4a25eaa228af5a" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:19:23.404737 systemd[1]: Started cri-containerd-36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a.scope - libcontainer container 36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a.
Jul 7 00:19:23.423014 containerd[1548]: time="2025-07-07T00:19:23.422968861Z" level=info msg="StartContainer for \"a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c\" returns successfully"
Jul 7 00:19:23.460573 containerd[1548]: time="2025-07-07T00:19:23.460157513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-qnq2v,Uid:03c81a6a-e716-4f95-9626-58946f83779c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a\""
Jul 7 00:19:23.462045 containerd[1548]: time="2025-07-07T00:19:23.461984051Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Jul 7 00:19:24.075363 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount40398941.mount: Deactivated successfully.
Jul 7 00:19:25.137432 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3849732224.mount: Deactivated successfully.
Jul 7 00:19:25.396097 kubelet[2859]: I0707 00:19:25.395955 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-64sx6" podStartSLOduration=3.395934078 podStartE2EDuration="3.395934078s" podCreationTimestamp="2025-07-07 00:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:19:23.828856561 +0000 UTC m=+6.235033169" watchObservedRunningTime="2025-07-07 00:19:25.395934078 +0000 UTC m=+7.802110686"
Jul 7 00:19:25.770005 containerd[1548]: time="2025-07-07T00:19:25.769845806Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:19:25.771217 containerd[1548]: time="2025-07-07T00:19:25.771067858Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610"
Jul 7 00:19:25.772595 containerd[1548]: time="2025-07-07T00:19:25.772557272Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:19:25.775556 containerd[1548]: time="2025-07-07T00:19:25.775355739Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:19:25.776218 containerd[1548]: time="2025-07-07T00:19:25.776145907Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.314116135s"
Jul 7 00:19:25.776218 containerd[1548]: time="2025-07-07T00:19:25.776184067Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\""
Jul 7 00:19:25.780055 containerd[1548]: time="2025-07-07T00:19:25.779556220Z" level=info msg="CreateContainer within sandbox \"36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jul 7 00:19:25.790464 containerd[1548]: time="2025-07-07T00:19:25.790416245Z" level=info msg="Container c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:19:25.800609 containerd[1548]: time="2025-07-07T00:19:25.800444942Z" level=info msg="CreateContainer within sandbox \"36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\""
Jul 7 00:19:25.803603 containerd[1548]: time="2025-07-07T00:19:25.803543091Z" level=info msg="StartContainer for \"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\""
Jul 7 00:19:25.806005 containerd[1548]: time="2025-07-07T00:19:25.805952235Z" level=info msg="connecting to shim c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b" address="unix:///run/containerd/s/198060c2a201520b7cc42f21b2d859c91821eec2a4b2e44c4a4a25eaa228af5a" protocol=ttrpc version=3
Jul 7 00:19:25.843817 systemd[1]: Started cri-containerd-c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b.scope - libcontainer container c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b.
Jul 7 00:19:25.885731 containerd[1548]: time="2025-07-07T00:19:25.885664364Z" level=info msg="StartContainer for \"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\" returns successfully"
Jul 7 00:19:26.836864 kubelet[2859]: I0707 00:19:26.836661 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-qnq2v" podStartSLOduration=2.520536206 podStartE2EDuration="4.83663816s" podCreationTimestamp="2025-07-07 00:19:22 +0000 UTC" firstStartedPulling="2025-07-07 00:19:23.461304045 +0000 UTC m=+5.867480613" lastFinishedPulling="2025-07-07 00:19:25.777405999 +0000 UTC m=+8.183582567" observedRunningTime="2025-07-07 00:19:26.835452148 +0000 UTC m=+9.241628796" watchObservedRunningTime="2025-07-07 00:19:26.83663816 +0000 UTC m=+9.242814728"
Jul 7 00:19:32.161737 sudo[1942]: pam_unix(sudo:session): session closed for user root
Jul 7 00:19:32.339410 sshd[1941]: Connection closed by 139.178.89.65 port 48688
Jul 7 00:19:32.340715 sshd-session[1939]: pam_unix(sshd:session): session closed for user core
Jul 7 00:19:32.347300 systemd[1]: sshd@8-91.107.203.174:22-139.178.89.65:48688.service: Deactivated successfully.
Jul 7 00:19:32.352835 systemd[1]: session-7.scope: Deactivated successfully.
Jul 7 00:19:32.353268 systemd[1]: session-7.scope: Consumed 7.329s CPU time, 229.1M memory peak.
Jul 7 00:19:32.356154 systemd-logind[1498]: Session 7 logged out. Waiting for processes to exit.
Jul 7 00:19:32.360936 systemd-logind[1498]: Removed session 7.
Jul 7 00:19:40.867094 systemd[1]: Created slice kubepods-besteffort-pod0fe0b531_3220_4b8c_9e5f_e9b5493fca5b.slice - libcontainer container kubepods-besteffort-pod0fe0b531_3220_4b8c_9e5f_e9b5493fca5b.slice.
Jul 7 00:19:40.973493 kubelet[2859]: I0707 00:19:40.973355 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe0b531-3220-4b8c-9e5f-e9b5493fca5b-tigera-ca-bundle\") pod \"calico-typha-578989bf9c-xt49b\" (UID: \"0fe0b531-3220-4b8c-9e5f-e9b5493fca5b\") " pod="calico-system/calico-typha-578989bf9c-xt49b"
Jul 7 00:19:40.973493 kubelet[2859]: I0707 00:19:40.973419 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjcfk\" (UniqueName: \"kubernetes.io/projected/0fe0b531-3220-4b8c-9e5f-e9b5493fca5b-kube-api-access-hjcfk\") pod \"calico-typha-578989bf9c-xt49b\" (UID: \"0fe0b531-3220-4b8c-9e5f-e9b5493fca5b\") " pod="calico-system/calico-typha-578989bf9c-xt49b"
Jul 7 00:19:40.973493 kubelet[2859]: I0707 00:19:40.973450 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0fe0b531-3220-4b8c-9e5f-e9b5493fca5b-typha-certs\") pod \"calico-typha-578989bf9c-xt49b\" (UID: \"0fe0b531-3220-4b8c-9e5f-e9b5493fca5b\") " pod="calico-system/calico-typha-578989bf9c-xt49b"
Jul 7 00:19:41.127581 systemd[1]: Created slice kubepods-besteffort-pod8ae621e3_908e_4b95_96d4_a460265cd348.slice - libcontainer container kubepods-besteffort-pod8ae621e3_908e_4b95_96d4_a460265cd348.slice.
Jul 7 00:19:41.171369 containerd[1548]: time="2025-07-07T00:19:41.171319775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-578989bf9c-xt49b,Uid:0fe0b531-3220-4b8c-9e5f-e9b5493fca5b,Namespace:calico-system,Attempt:0,}"
Jul 7 00:19:41.176815 kubelet[2859]: I0707 00:19:41.176683 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-cni-bin-dir\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.176815 kubelet[2859]: I0707 00:19:41.176770 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-var-run-calico\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.176815 kubelet[2859]: I0707 00:19:41.176790 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxsfn\" (UniqueName: \"kubernetes.io/projected/8ae621e3-908e-4b95-96d4-a460265cd348-kube-api-access-zxsfn\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.176995 kubelet[2859]: I0707 00:19:41.176812 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-cni-net-dir\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.176995 kubelet[2859]: I0707 00:19:41.176866 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ae621e3-908e-4b95-96d4-a460265cd348-tigera-ca-bundle\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.176995 kubelet[2859]: I0707 00:19:41.176899 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-var-lib-calico\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.177071 kubelet[2859]: I0707 00:19:41.177013 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-policysync\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.177096 kubelet[2859]: I0707 00:19:41.177084 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8ae621e3-908e-4b95-96d4-a460265cd348-node-certs\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.177119 kubelet[2859]: I0707 00:19:41.177103 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-xtables-lock\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.177846 kubelet[2859]: I0707 00:19:41.177242 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-cni-log-dir\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.177846 kubelet[2859]: I0707 00:19:41.177391 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-flexvol-driver-host\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.177846 kubelet[2859]: I0707 00:19:41.177423 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ae621e3-908e-4b95-96d4-a460265cd348-lib-modules\") pod \"calico-node-565mc\" (UID: \"8ae621e3-908e-4b95-96d4-a460265cd348\") " pod="calico-system/calico-node-565mc"
Jul 7 00:19:41.206789 containerd[1548]: time="2025-07-07T00:19:41.205648673Z" level=info msg="connecting to shim 245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce" address="unix:///run/containerd/s/80bf65ec70c7ed3756b907279e9c4730ffc84aefbe887ad5fa13ede781e0776b" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:19:41.246916 systemd[1]: Started cri-containerd-245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce.scope - libcontainer container 245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce.
Jul 7 00:19:41.280531 kubelet[2859]: E0707 00:19:41.280108 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.280531 kubelet[2859]: W0707 00:19:41.280378 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.280531 kubelet[2859]: E0707 00:19:41.280404 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.281761 kubelet[2859]: E0707 00:19:41.281656 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.282390 kubelet[2859]: W0707 00:19:41.282015 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.282390 kubelet[2859]: E0707 00:19:41.282050 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.285607 kubelet[2859]: E0707 00:19:41.285488 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.285607 kubelet[2859]: W0707 00:19:41.285533 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.285710 kubelet[2859]: E0707 00:19:41.285618 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.286283 kubelet[2859]: E0707 00:19:41.286246 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.286283 kubelet[2859]: W0707 00:19:41.286265 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.287391 kubelet[2859]: E0707 00:19:41.286536 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.288897 kubelet[2859]: E0707 00:19:41.288875 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.289352 kubelet[2859]: W0707 00:19:41.289087 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.289567 kubelet[2859]: E0707 00:19:41.289549 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.290372 kubelet[2859]: E0707 00:19:41.290237 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.290372 kubelet[2859]: W0707 00:19:41.290252 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.290372 kubelet[2859]: E0707 00:19:41.290295 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.291527 kubelet[2859]: E0707 00:19:41.291429 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.291527 kubelet[2859]: W0707 00:19:41.291447 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.291527 kubelet[2859]: E0707 00:19:41.291518 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.292555 kubelet[2859]: E0707 00:19:41.292536 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.292717 kubelet[2859]: W0707 00:19:41.292636 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.292831 kubelet[2859]: E0707 00:19:41.292769 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.293525 kubelet[2859]: E0707 00:19:41.293445 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.293525 kubelet[2859]: W0707 00:19:41.293462 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.294680 kubelet[2859]: E0707 00:19:41.293617 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.294871 kubelet[2859]: E0707 00:19:41.294856 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.295000 kubelet[2859]: W0707 00:19:41.294946 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.295105 kubelet[2859]: E0707 00:19:41.295048 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.295253 kubelet[2859]: E0707 00:19:41.295242 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.295322 kubelet[2859]: W0707 00:19:41.295306 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.295404 kubelet[2859]: E0707 00:19:41.295384 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.295664 kubelet[2859]: E0707 00:19:41.295652 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.295764 kubelet[2859]: W0707 00:19:41.295720 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.295911 kubelet[2859]: E0707 00:19:41.295898 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.296645 kubelet[2859]: E0707 00:19:41.296586 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.296645 kubelet[2859]: W0707 00:19:41.296601 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.296731 kubelet[2859]: E0707 00:19:41.296642 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.297048 kubelet[2859]: E0707 00:19:41.297031 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.297265 kubelet[2859]: W0707 00:19:41.297175 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.297265 kubelet[2859]: E0707 00:19:41.297227 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.298454 kubelet[2859]: E0707 00:19:41.298412 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.298454 kubelet[2859]: W0707 00:19:41.298430 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.298636 kubelet[2859]: E0707 00:19:41.298540 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.298864 kubelet[2859]: E0707 00:19:41.298842 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.298864 kubelet[2859]: W0707 00:19:41.298858 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.299074 kubelet[2859]: E0707 00:19:41.299032 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.299432 kubelet[2859]: E0707 00:19:41.299414 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.299432 kubelet[2859]: W0707 00:19:41.299428 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.299612 kubelet[2859]: E0707 00:19:41.299486 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.299612 kubelet[2859]: E0707 00:19:41.299566 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.299612 kubelet[2859]: W0707 00:19:41.299574 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.299747 kubelet[2859]: E0707 00:19:41.299721 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.299747 kubelet[2859]: W0707 00:19:41.299743 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.299965 kubelet[2859]: E0707 00:19:41.299724 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.299965 kubelet[2859]: E0707 00:19:41.299785 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.300110 kubelet[2859]: E0707 00:19:41.300093 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.300110 kubelet[2859]: W0707 00:19:41.300107 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.300377 kubelet[2859]: E0707 00:19:41.300306 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.300481 kubelet[2859]: E0707 00:19:41.300453 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.300481 kubelet[2859]: W0707 00:19:41.300471 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.300481 kubelet[2859]: E0707 00:19:41.300496 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.300886 kubelet[2859]: E0707 00:19:41.300823 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.300886 kubelet[2859]: W0707 00:19:41.300833 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.300886 kubelet[2859]: E0707 00:19:41.300846 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.301495 kubelet[2859]: E0707 00:19:41.301478 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.301875 kubelet[2859]: W0707 00:19:41.301848 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.302167 kubelet[2859]: E0707 00:19:41.301972 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.302672 kubelet[2859]: E0707 00:19:41.302569 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.303577 kubelet[2859]: W0707 00:19:41.303496 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.303577 kubelet[2859]: E0707 00:19:41.303543 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.304157 kubelet[2859]: E0707 00:19:41.304125 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.304157 kubelet[2859]: W0707 00:19:41.304140 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.304157 kubelet[2859]: E0707 00:19:41.304158 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.304733 kubelet[2859]: E0707 00:19:41.304580 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.304733 kubelet[2859]: W0707 00:19:41.304691 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.304733 kubelet[2859]: E0707 00:19:41.304708 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.320786 kubelet[2859]: E0707 00:19:41.320750 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.321167 kubelet[2859]: W0707 00:19:41.321100 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.321167 kubelet[2859]: E0707 00:19:41.321128 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.322321 kubelet[2859]: E0707 00:19:41.322199 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8kl6f" podUID="a5dfbb9f-cbfa-4ab7-b6be-7e804154425e"
Jul 7 00:19:41.362352 kubelet[2859]: E0707 00:19:41.362316 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.362352 kubelet[2859]: W0707 00:19:41.362342 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.362365 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.363667 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.364051 kubelet[2859]: W0707 00:19:41.363684 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.363702 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.363873 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.364051 kubelet[2859]: W0707 00:19:41.363881 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.363889 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.364014 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.364051 kubelet[2859]: W0707 00:19:41.364025 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364051 kubelet[2859]: E0707 00:19:41.364032 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.364773 kubelet[2859]: E0707 00:19:41.364199 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.364773 kubelet[2859]: W0707 00:19:41.364207 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364773 kubelet[2859]: E0707 00:19:41.364214 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.364773 kubelet[2859]: E0707 00:19:41.364362 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.364773 kubelet[2859]: W0707 00:19:41.364373 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364773 kubelet[2859]: E0707 00:19:41.364385 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.364773 kubelet[2859]: E0707 00:19:41.364588 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.364773 kubelet[2859]: W0707 00:19:41.364598 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.364773 kubelet[2859]: E0707 00:19:41.364609 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.366644 kubelet[2859]: E0707 00:19:41.366619 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.366644 kubelet[2859]: W0707 00:19:41.366638 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.366829 kubelet[2859]: E0707 00:19:41.366654 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.366899 kubelet[2859]: E0707 00:19:41.366841 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.366899 kubelet[2859]: W0707 00:19:41.366849 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.366899 kubelet[2859]: E0707 00:19:41.366858 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367079 kubelet[2859]: E0707 00:19:41.366987 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367079 kubelet[2859]: W0707 00:19:41.366995 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367079 kubelet[2859]: E0707 00:19:41.367003 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367198 kubelet[2859]: E0707 00:19:41.367123 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367198 kubelet[2859]: W0707 00:19:41.367131 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367198 kubelet[2859]: E0707 00:19:41.367138 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367313 kubelet[2859]: E0707 00:19:41.367246 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367313 kubelet[2859]: W0707 00:19:41.367253 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367313 kubelet[2859]: E0707 00:19:41.367260 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367431 kubelet[2859]: E0707 00:19:41.367368 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367431 kubelet[2859]: W0707 00:19:41.367375 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367431 kubelet[2859]: E0707 00:19:41.367382 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367582 kubelet[2859]: E0707 00:19:41.367482 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367582 kubelet[2859]: W0707 00:19:41.367488 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367582 kubelet[2859]: E0707 00:19:41.367495 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367811 kubelet[2859]: E0707 00:19:41.367623 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367811 kubelet[2859]: W0707 00:19:41.367629 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367811 kubelet[2859]: E0707 00:19:41.367637 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.367811 kubelet[2859]: E0707 00:19:41.367736 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.367811 kubelet[2859]: W0707 00:19:41.367743 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.367811 kubelet[2859]: E0707 00:19:41.367750 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.368125 kubelet[2859]: E0707 00:19:41.367861 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.368125 kubelet[2859]: W0707 00:19:41.367868 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.368125 kubelet[2859]: E0707 00:19:41.367874 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.368125 kubelet[2859]: E0707 00:19:41.367998 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.368125 kubelet[2859]: W0707 00:19:41.368006 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.368125 kubelet[2859]: E0707 00:19:41.368013 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.368125 kubelet[2859]: E0707 00:19:41.368108 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.368125 kubelet[2859]: W0707 00:19:41.368114 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.368125 kubelet[2859]: E0707 00:19:41.368120 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 7 00:19:41.368670 kubelet[2859]: E0707 00:19:41.368649 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 7 00:19:41.368670 kubelet[2859]: W0707 00:19:41.368666 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 7 00:19:41.368797 kubelet[2859]: E0707 00:19:41.368677 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.381544 kubelet[2859]: E0707 00:19:41.380579 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.381544 kubelet[2859]: W0707 00:19:41.380777 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.381544 kubelet[2859]: E0707 00:19:41.380806 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.382191 kubelet[2859]: I0707 00:19:41.381833 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a5dfbb9f-cbfa-4ab7-b6be-7e804154425e-socket-dir\") pod \"csi-node-driver-8kl6f\" (UID: \"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e\") " pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:41.383233 kubelet[2859]: E0707 00:19:41.382680 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.383233 kubelet[2859]: W0707 00:19:41.382718 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.383233 kubelet[2859]: E0707 00:19:41.382744 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.383233 kubelet[2859]: E0707 00:19:41.382963 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.383233 kubelet[2859]: W0707 00:19:41.382973 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.383233 kubelet[2859]: E0707 00:19:41.382985 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.384734 kubelet[2859]: E0707 00:19:41.384707 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.384734 kubelet[2859]: W0707 00:19:41.384727 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.385699 kubelet[2859]: E0707 00:19:41.384745 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.385699 kubelet[2859]: I0707 00:19:41.384799 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a5dfbb9f-cbfa-4ab7-b6be-7e804154425e-varrun\") pod \"csi-node-driver-8kl6f\" (UID: \"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e\") " pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:41.385699 kubelet[2859]: E0707 00:19:41.385075 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.385699 kubelet[2859]: W0707 00:19:41.385087 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.385699 kubelet[2859]: E0707 00:19:41.385104 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.385699 kubelet[2859]: I0707 00:19:41.385120 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq584\" (UniqueName: \"kubernetes.io/projected/a5dfbb9f-cbfa-4ab7-b6be-7e804154425e-kube-api-access-vq584\") pod \"csi-node-driver-8kl6f\" (UID: \"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e\") " pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:41.385699 kubelet[2859]: E0707 00:19:41.385248 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.385699 kubelet[2859]: W0707 00:19:41.385256 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.387842 kubelet[2859]: E0707 00:19:41.385264 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.387842 kubelet[2859]: I0707 00:19:41.385277 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a5dfbb9f-cbfa-4ab7-b6be-7e804154425e-kubelet-dir\") pod \"csi-node-driver-8kl6f\" (UID: \"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e\") " pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:41.387842 kubelet[2859]: E0707 00:19:41.385388 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.387842 kubelet[2859]: W0707 00:19:41.385395 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.387842 kubelet[2859]: E0707 00:19:41.385404 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.387842 kubelet[2859]: I0707 00:19:41.385417 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a5dfbb9f-cbfa-4ab7-b6be-7e804154425e-registration-dir\") pod \"csi-node-driver-8kl6f\" (UID: \"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e\") " pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:41.387842 kubelet[2859]: E0707 00:19:41.386545 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.387842 kubelet[2859]: W0707 00:19:41.386560 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.386580 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.386713 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389060 kubelet[2859]: W0707 00:19:41.386720 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.386727 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.386827 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389060 kubelet[2859]: W0707 00:19:41.386837 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.386844 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.386976 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389060 kubelet[2859]: W0707 00:19:41.387017 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389060 kubelet[2859]: E0707 00:19:41.387027 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.387802 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389301 kubelet[2859]: W0707 00:19:41.387816 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.387830 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.388057 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389301 kubelet[2859]: W0707 00:19:41.388066 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.388076 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.388330 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389301 kubelet[2859]: W0707 00:19:41.388341 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.388352 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.389301 kubelet[2859]: E0707 00:19:41.388829 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.389513 kubelet[2859]: W0707 00:19:41.388840 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.389513 kubelet[2859]: E0707 00:19:41.388852 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.401526 containerd[1548]: time="2025-07-07T00:19:41.401415101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-578989bf9c-xt49b,Uid:0fe0b531-3220-4b8c-9e5f-e9b5493fca5b,Namespace:calico-system,Attempt:0,} returns sandbox id \"245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce\"" Jul 7 00:19:41.404116 containerd[1548]: time="2025-07-07T00:19:41.404088481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 7 00:19:41.434041 containerd[1548]: time="2025-07-07T00:19:41.433709783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-565mc,Uid:8ae621e3-908e-4b95-96d4-a460265cd348,Namespace:calico-system,Attempt:0,}" Jul 7 00:19:41.462829 containerd[1548]: time="2025-07-07T00:19:41.462575280Z" level=info msg="connecting to shim e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71" address="unix:///run/containerd/s/7dd369a16bcb4762e0966b3b239f7b3712c2ea27903d486031e0ae7f864dfb49" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:19:41.487440 kubelet[2859]: E0707 00:19:41.487413 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.487870 kubelet[2859]: W0707 00:19:41.487805 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.488857 kubelet[2859]: E0707 00:19:41.488148 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.488857 kubelet[2859]: E0707 00:19:41.488735 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.488857 kubelet[2859]: W0707 00:19:41.488766 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.488857 kubelet[2859]: E0707 00:19:41.488782 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.489042 kubelet[2859]: E0707 00:19:41.489017 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.489042 kubelet[2859]: W0707 00:19:41.489027 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.489042 kubelet[2859]: E0707 00:19:41.489047 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.489391 kubelet[2859]: E0707 00:19:41.489370 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.489391 kubelet[2859]: W0707 00:19:41.489385 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.489593 kubelet[2859]: E0707 00:19:41.489549 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.490889 systemd[1]: Started cri-containerd-e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71.scope - libcontainer container e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71. Jul 7 00:19:41.491301 kubelet[2859]: E0707 00:19:41.491174 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.491301 kubelet[2859]: W0707 00:19:41.491186 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.491301 kubelet[2859]: E0707 00:19:41.491239 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.492175 kubelet[2859]: E0707 00:19:41.491493 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.492175 kubelet[2859]: W0707 00:19:41.491642 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.492175 kubelet[2859]: E0707 00:19:41.491655 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.493860 kubelet[2859]: E0707 00:19:41.492805 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.493860 kubelet[2859]: W0707 00:19:41.492827 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.493860 kubelet[2859]: E0707 00:19:41.492891 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.493860 kubelet[2859]: E0707 00:19:41.493356 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.493860 kubelet[2859]: W0707 00:19:41.493371 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.493860 kubelet[2859]: E0707 00:19:41.493391 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.494453 kubelet[2859]: E0707 00:19:41.494354 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.494453 kubelet[2859]: W0707 00:19:41.494379 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.494682 kubelet[2859]: E0707 00:19:41.494541 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.496248 kubelet[2859]: E0707 00:19:41.496111 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.496370 kubelet[2859]: W0707 00:19:41.496337 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.496630 kubelet[2859]: E0707 00:19:41.496609 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.497054 kubelet[2859]: E0707 00:19:41.497023 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.498359 kubelet[2859]: W0707 00:19:41.497997 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.498619 kubelet[2859]: E0707 00:19:41.498580 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.498898 kubelet[2859]: E0707 00:19:41.498813 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.499409 kubelet[2859]: W0707 00:19:41.499008 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.499409 kubelet[2859]: E0707 00:19:41.499066 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.499803 kubelet[2859]: E0707 00:19:41.499750 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.499803 kubelet[2859]: W0707 00:19:41.499787 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.500343 kubelet[2859]: E0707 00:19:41.500004 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.500663 kubelet[2859]: E0707 00:19:41.500584 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.500663 kubelet[2859]: W0707 00:19:41.500599 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.500663 kubelet[2859]: E0707 00:19:41.500637 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.501065 kubelet[2859]: E0707 00:19:41.500997 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.501065 kubelet[2859]: W0707 00:19:41.501011 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.501207 kubelet[2859]: E0707 00:19:41.501066 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.501476 kubelet[2859]: E0707 00:19:41.501409 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.501476 kubelet[2859]: W0707 00:19:41.501422 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.501476 kubelet[2859]: E0707 00:19:41.501450 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.501941 kubelet[2859]: E0707 00:19:41.501904 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.502075 kubelet[2859]: W0707 00:19:41.502016 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.502075 kubelet[2859]: E0707 00:19:41.502063 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.502293 kubelet[2859]: E0707 00:19:41.502280 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.502375 kubelet[2859]: W0707 00:19:41.502346 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.502531 kubelet[2859]: E0707 00:19:41.502444 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.502697 kubelet[2859]: E0707 00:19:41.502668 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.502697 kubelet[2859]: W0707 00:19:41.502682 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.502913 kubelet[2859]: E0707 00:19:41.502862 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.503157 kubelet[2859]: E0707 00:19:41.503141 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.503312 kubelet[2859]: W0707 00:19:41.503237 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.503312 kubelet[2859]: E0707 00:19:41.503268 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.503666 kubelet[2859]: E0707 00:19:41.503550 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.503666 kubelet[2859]: W0707 00:19:41.503567 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.503666 kubelet[2859]: E0707 00:19:41.503586 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.504011 kubelet[2859]: E0707 00:19:41.503890 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.504011 kubelet[2859]: W0707 00:19:41.503905 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.504089 kubelet[2859]: E0707 00:19:41.504008 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.504256 kubelet[2859]: E0707 00:19:41.504244 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.504415 kubelet[2859]: W0707 00:19:41.504341 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.504415 kubelet[2859]: E0707 00:19:41.504386 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.504674 kubelet[2859]: E0707 00:19:41.504659 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.504775 kubelet[2859]: W0707 00:19:41.504731 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.504775 kubelet[2859]: E0707 00:19:41.504748 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.506377 kubelet[2859]: E0707 00:19:41.506311 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.506377 kubelet[2859]: W0707 00:19:41.506334 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.506377 kubelet[2859]: E0707 00:19:41.506350 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:41.518753 kubelet[2859]: E0707 00:19:41.518720 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:41.518753 kubelet[2859]: W0707 00:19:41.518743 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:41.518753 kubelet[2859]: E0707 00:19:41.518763 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:41.548417 containerd[1548]: time="2025-07-07T00:19:41.548314483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-565mc,Uid:8ae621e3-908e-4b95-96d4-a460265cd348,Namespace:calico-system,Attempt:0,} returns sandbox id \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\"" Jul 7 00:19:42.740488 kubelet[2859]: E0707 00:19:42.740404 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8kl6f" podUID="a5dfbb9f-cbfa-4ab7-b6be-7e804154425e" Jul 7 00:19:42.796627 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount669054997.mount: Deactivated successfully. 
Jul 7 00:19:44.022478 containerd[1548]: time="2025-07-07T00:19:44.022385572Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:44.025439 containerd[1548]: time="2025-07-07T00:19:44.025386873Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Jul 7 00:19:44.026637 containerd[1548]: time="2025-07-07T00:19:44.026588562Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:44.030723 containerd[1548]: time="2025-07-07T00:19:44.030654391Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:44.031526 containerd[1548]: time="2025-07-07T00:19:44.031456437Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 2.627185995s" Jul 7 00:19:44.031526 containerd[1548]: time="2025-07-07T00:19:44.031498237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Jul 7 00:19:44.034550 containerd[1548]: time="2025-07-07T00:19:44.034464419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 7 00:19:44.050861 containerd[1548]: time="2025-07-07T00:19:44.050812056Z" level=info msg="CreateContainer within sandbox \"245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 7 00:19:44.062537 containerd[1548]: time="2025-07-07T00:19:44.060493086Z" level=info msg="Container 3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:19:44.078152 containerd[1548]: time="2025-07-07T00:19:44.078081133Z" level=info msg="CreateContainer within sandbox \"245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518\"" Jul 7 00:19:44.079657 containerd[1548]: time="2025-07-07T00:19:44.079609904Z" level=info msg="StartContainer for \"3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518\"" Jul 7 00:19:44.082178 containerd[1548]: time="2025-07-07T00:19:44.082123282Z" level=info msg="connecting to shim 3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518" address="unix:///run/containerd/s/80bf65ec70c7ed3756b907279e9c4730ffc84aefbe887ad5fa13ede781e0776b" protocol=ttrpc version=3 Jul 7 00:19:44.111746 systemd[1]: Started cri-containerd-3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518.scope - libcontainer container 3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518. 
Jul 7 00:19:44.157446 containerd[1548]: time="2025-07-07T00:19:44.157037302Z" level=info msg="StartContainer for \"3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518\" returns successfully" Jul 7 00:19:44.740864 kubelet[2859]: E0707 00:19:44.740364 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8kl6f" podUID="a5dfbb9f-cbfa-4ab7-b6be-7e804154425e" Jul 7 00:19:44.891026 kubelet[2859]: I0707 00:19:44.890839 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-578989bf9c-xt49b" podStartSLOduration=2.261525179 podStartE2EDuration="4.890798909s" podCreationTimestamp="2025-07-07 00:19:40 +0000 UTC" firstStartedPulling="2025-07-07 00:19:41.403460556 +0000 UTC m=+23.809637164" lastFinishedPulling="2025-07-07 00:19:44.032734286 +0000 UTC m=+26.438910894" observedRunningTime="2025-07-07 00:19:44.889796982 +0000 UTC m=+27.295973590" watchObservedRunningTime="2025-07-07 00:19:44.890798909 +0000 UTC m=+27.296975517" Jul 7 00:19:44.896796 kubelet[2859]: E0707 00:19:44.896657 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.896796 kubelet[2859]: W0707 00:19:44.896694 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.896796 kubelet[2859]: E0707 00:19:44.896725 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.897437 kubelet[2859]: E0707 00:19:44.897413 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.897686 kubelet[2859]: W0707 00:19:44.897548 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.897686 kubelet[2859]: E0707 00:19:44.897588 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.898383 kubelet[2859]: E0707 00:19:44.898240 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.898383 kubelet[2859]: W0707 00:19:44.898265 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.898383 kubelet[2859]: E0707 00:19:44.898292 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.899012 kubelet[2859]: E0707 00:19:44.898878 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.899012 kubelet[2859]: W0707 00:19:44.898898 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.899012 kubelet[2859]: E0707 00:19:44.898916 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.899338 kubelet[2859]: E0707 00:19:44.899325 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.899414 kubelet[2859]: W0707 00:19:44.899403 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.899542 kubelet[2859]: E0707 00:19:44.899476 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.899880 kubelet[2859]: E0707 00:19:44.899812 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.899880 kubelet[2859]: W0707 00:19:44.899826 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.899880 kubelet[2859]: E0707 00:19:44.899838 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.900303 kubelet[2859]: E0707 00:19:44.900231 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.900303 kubelet[2859]: W0707 00:19:44.900246 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.900303 kubelet[2859]: E0707 00:19:44.900259 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.900835 kubelet[2859]: E0707 00:19:44.900753 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.900835 kubelet[2859]: W0707 00:19:44.900768 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.900835 kubelet[2859]: E0707 00:19:44.900780 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.901539 kubelet[2859]: E0707 00:19:44.901482 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.901660 kubelet[2859]: W0707 00:19:44.901497 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.901730 kubelet[2859]: E0707 00:19:44.901718 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.902125 kubelet[2859]: E0707 00:19:44.902058 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.902125 kubelet[2859]: W0707 00:19:44.902073 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.902125 kubelet[2859]: E0707 00:19:44.902085 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.902622 kubelet[2859]: E0707 00:19:44.902483 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.902622 kubelet[2859]: W0707 00:19:44.902553 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.902622 kubelet[2859]: E0707 00:19:44.902569 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.903337 kubelet[2859]: E0707 00:19:44.903157 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.903337 kubelet[2859]: W0707 00:19:44.903184 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.903337 kubelet[2859]: E0707 00:19:44.903197 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.904352 kubelet[2859]: E0707 00:19:44.904260 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.904352 kubelet[2859]: W0707 00:19:44.904281 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.904352 kubelet[2859]: E0707 00:19:44.904298 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.904728 kubelet[2859]: E0707 00:19:44.904713 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.904903 kubelet[2859]: W0707 00:19:44.904775 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.904903 kubelet[2859]: E0707 00:19:44.904792 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.905146 kubelet[2859]: E0707 00:19:44.905132 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.905271 kubelet[2859]: W0707 00:19:44.905202 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.905271 kubelet[2859]: E0707 00:19:44.905217 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.925253 kubelet[2859]: E0707 00:19:44.925220 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.925602 kubelet[2859]: W0707 00:19:44.925437 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.925602 kubelet[2859]: E0707 00:19:44.925470 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.926236 kubelet[2859]: E0707 00:19:44.926198 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.926322 kubelet[2859]: W0707 00:19:44.926232 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.926322 kubelet[2859]: E0707 00:19:44.926269 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.926691 kubelet[2859]: E0707 00:19:44.926575 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.926691 kubelet[2859]: W0707 00:19:44.926608 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.926770 kubelet[2859]: E0707 00:19:44.926633 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.927443 kubelet[2859]: E0707 00:19:44.926988 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.927443 kubelet[2859]: W0707 00:19:44.927013 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.927443 kubelet[2859]: E0707 00:19:44.927034 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.927870 kubelet[2859]: E0707 00:19:44.927778 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.927870 kubelet[2859]: W0707 00:19:44.927795 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.929148 kubelet[2859]: E0707 00:19:44.928186 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.929148 kubelet[2859]: W0707 00:19:44.928214 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.929148 kubelet[2859]: E0707 00:19:44.928231 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.929148 kubelet[2859]: E0707 00:19:44.928462 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.929148 kubelet[2859]: W0707 00:19:44.928475 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.929148 kubelet[2859]: E0707 00:19:44.928486 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.929148 kubelet[2859]: E0707 00:19:44.928739 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.929148 kubelet[2859]: W0707 00:19:44.928754 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.929148 kubelet[2859]: E0707 00:19:44.928766 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.929905 kubelet[2859]: E0707 00:19:44.929595 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.929905 kubelet[2859]: W0707 00:19:44.929619 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.929905 kubelet[2859]: E0707 00:19:44.929640 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.929905 kubelet[2859]: E0707 00:19:44.929827 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.929905 kubelet[2859]: W0707 00:19:44.929836 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.929905 kubelet[2859]: E0707 00:19:44.929847 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.930098 kubelet[2859]: E0707 00:19:44.930051 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.930098 kubelet[2859]: W0707 00:19:44.930062 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.930098 kubelet[2859]: E0707 00:19:44.930073 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.930813 kubelet[2859]: E0707 00:19:44.930681 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.930813 kubelet[2859]: W0707 00:19:44.930709 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.930813 kubelet[2859]: E0707 00:19:44.930729 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.930813 kubelet[2859]: E0707 00:19:44.930763 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.931553 kubelet[2859]: E0707 00:19:44.931448 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.931553 kubelet[2859]: W0707 00:19:44.931466 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.931553 kubelet[2859]: E0707 00:19:44.931479 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.931769 kubelet[2859]: E0707 00:19:44.931694 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.931769 kubelet[2859]: W0707 00:19:44.931704 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.931769 kubelet[2859]: E0707 00:19:44.931712 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.931842 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.932599 kubelet[2859]: W0707 00:19:44.931855 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.931866 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.931980 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.932599 kubelet[2859]: W0707 00:19:44.931987 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.931993 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.932118 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.932599 kubelet[2859]: W0707 00:19:44.932124 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.932131 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:19:44.932599 kubelet[2859]: E0707 00:19:44.932419 2859 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:19:44.932887 kubelet[2859]: W0707 00:19:44.932428 2859 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:19:44.932887 kubelet[2859]: E0707 00:19:44.932436 2859 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:19:45.429241 containerd[1548]: time="2025-07-07T00:19:45.429160988Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:45.430622 containerd[1548]: time="2025-07-07T00:19:45.430536678Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Jul 7 00:19:45.432244 containerd[1548]: time="2025-07-07T00:19:45.431856568Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:45.435292 containerd[1548]: time="2025-07-07T00:19:45.435201511Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:45.436048 containerd[1548]: time="2025-07-07T00:19:45.435723395Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.401183536s" Jul 7 00:19:45.436048 containerd[1548]: time="2025-07-07T00:19:45.435816436Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Jul 7 00:19:45.441873 containerd[1548]: time="2025-07-07T00:19:45.440762591Z" level=info msg="CreateContainer within sandbox \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 7 00:19:45.451709 containerd[1548]: time="2025-07-07T00:19:45.449969416Z" level=info msg="Container 3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:19:45.465380 containerd[1548]: time="2025-07-07T00:19:45.465311846Z" level=info msg="CreateContainer within sandbox \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\"" Jul 7 00:19:45.466176 containerd[1548]: time="2025-07-07T00:19:45.466109571Z" level=info msg="StartContainer for \"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\"" Jul 7 00:19:45.467886 containerd[1548]: time="2025-07-07T00:19:45.467852144Z" level=info msg="connecting to shim 3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5" address="unix:///run/containerd/s/7dd369a16bcb4762e0966b3b239f7b3712c2ea27903d486031e0ae7f864dfb49" protocol=ttrpc version=3 Jul 7 00:19:45.499756 systemd[1]: Started cri-containerd-3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5.scope - libcontainer container 3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5. 
Jul 7 00:19:45.551830 containerd[1548]: time="2025-07-07T00:19:45.551725700Z" level=info msg="StartContainer for \"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\" returns successfully" Jul 7 00:19:45.574788 systemd[1]: cri-containerd-3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5.scope: Deactivated successfully. Jul 7 00:19:45.580528 containerd[1548]: time="2025-07-07T00:19:45.580445625Z" level=info msg="received exit event container_id:\"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\" id:\"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\" pid:3543 exited_at:{seconds:1751847585 nanos:580024502}" Jul 7 00:19:45.580789 containerd[1548]: time="2025-07-07T00:19:45.580758787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\" id:\"3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5\" pid:3543 exited_at:{seconds:1751847585 nanos:580024502}" Jul 7 00:19:45.606631 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5-rootfs.mount: Deactivated successfully. 
Jul 7 00:19:45.882408 kubelet[2859]: I0707 00:19:45.882376 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:19:45.885558 containerd[1548]: time="2025-07-07T00:19:45.885465034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 7 00:19:46.741569 kubelet[2859]: E0707 00:19:46.741399 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8kl6f" podUID="a5dfbb9f-cbfa-4ab7-b6be-7e804154425e" Jul 7 00:19:48.297438 containerd[1548]: time="2025-07-07T00:19:48.297352088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:48.298468 containerd[1548]: time="2025-07-07T00:19:48.298402575Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Jul 7 00:19:48.300582 containerd[1548]: time="2025-07-07T00:19:48.299668024Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:48.303995 containerd[1548]: time="2025-07-07T00:19:48.303803212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:48.305214 containerd[1548]: time="2025-07-07T00:19:48.304783979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 2.419210904s" Jul 7 00:19:48.305214 containerd[1548]: time="2025-07-07T00:19:48.304830539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Jul 7 00:19:48.309672 containerd[1548]: time="2025-07-07T00:19:48.309618012Z" level=info msg="CreateContainer within sandbox \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 7 00:19:48.324534 containerd[1548]: time="2025-07-07T00:19:48.323666468Z" level=info msg="Container a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:19:48.326575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount121504238.mount: Deactivated successfully. Jul 7 00:19:48.340783 containerd[1548]: time="2025-07-07T00:19:48.340418183Z" level=info msg="CreateContainer within sandbox \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\"" Jul 7 00:19:48.342746 containerd[1548]: time="2025-07-07T00:19:48.342618198Z" level=info msg="StartContainer for \"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\"" Jul 7 00:19:48.345717 containerd[1548]: time="2025-07-07T00:19:48.345595978Z" level=info msg="connecting to shim a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2" address="unix:///run/containerd/s/7dd369a16bcb4762e0966b3b239f7b3712c2ea27903d486031e0ae7f864dfb49" protocol=ttrpc version=3 Jul 7 00:19:48.378799 systemd[1]: Started cri-containerd-a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2.scope - libcontainer container 
a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2. Jul 7 00:19:48.424720 containerd[1548]: time="2025-07-07T00:19:48.424652760Z" level=info msg="StartContainer for \"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\" returns successfully" Jul 7 00:19:48.740820 kubelet[2859]: E0707 00:19:48.740746 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8kl6f" podUID="a5dfbb9f-cbfa-4ab7-b6be-7e804154425e" Jul 7 00:19:48.968578 containerd[1548]: time="2025-07-07T00:19:48.968439367Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 7 00:19:48.973085 systemd[1]: cri-containerd-a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2.scope: Deactivated successfully. Jul 7 00:19:48.975703 systemd[1]: cri-containerd-a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2.scope: Consumed 505ms CPU time, 187.3M memory peak, 165.8M written to disk. 
Jul 7 00:19:48.979323 containerd[1548]: time="2025-07-07T00:19:48.979256441Z" level=info msg="received exit event container_id:\"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\" id:\"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\" pid:3599 exited_at:{seconds:1751847588 nanos:978886279}" Jul 7 00:19:48.979795 containerd[1548]: time="2025-07-07T00:19:48.979358242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\" id:\"a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2\" pid:3599 exited_at:{seconds:1751847588 nanos:978886279}" Jul 7 00:19:49.006274 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2-rootfs.mount: Deactivated successfully. Jul 7 00:19:49.030659 kubelet[2859]: I0707 00:19:49.029548 2859 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jul 7 00:19:49.092780 systemd[1]: Created slice kubepods-burstable-podc25c4d92_5234_44bc_85a4_f19c76ed4997.slice - libcontainer container kubepods-burstable-podc25c4d92_5234_44bc_85a4_f19c76ed4997.slice. 
Jul 7 00:19:49.097922 kubelet[2859]: W0707 00:19:49.097796 2859 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4344-1-1-1-1232b7205a" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4344-1-1-1-1232b7205a' and this object Jul 7 00:19:49.097922 kubelet[2859]: E0707 00:19:49.097843 2859 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4344-1-1-1-1232b7205a\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4344-1-1-1-1232b7205a' and this object" logger="UnhandledError" Jul 7 00:19:49.099677 kubelet[2859]: W0707 00:19:49.098236 2859 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4344-1-1-1-1232b7205a" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4344-1-1-1-1232b7205a' and this object Jul 7 00:19:49.099677 kubelet[2859]: E0707 00:19:49.098266 2859 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4344-1-1-1-1232b7205a\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4344-1-1-1-1232b7205a' and this object" logger="UnhandledError" Jul 7 00:19:49.107258 systemd[1]: Created slice kubepods-besteffort-pod06bee497_80f4_43cb_b2d9_67d6f0846b05.slice - libcontainer container 
kubepods-besteffort-pod06bee497_80f4_43cb_b2d9_67d6f0846b05.slice. Jul 7 00:19:49.125469 systemd[1]: Created slice kubepods-besteffort-podf13c4783_fce7_4cf1_ab5f_5ba43426f9fb.slice - libcontainer container kubepods-besteffort-podf13c4783_fce7_4cf1_ab5f_5ba43426f9fb.slice. Jul 7 00:19:49.140673 systemd[1]: Created slice kubepods-burstable-podd39c4f8c_ed03_4a96_9d5f_5df5a6d7eb00.slice - libcontainer container kubepods-burstable-podd39c4f8c_ed03_4a96_9d5f_5df5a6d7eb00.slice. Jul 7 00:19:49.148428 systemd[1]: Created slice kubepods-besteffort-podb9d9ac66_6750_4558_9996_a65b2992f683.slice - libcontainer container kubepods-besteffort-podb9d9ac66_6750_4558_9996_a65b2992f683.slice. Jul 7 00:19:49.157404 systemd[1]: Created slice kubepods-besteffort-pod8c8b32cc_c249_4ae4_a5f5_5ca79580b06a.slice - libcontainer container kubepods-besteffort-pod8c8b32cc_c249_4ae4_a5f5_5ca79580b06a.slice. Jul 7 00:19:49.162864 kubelet[2859]: I0707 00:19:49.162819 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f991e28-7349-433d-8fb6-e8dea1fb5652-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-f5hxb\" (UID: \"3f991e28-7349-433d-8fb6-e8dea1fb5652\") " pod="calico-system/goldmane-58fd7646b9-f5hxb" Jul 7 00:19:49.162864 kubelet[2859]: I0707 00:19:49.162867 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b9d9ac66-6750-4558-9996-a65b2992f683-calico-apiserver-certs\") pod \"calico-apiserver-7665774675-4v9q6\" (UID: \"b9d9ac66-6750-4558-9996-a65b2992f683\") " pod="calico-apiserver/calico-apiserver-7665774675-4v9q6" Jul 7 00:19:49.163031 kubelet[2859]: I0707 00:19:49.162888 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rm8c\" (UniqueName: 
\"kubernetes.io/projected/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-kube-api-access-9rm8c\") pod \"calico-apiserver-d455968b6-d2496\" (UID: \"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb\") " pod="calico-apiserver/calico-apiserver-d455968b6-d2496" Jul 7 00:19:49.163031 kubelet[2859]: I0707 00:19:49.162906 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-ca-bundle\") pod \"whisker-967c948f-bzzvp\" (UID: \"e4fd678b-b655-42d4-b25e-647c02f2ca21\") " pod="calico-system/whisker-967c948f-bzzvp" Jul 7 00:19:49.163031 kubelet[2859]: I0707 00:19:49.162924 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssdtn\" (UniqueName: \"kubernetes.io/projected/06bee497-80f4-43cb-b2d9-67d6f0846b05-kube-api-access-ssdtn\") pod \"calico-kube-controllers-59b677dccf-swp4r\" (UID: \"06bee497-80f4-43cb-b2d9-67d6f0846b05\") " pod="calico-system/calico-kube-controllers-59b677dccf-swp4r" Jul 7 00:19:49.163031 kubelet[2859]: I0707 00:19:49.163005 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f991e28-7349-433d-8fb6-e8dea1fb5652-config\") pod \"goldmane-58fd7646b9-f5hxb\" (UID: \"3f991e28-7349-433d-8fb6-e8dea1fb5652\") " pod="calico-system/goldmane-58fd7646b9-f5hxb" Jul 7 00:19:49.163031 kubelet[2859]: I0707 00:19:49.163026 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00-config-volume\") pod \"coredns-7c65d6cfc9-42w27\" (UID: \"d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00\") " pod="kube-system/coredns-7c65d6cfc9-42w27" Jul 7 00:19:49.163174 kubelet[2859]: I0707 00:19:49.163044 2859 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bee497-80f4-43cb-b2d9-67d6f0846b05-tigera-ca-bundle\") pod \"calico-kube-controllers-59b677dccf-swp4r\" (UID: \"06bee497-80f4-43cb-b2d9-67d6f0846b05\") " pod="calico-system/calico-kube-controllers-59b677dccf-swp4r" Jul 7 00:19:49.163174 kubelet[2859]: I0707 00:19:49.163060 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqhpw\" (UniqueName: \"kubernetes.io/projected/b9d9ac66-6750-4558-9996-a65b2992f683-kube-api-access-sqhpw\") pod \"calico-apiserver-7665774675-4v9q6\" (UID: \"b9d9ac66-6750-4558-9996-a65b2992f683\") " pod="calico-apiserver/calico-apiserver-7665774675-4v9q6" Jul 7 00:19:49.163174 kubelet[2859]: I0707 00:19:49.163104 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-calico-apiserver-certs\") pod \"calico-apiserver-d455968b6-7vdfw\" (UID: \"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a\") " pod="calico-apiserver/calico-apiserver-d455968b6-7vdfw" Jul 7 00:19:49.163174 kubelet[2859]: I0707 00:19:49.163123 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-backend-key-pair\") pod \"whisker-967c948f-bzzvp\" (UID: \"e4fd678b-b655-42d4-b25e-647c02f2ca21\") " pod="calico-system/whisker-967c948f-bzzvp" Jul 7 00:19:49.163174 kubelet[2859]: I0707 00:19:49.163142 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj98q\" (UniqueName: \"kubernetes.io/projected/d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00-kube-api-access-hj98q\") pod \"coredns-7c65d6cfc9-42w27\" (UID: \"d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00\") " 
pod="kube-system/coredns-7c65d6cfc9-42w27" Jul 7 00:19:49.163287 kubelet[2859]: I0707 00:19:49.163158 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnhbt\" (UniqueName: \"kubernetes.io/projected/c25c4d92-5234-44bc-85a4-f19c76ed4997-kube-api-access-tnhbt\") pod \"coredns-7c65d6cfc9-hkcx5\" (UID: \"c25c4d92-5234-44bc-85a4-f19c76ed4997\") " pod="kube-system/coredns-7c65d6cfc9-hkcx5" Jul 7 00:19:49.163287 kubelet[2859]: I0707 00:19:49.163175 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5f7m\" (UniqueName: \"kubernetes.io/projected/e4fd678b-b655-42d4-b25e-647c02f2ca21-kube-api-access-n5f7m\") pod \"whisker-967c948f-bzzvp\" (UID: \"e4fd678b-b655-42d4-b25e-647c02f2ca21\") " pod="calico-system/whisker-967c948f-bzzvp" Jul 7 00:19:49.163287 kubelet[2859]: I0707 00:19:49.163194 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2qv\" (UniqueName: \"kubernetes.io/projected/3f991e28-7349-433d-8fb6-e8dea1fb5652-kube-api-access-xw2qv\") pod \"goldmane-58fd7646b9-f5hxb\" (UID: \"3f991e28-7349-433d-8fb6-e8dea1fb5652\") " pod="calico-system/goldmane-58fd7646b9-f5hxb" Jul 7 00:19:49.163287 kubelet[2859]: I0707 00:19:49.163212 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92ws\" (UniqueName: \"kubernetes.io/projected/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-kube-api-access-g92ws\") pod \"calico-apiserver-d455968b6-7vdfw\" (UID: \"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a\") " pod="calico-apiserver/calico-apiserver-d455968b6-7vdfw" Jul 7 00:19:49.163287 kubelet[2859]: I0707 00:19:49.163231 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-calico-apiserver-certs\") 
pod \"calico-apiserver-d455968b6-d2496\" (UID: \"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb\") " pod="calico-apiserver/calico-apiserver-d455968b6-d2496" Jul 7 00:19:49.163399 kubelet[2859]: I0707 00:19:49.163249 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3f991e28-7349-433d-8fb6-e8dea1fb5652-goldmane-key-pair\") pod \"goldmane-58fd7646b9-f5hxb\" (UID: \"3f991e28-7349-433d-8fb6-e8dea1fb5652\") " pod="calico-system/goldmane-58fd7646b9-f5hxb" Jul 7 00:19:49.163399 kubelet[2859]: I0707 00:19:49.163267 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c25c4d92-5234-44bc-85a4-f19c76ed4997-config-volume\") pod \"coredns-7c65d6cfc9-hkcx5\" (UID: \"c25c4d92-5234-44bc-85a4-f19c76ed4997\") " pod="kube-system/coredns-7c65d6cfc9-hkcx5" Jul 7 00:19:49.172175 systemd[1]: Created slice kubepods-besteffort-pode4fd678b_b655_42d4_b25e_647c02f2ca21.slice - libcontainer container kubepods-besteffort-pode4fd678b_b655_42d4_b25e_647c02f2ca21.slice. Jul 7 00:19:49.182957 systemd[1]: Created slice kubepods-besteffort-pod3f991e28_7349_433d_8fb6_e8dea1fb5652.slice - libcontainer container kubepods-besteffort-pod3f991e28_7349_433d_8fb6_e8dea1fb5652.slice. 
Jul 7 00:19:49.403066 containerd[1548]: time="2025-07-07T00:19:49.402965352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hkcx5,Uid:c25c4d92-5234-44bc-85a4-f19c76ed4997,Namespace:kube-system,Attempt:0,}" Jul 7 00:19:49.422098 containerd[1548]: time="2025-07-07T00:19:49.422042161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b677dccf-swp4r,Uid:06bee497-80f4-43cb-b2d9-67d6f0846b05,Namespace:calico-system,Attempt:0,}" Jul 7 00:19:49.447728 containerd[1548]: time="2025-07-07T00:19:49.447323053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-42w27,Uid:d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00,Namespace:kube-system,Attempt:0,}" Jul 7 00:19:49.479768 containerd[1548]: time="2025-07-07T00:19:49.479722312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-967c948f-bzzvp,Uid:e4fd678b-b655-42d4-b25e-647c02f2ca21,Namespace:calico-system,Attempt:0,}" Jul 7 00:19:49.488289 containerd[1548]: time="2025-07-07T00:19:49.487833607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-f5hxb,Uid:3f991e28-7349-433d-8fb6-e8dea1fb5652,Namespace:calico-system,Attempt:0,}" Jul 7 00:19:49.571354 containerd[1548]: time="2025-07-07T00:19:49.571299012Z" level=error msg="Failed to destroy network for sandbox \"01bc2f94493015ef3eaa787497590feb5ad071af26a912d8d97e469ef5d38e96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.573431 containerd[1548]: time="2025-07-07T00:19:49.573372626Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hkcx5,Uid:c25c4d92-5234-44bc-85a4-f19c76ed4997,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"01bc2f94493015ef3eaa787497590feb5ad071af26a912d8d97e469ef5d38e96\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.573864 kubelet[2859]: E0707 00:19:49.573804 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01bc2f94493015ef3eaa787497590feb5ad071af26a912d8d97e469ef5d38e96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.573953 kubelet[2859]: E0707 00:19:49.573921 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01bc2f94493015ef3eaa787497590feb5ad071af26a912d8d97e469ef5d38e96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hkcx5" Jul 7 00:19:49.573986 kubelet[2859]: E0707 00:19:49.573962 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"01bc2f94493015ef3eaa787497590feb5ad071af26a912d8d97e469ef5d38e96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hkcx5" Jul 7 00:19:49.574568 kubelet[2859]: E0707 00:19:49.574014 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-hkcx5_kube-system(c25c4d92-5234-44bc-85a4-f19c76ed4997)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-hkcx5_kube-system(c25c4d92-5234-44bc-85a4-f19c76ed4997)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"01bc2f94493015ef3eaa787497590feb5ad071af26a912d8d97e469ef5d38e96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-hkcx5" podUID="c25c4d92-5234-44bc-85a4-f19c76ed4997" Jul 7 00:19:49.582789 containerd[1548]: time="2025-07-07T00:19:49.582743290Z" level=error msg="Failed to destroy network for sandbox \"1975bae26be0c62c297508275257b7261ab05b59c289d5ff2fc9a7fec2f4152d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.585492 containerd[1548]: time="2025-07-07T00:19:49.585441748Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-42w27,Uid:d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1975bae26be0c62c297508275257b7261ab05b59c289d5ff2fc9a7fec2f4152d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.586349 kubelet[2859]: E0707 00:19:49.586043 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1975bae26be0c62c297508275257b7261ab05b59c289d5ff2fc9a7fec2f4152d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.586349 kubelet[2859]: E0707 00:19:49.586120 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"1975bae26be0c62c297508275257b7261ab05b59c289d5ff2fc9a7fec2f4152d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-42w27" Jul 7 00:19:49.586349 kubelet[2859]: E0707 00:19:49.586141 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1975bae26be0c62c297508275257b7261ab05b59c289d5ff2fc9a7fec2f4152d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-42w27" Jul 7 00:19:49.586498 kubelet[2859]: E0707 00:19:49.586190 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-42w27_kube-system(d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-42w27_kube-system(d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1975bae26be0c62c297508275257b7261ab05b59c289d5ff2fc9a7fec2f4152d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-42w27" podUID="d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00" Jul 7 00:19:49.591838 containerd[1548]: time="2025-07-07T00:19:49.591765951Z" level=error msg="Failed to destroy network for sandbox \"005eed6bd4d430329d3afd58d96896f64c964fe644ec7397fbad612c942a48b3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.595921 
containerd[1548]: time="2025-07-07T00:19:49.595875699Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b677dccf-swp4r,Uid:06bee497-80f4-43cb-b2d9-67d6f0846b05,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"005eed6bd4d430329d3afd58d96896f64c964fe644ec7397fbad612c942a48b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.596448 kubelet[2859]: E0707 00:19:49.596409 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"005eed6bd4d430329d3afd58d96896f64c964fe644ec7397fbad612c942a48b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.596868 kubelet[2859]: E0707 00:19:49.596650 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"005eed6bd4d430329d3afd58d96896f64c964fe644ec7397fbad612c942a48b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59b677dccf-swp4r" Jul 7 00:19:49.597549 kubelet[2859]: E0707 00:19:49.596974 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"005eed6bd4d430329d3afd58d96896f64c964fe644ec7397fbad612c942a48b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-59b677dccf-swp4r" Jul 7 00:19:49.597549 kubelet[2859]: E0707 00:19:49.597037 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59b677dccf-swp4r_calico-system(06bee497-80f4-43cb-b2d9-67d6f0846b05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59b677dccf-swp4r_calico-system(06bee497-80f4-43cb-b2d9-67d6f0846b05)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"005eed6bd4d430329d3afd58d96896f64c964fe644ec7397fbad612c942a48b3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59b677dccf-swp4r" podUID="06bee497-80f4-43cb-b2d9-67d6f0846b05" Jul 7 00:19:49.630465 containerd[1548]: time="2025-07-07T00:19:49.630339972Z" level=error msg="Failed to destroy network for sandbox \"6587283b5d24708f04f61823c5e33cdee2d2603654f692b9bbb79a70a11c376e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.630944 containerd[1548]: time="2025-07-07T00:19:49.630351572Z" level=error msg="Failed to destroy network for sandbox \"1d07820b4848a1332ea4c9000331850daca23e7283c56d27776c7da54415b764\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.631807 containerd[1548]: time="2025-07-07T00:19:49.631712141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-967c948f-bzzvp,Uid:e4fd678b-b655-42d4-b25e-647c02f2ca21,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"6587283b5d24708f04f61823c5e33cdee2d2603654f692b9bbb79a70a11c376e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.632122 kubelet[2859]: E0707 00:19:49.632052 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6587283b5d24708f04f61823c5e33cdee2d2603654f692b9bbb79a70a11c376e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.632262 kubelet[2859]: E0707 00:19:49.632140 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6587283b5d24708f04f61823c5e33cdee2d2603654f692b9bbb79a70a11c376e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-967c948f-bzzvp" Jul 7 00:19:49.632262 kubelet[2859]: E0707 00:19:49.632161 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6587283b5d24708f04f61823c5e33cdee2d2603654f692b9bbb79a70a11c376e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-967c948f-bzzvp" Jul 7 00:19:49.632262 kubelet[2859]: E0707 00:19:49.632222 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-967c948f-bzzvp_calico-system(e4fd678b-b655-42d4-b25e-647c02f2ca21)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"whisker-967c948f-bzzvp_calico-system(e4fd678b-b655-42d4-b25e-647c02f2ca21)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6587283b5d24708f04f61823c5e33cdee2d2603654f692b9bbb79a70a11c376e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-967c948f-bzzvp" podUID="e4fd678b-b655-42d4-b25e-647c02f2ca21" Jul 7 00:19:49.633196 containerd[1548]: time="2025-07-07T00:19:49.632981230Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-f5hxb,Uid:3f991e28-7349-433d-8fb6-e8dea1fb5652,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d07820b4848a1332ea4c9000331850daca23e7283c56d27776c7da54415b764\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.633384 kubelet[2859]: E0707 00:19:49.633358 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d07820b4848a1332ea4c9000331850daca23e7283c56d27776c7da54415b764\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:49.633477 kubelet[2859]: E0707 00:19:49.633400 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d07820b4848a1332ea4c9000331850daca23e7283c56d27776c7da54415b764\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-58fd7646b9-f5hxb" Jul 7 00:19:49.633477 kubelet[2859]: E0707 00:19:49.633416 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d07820b4848a1332ea4c9000331850daca23e7283c56d27776c7da54415b764\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-f5hxb" Jul 7 00:19:49.633477 kubelet[2859]: E0707 00:19:49.633446 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-f5hxb_calico-system(3f991e28-7349-433d-8fb6-e8dea1fb5652)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-f5hxb_calico-system(3f991e28-7349-433d-8fb6-e8dea1fb5652)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d07820b4848a1332ea4c9000331850daca23e7283c56d27776c7da54415b764\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-f5hxb" podUID="3f991e28-7349-433d-8fb6-e8dea1fb5652" Jul 7 00:19:49.906542 containerd[1548]: time="2025-07-07T00:19:49.905613676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 7 00:19:50.299065 kubelet[2859]: E0707 00:19:50.298890 2859 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.300000 kubelet[2859]: E0707 00:19:50.299711 2859 projected.go:194] Error preparing data for projected volume kube-api-access-9rm8c for pod calico-apiserver/calico-apiserver-d455968b6-d2496: failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.300000 
kubelet[2859]: E0707 00:19:50.299874 2859 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-kube-api-access-9rm8c podName:f13c4783-fce7-4cf1-ab5f-5ba43426f9fb nodeName:}" failed. No retries permitted until 2025-07-07 00:19:50.799819402 +0000 UTC m=+33.205996050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9rm8c" (UniqueName: "kubernetes.io/projected/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-kube-api-access-9rm8c") pod "calico-apiserver-d455968b6-d2496" (UID: "f13c4783-fce7-4cf1-ab5f-5ba43426f9fb") : failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.312050 kubelet[2859]: E0707 00:19:50.311943 2859 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.312050 kubelet[2859]: E0707 00:19:50.311996 2859 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.312050 kubelet[2859]: E0707 00:19:50.312062 2859 projected.go:194] Error preparing data for projected volume kube-api-access-sqhpw for pod calico-apiserver/calico-apiserver-7665774675-4v9q6: failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.312363 kubelet[2859]: E0707 00:19:50.312149 2859 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9d9ac66-6750-4558-9996-a65b2992f683-kube-api-access-sqhpw podName:b9d9ac66-6750-4558-9996-a65b2992f683 nodeName:}" failed. No retries permitted until 2025-07-07 00:19:50.812119725 +0000 UTC m=+33.218296373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-sqhpw" (UniqueName: "kubernetes.io/projected/b9d9ac66-6750-4558-9996-a65b2992f683-kube-api-access-sqhpw") pod "calico-apiserver-7665774675-4v9q6" (UID: "b9d9ac66-6750-4558-9996-a65b2992f683") : failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.312363 kubelet[2859]: E0707 00:19:50.312017 2859 projected.go:194] Error preparing data for projected volume kube-api-access-g92ws for pod calico-apiserver/calico-apiserver-d455968b6-7vdfw: failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.312363 kubelet[2859]: E0707 00:19:50.312214 2859 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-kube-api-access-g92ws podName:8c8b32cc-c249-4ae4-a5f5-5ca79580b06a nodeName:}" failed. No retries permitted until 2025-07-07 00:19:50.812197845 +0000 UTC m=+33.218374493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-g92ws" (UniqueName: "kubernetes.io/projected/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-kube-api-access-g92ws") pod "calico-apiserver-d455968b6-7vdfw" (UID: "8c8b32cc-c249-4ae4-a5f5-5ca79580b06a") : failed to sync configmap cache: timed out waiting for the condition Jul 7 00:19:50.327700 systemd[1]: run-netns-cni\x2de6f3064e\x2d6e99\x2d6c6a\x2d1964\x2da14d6fce589e.mount: Deactivated successfully. Jul 7 00:19:50.328114 systemd[1]: run-netns-cni\x2d3e79cf43\x2daead\x2db993\x2df64e\x2dee638d9a77c5.mount: Deactivated successfully. Jul 7 00:19:50.328284 systemd[1]: run-netns-cni\x2d1cc17575\x2d1cbd\x2d88eb\x2d0708\x2d8dbe596baa9f.mount: Deactivated successfully. Jul 7 00:19:50.328414 systemd[1]: run-netns-cni\x2d3419e3ce\x2d099f\x2d772e\x2d1c79\x2dfc90db336e70.mount: Deactivated successfully. 
Jul 7 00:19:50.748225 systemd[1]: Created slice kubepods-besteffort-poda5dfbb9f_cbfa_4ab7_b6be_7e804154425e.slice - libcontainer container kubepods-besteffort-poda5dfbb9f_cbfa_4ab7_b6be_7e804154425e.slice. Jul 7 00:19:50.751607 containerd[1548]: time="2025-07-07T00:19:50.751319144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8kl6f,Uid:a5dfbb9f-cbfa-4ab7-b6be-7e804154425e,Namespace:calico-system,Attempt:0,}" Jul 7 00:19:50.808829 containerd[1548]: time="2025-07-07T00:19:50.808755449Z" level=error msg="Failed to destroy network for sandbox \"bd702e4e85af0beee12e2dfc72615659ae4cf9d5fbf488d0cabd98e597d5b84b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:50.811265 systemd[1]: run-netns-cni\x2db1e2dc1d\x2d587c\x2d2c23\x2d5f81\x2da08023dd334a.mount: Deactivated successfully. Jul 7 00:19:50.813803 containerd[1548]: time="2025-07-07T00:19:50.813671842Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8kl6f,Uid:a5dfbb9f-cbfa-4ab7-b6be-7e804154425e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd702e4e85af0beee12e2dfc72615659ae4cf9d5fbf488d0cabd98e597d5b84b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:50.814732 kubelet[2859]: E0707 00:19:50.814603 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd702e4e85af0beee12e2dfc72615659ae4cf9d5fbf488d0cabd98e597d5b84b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 
00:19:50.814732 kubelet[2859]: E0707 00:19:50.814688 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd702e4e85af0beee12e2dfc72615659ae4cf9d5fbf488d0cabd98e597d5b84b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:50.814732 kubelet[2859]: E0707 00:19:50.814709 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd702e4e85af0beee12e2dfc72615659ae4cf9d5fbf488d0cabd98e597d5b84b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8kl6f" Jul 7 00:19:50.815062 kubelet[2859]: E0707 00:19:50.814939 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8kl6f_calico-system(a5dfbb9f-cbfa-4ab7-b6be-7e804154425e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8kl6f_calico-system(a5dfbb9f-cbfa-4ab7-b6be-7e804154425e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bd702e4e85af0beee12e2dfc72615659ae4cf9d5fbf488d0cabd98e597d5b84b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8kl6f" podUID="a5dfbb9f-cbfa-4ab7-b6be-7e804154425e" Jul 7 00:19:50.933000 containerd[1548]: time="2025-07-07T00:19:50.932610198Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-d455968b6-d2496,Uid:f13c4783-fce7-4cf1-ab5f-5ba43426f9fb,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:19:50.954222 containerd[1548]: time="2025-07-07T00:19:50.954177062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7665774675-4v9q6,Uid:b9d9ac66-6750-4558-9996-a65b2992f683,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:19:50.966267 containerd[1548]: time="2025-07-07T00:19:50.966157742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-7vdfw,Uid:8c8b32cc-c249-4ae4-a5f5-5ca79580b06a,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:19:51.010835 containerd[1548]: time="2025-07-07T00:19:51.010625399Z" level=error msg="Failed to destroy network for sandbox \"5d695edc8a89d190d32b0d8b83fbe387097b838d707be5a6fb2f809bdfc76387\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.012481 containerd[1548]: time="2025-07-07T00:19:51.012357811Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-d2496,Uid:f13c4783-fce7-4cf1-ab5f-5ba43426f9fb,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d695edc8a89d190d32b0d8b83fbe387097b838d707be5a6fb2f809bdfc76387\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.013614 kubelet[2859]: E0707 00:19:51.012651 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d695edc8a89d190d32b0d8b83fbe387097b838d707be5a6fb2f809bdfc76387\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.013614 kubelet[2859]: E0707 00:19:51.012742 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d695edc8a89d190d32b0d8b83fbe387097b838d707be5a6fb2f809bdfc76387\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d455968b6-d2496" Jul 7 00:19:51.013614 kubelet[2859]: E0707 00:19:51.012762 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d695edc8a89d190d32b0d8b83fbe387097b838d707be5a6fb2f809bdfc76387\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d455968b6-d2496" Jul 7 00:19:51.013772 kubelet[2859]: E0707 00:19:51.012845 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d455968b6-d2496_calico-apiserver(f13c4783-fce7-4cf1-ab5f-5ba43426f9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-d455968b6-d2496_calico-apiserver(f13c4783-fce7-4cf1-ab5f-5ba43426f9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d695edc8a89d190d32b0d8b83fbe387097b838d707be5a6fb2f809bdfc76387\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d455968b6-d2496" podUID="f13c4783-fce7-4cf1-ab5f-5ba43426f9fb" Jul 7 00:19:51.052701 containerd[1548]: time="2025-07-07T00:19:51.052640557Z" level=error 
msg="Failed to destroy network for sandbox \"1f743fa11d5c7a8095e27b3b59e89ac7d167c8ea3f867cc48521053324d9b24e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.054440 containerd[1548]: time="2025-07-07T00:19:51.054339248Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7665774675-4v9q6,Uid:b9d9ac66-6750-4558-9996-a65b2992f683,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f743fa11d5c7a8095e27b3b59e89ac7d167c8ea3f867cc48521053324d9b24e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.055253 kubelet[2859]: E0707 00:19:51.054846 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f743fa11d5c7a8095e27b3b59e89ac7d167c8ea3f867cc48521053324d9b24e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.055253 kubelet[2859]: E0707 00:19:51.054911 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f743fa11d5c7a8095e27b3b59e89ac7d167c8ea3f867cc48521053324d9b24e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7665774675-4v9q6" Jul 7 00:19:51.055253 kubelet[2859]: E0707 00:19:51.054958 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"1f743fa11d5c7a8095e27b3b59e89ac7d167c8ea3f867cc48521053324d9b24e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7665774675-4v9q6" Jul 7 00:19:51.056136 kubelet[2859]: E0707 00:19:51.055000 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7665774675-4v9q6_calico-apiserver(b9d9ac66-6750-4558-9996-a65b2992f683)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7665774675-4v9q6_calico-apiserver(b9d9ac66-6750-4558-9996-a65b2992f683)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1f743fa11d5c7a8095e27b3b59e89ac7d167c8ea3f867cc48521053324d9b24e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7665774675-4v9q6" podUID="b9d9ac66-6750-4558-9996-a65b2992f683" Jul 7 00:19:51.073521 containerd[1548]: time="2025-07-07T00:19:51.073444535Z" level=error msg="Failed to destroy network for sandbox \"9b2e7cd15295bdea0f36ee2158c25c266ba2763bbf225a10af7699001f20fb65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.076046 containerd[1548]: time="2025-07-07T00:19:51.075883511Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-7vdfw,Uid:8c8b32cc-c249-4ae4-a5f5-5ca79580b06a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b2e7cd15295bdea0f36ee2158c25c266ba2763bbf225a10af7699001f20fb65\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.076704 kubelet[2859]: E0707 00:19:51.076644 2859 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b2e7cd15295bdea0f36ee2158c25c266ba2763bbf225a10af7699001f20fb65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:19:51.076877 kubelet[2859]: E0707 00:19:51.076855 2859 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b2e7cd15295bdea0f36ee2158c25c266ba2763bbf225a10af7699001f20fb65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d455968b6-7vdfw" Jul 7 00:19:51.077075 kubelet[2859]: E0707 00:19:51.077012 2859 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b2e7cd15295bdea0f36ee2158c25c266ba2763bbf225a10af7699001f20fb65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-d455968b6-7vdfw" Jul 7 00:19:51.077207 kubelet[2859]: E0707 00:19:51.077176 2859 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-d455968b6-7vdfw_calico-apiserver(8c8b32cc-c249-4ae4-a5f5-5ca79580b06a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-d455968b6-7vdfw_calico-apiserver(8c8b32cc-c249-4ae4-a5f5-5ca79580b06a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b2e7cd15295bdea0f36ee2158c25c266ba2763bbf225a10af7699001f20fb65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-d455968b6-7vdfw" podUID="8c8b32cc-c249-4ae4-a5f5-5ca79580b06a" Jul 7 00:19:54.118946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2251299481.mount: Deactivated successfully. Jul 7 00:19:54.145979 containerd[1548]: time="2025-07-07T00:19:54.145887215Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:54.146979 containerd[1548]: time="2025-07-07T00:19:54.146914941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Jul 7 00:19:54.149542 containerd[1548]: time="2025-07-07T00:19:54.148983595Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:54.153433 containerd[1548]: time="2025-07-07T00:19:54.153390783Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:54.154581 containerd[1548]: time="2025-07-07T00:19:54.154529550Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", 
size \"152544771\" in 4.248842194s" Jul 7 00:19:54.154658 containerd[1548]: time="2025-07-07T00:19:54.154586111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Jul 7 00:19:54.169416 containerd[1548]: time="2025-07-07T00:19:54.169376005Z" level=info msg="CreateContainer within sandbox \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 7 00:19:54.186189 containerd[1548]: time="2025-07-07T00:19:54.183425655Z" level=info msg="Container fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:19:54.199868 containerd[1548]: time="2025-07-07T00:19:54.199716079Z" level=info msg="CreateContainer within sandbox \"e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\"" Jul 7 00:19:54.201188 containerd[1548]: time="2025-07-07T00:19:54.201128648Z" level=info msg="StartContainer for \"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\"" Jul 7 00:19:54.204910 containerd[1548]: time="2025-07-07T00:19:54.204846512Z" level=info msg="connecting to shim fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212" address="unix:///run/containerd/s/7dd369a16bcb4762e0966b3b239f7b3712c2ea27903d486031e0ae7f864dfb49" protocol=ttrpc version=3 Jul 7 00:19:54.254831 systemd[1]: Started cri-containerd-fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212.scope - libcontainer container fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212. 
Jul 7 00:19:54.322416 containerd[1548]: time="2025-07-07T00:19:54.322352105Z" level=info msg="StartContainer for \"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" returns successfully" Jul 7 00:19:54.473749 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 7 00:19:54.473876 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jul 7 00:19:54.707038 kubelet[2859]: I0707 00:19:54.706980 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5f7m\" (UniqueName: \"kubernetes.io/projected/e4fd678b-b655-42d4-b25e-647c02f2ca21-kube-api-access-n5f7m\") pod \"e4fd678b-b655-42d4-b25e-647c02f2ca21\" (UID: \"e4fd678b-b655-42d4-b25e-647c02f2ca21\") " Jul 7 00:19:54.707441 kubelet[2859]: I0707 00:19:54.707060 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-backend-key-pair\") pod \"e4fd678b-b655-42d4-b25e-647c02f2ca21\" (UID: \"e4fd678b-b655-42d4-b25e-647c02f2ca21\") " Jul 7 00:19:54.707441 kubelet[2859]: I0707 00:19:54.707090 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-ca-bundle\") pod \"e4fd678b-b655-42d4-b25e-647c02f2ca21\" (UID: \"e4fd678b-b655-42d4-b25e-647c02f2ca21\") " Jul 7 00:19:54.708228 kubelet[2859]: I0707 00:19:54.707676 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e4fd678b-b655-42d4-b25e-647c02f2ca21" (UID: "e4fd678b-b655-42d4-b25e-647c02f2ca21"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 7 00:19:54.711798 kubelet[2859]: I0707 00:19:54.711730 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4fd678b-b655-42d4-b25e-647c02f2ca21-kube-api-access-n5f7m" (OuterVolumeSpecName: "kube-api-access-n5f7m") pod "e4fd678b-b655-42d4-b25e-647c02f2ca21" (UID: "e4fd678b-b655-42d4-b25e-647c02f2ca21"). InnerVolumeSpecName "kube-api-access-n5f7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 7 00:19:54.712362 kubelet[2859]: I0707 00:19:54.712307 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e4fd678b-b655-42d4-b25e-647c02f2ca21" (UID: "e4fd678b-b655-42d4-b25e-647c02f2ca21"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 7 00:19:54.809470 kubelet[2859]: I0707 00:19:54.808725 2859 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-ca-bundle\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:19:54.809470 kubelet[2859]: I0707 00:19:54.808767 2859 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5f7m\" (UniqueName: \"kubernetes.io/projected/e4fd678b-b655-42d4-b25e-647c02f2ca21-kube-api-access-n5f7m\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:19:54.809470 kubelet[2859]: I0707 00:19:54.808779 2859 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e4fd678b-b655-42d4-b25e-647c02f2ca21-whisker-backend-key-pair\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:19:54.940745 systemd[1]: Removed slice kubepods-besteffort-pode4fd678b_b655_42d4_b25e_647c02f2ca21.slice - libcontainer 
container kubepods-besteffort-pode4fd678b_b655_42d4_b25e_647c02f2ca21.slice. Jul 7 00:19:54.969571 kubelet[2859]: I0707 00:19:54.969381 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-565mc" podStartSLOduration=1.363776266 podStartE2EDuration="13.969362887s" podCreationTimestamp="2025-07-07 00:19:41 +0000 UTC" firstStartedPulling="2025-07-07 00:19:41.550230377 +0000 UTC m=+23.956406945" lastFinishedPulling="2025-07-07 00:19:54.155816918 +0000 UTC m=+36.561993566" observedRunningTime="2025-07-07 00:19:54.969149846 +0000 UTC m=+37.375326454" watchObservedRunningTime="2025-07-07 00:19:54.969362887 +0000 UTC m=+37.375539495" Jul 7 00:19:55.049271 systemd[1]: Created slice kubepods-besteffort-podd363cc9f_9da5_4684_a811_c468f6859573.slice - libcontainer container kubepods-besteffort-podd363cc9f_9da5_4684_a811_c468f6859573.slice. Jul 7 00:19:55.111347 kubelet[2859]: I0707 00:19:55.111250 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2cfk\" (UniqueName: \"kubernetes.io/projected/d363cc9f-9da5-4684-a811-c468f6859573-kube-api-access-k2cfk\") pod \"whisker-6fb7b65647-6f9xm\" (UID: \"d363cc9f-9da5-4684-a811-c468f6859573\") " pod="calico-system/whisker-6fb7b65647-6f9xm" Jul 7 00:19:55.111347 kubelet[2859]: I0707 00:19:55.111333 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d363cc9f-9da5-4684-a811-c468f6859573-whisker-ca-bundle\") pod \"whisker-6fb7b65647-6f9xm\" (UID: \"d363cc9f-9da5-4684-a811-c468f6859573\") " pod="calico-system/whisker-6fb7b65647-6f9xm" Jul 7 00:19:55.111663 kubelet[2859]: I0707 00:19:55.111384 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d363cc9f-9da5-4684-a811-c468f6859573-whisker-backend-key-pair\") 
pod \"whisker-6fb7b65647-6f9xm\" (UID: \"d363cc9f-9da5-4684-a811-c468f6859573\") " pod="calico-system/whisker-6fb7b65647-6f9xm" Jul 7 00:19:55.120175 systemd[1]: var-lib-kubelet-pods-e4fd678b\x2db655\x2d42d4\x2db25e\x2d647c02f2ca21-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dn5f7m.mount: Deactivated successfully. Jul 7 00:19:55.120350 systemd[1]: var-lib-kubelet-pods-e4fd678b\x2db655\x2d42d4\x2db25e\x2d647c02f2ca21-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 7 00:19:55.355474 containerd[1548]: time="2025-07-07T00:19:55.355346894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fb7b65647-6f9xm,Uid:d363cc9f-9da5-4684-a811-c468f6859573,Namespace:calico-system,Attempt:0,}" Jul 7 00:19:55.567344 systemd-networkd[1414]: calid5c00dd35d6: Link UP Jul 7 00:19:55.567615 systemd-networkd[1414]: calid5c00dd35d6: Gained carrier Jul 7 00:19:55.599250 containerd[1548]: 2025-07-07 00:19:55.384 [INFO][3948] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:19:55.599250 containerd[1548]: 2025-07-07 00:19:55.425 [INFO][3948] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0 whisker-6fb7b65647- calico-system d363cc9f-9da5-4684-a811-c468f6859573 919 0 2025-07-07 00:19:55 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6fb7b65647 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a whisker-6fb7b65647-6f9xm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid5c00dd35d6 [] [] }} ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-" 
Jul 7 00:19:55.599250 containerd[1548]: 2025-07-07 00:19:55.425 [INFO][3948] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.599250 containerd[1548]: 2025-07-07 00:19:55.479 [INFO][3960] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" HandleID="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Workload="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.479 [INFO][3960] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" HandleID="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Workload="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-1-1232b7205a", "pod":"whisker-6fb7b65647-6f9xm", "timestamp":"2025-07-07 00:19:55.479088238 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.479 [INFO][3960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.479 [INFO][3960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.479 [INFO][3960] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a' Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.493 [INFO][3960] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.506 [INFO][3960] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.518 [INFO][3960] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.520 [INFO][3960] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.599707 containerd[1548]: 2025-07-07 00:19:55.525 [INFO][3960] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.526 [INFO][3960] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.530 [INFO][3960] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.540 [INFO][3960] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.548 [INFO][3960] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.99.129/26] block=192.168.99.128/26 handle="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.549 [INFO][3960] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.129/26] handle="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.549 [INFO][3960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:19:55.600064 containerd[1548]: 2025-07-07 00:19:55.549 [INFO][3960] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.129/26] IPv6=[] ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" HandleID="k8s-pod-network.fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Workload="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.600370 containerd[1548]: 2025-07-07 00:19:55.553 [INFO][3948] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0", GenerateName:"whisker-6fb7b65647-", Namespace:"calico-system", SelfLink:"", UID:"d363cc9f-9da5-4684-a811-c468f6859573", ResourceVersion:"919", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6fb7b65647", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"whisker-6fb7b65647-6f9xm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.99.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid5c00dd35d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:19:55.600370 containerd[1548]: 2025-07-07 00:19:55.553 [INFO][3948] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.129/32] ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.600731 containerd[1548]: 2025-07-07 00:19:55.553 [INFO][3948] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid5c00dd35d6 ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.600731 containerd[1548]: 2025-07-07 00:19:55.570 [INFO][3948] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.600784 containerd[1548]: 2025-07-07 00:19:55.573 [INFO][3948] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0", GenerateName:"whisker-6fb7b65647-", Namespace:"calico-system", SelfLink:"", UID:"d363cc9f-9da5-4684-a811-c468f6859573", ResourceVersion:"919", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6fb7b65647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b", Pod:"whisker-6fb7b65647-6f9xm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.99.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid5c00dd35d6", MAC:"2e:13:bc:1c:57:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:19:55.600841 containerd[1548]: 2025-07-07 00:19:55.594 [INFO][3948] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" Namespace="calico-system" Pod="whisker-6fb7b65647-6f9xm" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-whisker--6fb7b65647--6f9xm-eth0" Jul 7 00:19:55.635607 containerd[1548]: time="2025-07-07T00:19:55.635246068Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"6b5f959d14d76d609bae83a94a407e3d479e3eadf003bd2cd3bd9f93a7fe4a12\" pid:3980 exit_status:1 exited_at:{seconds:1751847595 nanos:634417582}" Jul 7 00:19:55.657827 containerd[1548]: time="2025-07-07T00:19:55.657750210Z" level=info msg="connecting to shim fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b" address="unix:///run/containerd/s/36dd193ae584e348bd91a93aa32329b0221d2bf7f726d327e4cfcbb95fcd750a" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:19:55.697740 systemd[1]: Started cri-containerd-fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b.scope - libcontainer container fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b. 
Jul 7 00:19:55.745116 containerd[1548]: time="2025-07-07T00:19:55.745001763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6fb7b65647-6f9xm,Uid:d363cc9f-9da5-4684-a811-c468f6859573,Namespace:calico-system,Attempt:0,} returns sandbox id \"fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b\"" Jul 7 00:19:55.747299 kubelet[2859]: I0707 00:19:55.747249 2859 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4fd678b-b655-42d4-b25e-647c02f2ca21" path="/var/lib/kubelet/pods/e4fd678b-b655-42d4-b25e-647c02f2ca21/volumes" Jul 7 00:19:55.750177 containerd[1548]: time="2025-07-07T00:19:55.749198789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 7 00:19:55.774641 containerd[1548]: time="2025-07-07T00:19:55.774585830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"119b8211dfb76aaece725adbb450d8cdac026c739a6cbceae6ff9e686a00c2da\" pid:4039 exit_status:1 exited_at:{seconds:1751847595 nanos:774132027}" Jul 7 00:19:56.032188 containerd[1548]: time="2025-07-07T00:19:56.031715537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"a1e086548a27503162bb03065b7255745e076f0bc9be6438e1331ace2aa7e495\" pid:4079 exit_status:1 exited_at:{seconds:1751847596 nanos:31310255}" Jul 7 00:19:57.038207 containerd[1548]: time="2025-07-07T00:19:57.038142165Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"253074b14e3619a1d52d292bdb27075979e8e59e5d6bad10ebf78a229953ecf9\" pid:4194 exit_status:1 exited_at:{seconds:1751847597 nanos:37746443}" Jul 7 00:19:57.458240 containerd[1548]: time="2025-07-07T00:19:57.458193933Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jul 7 00:19:57.475172 containerd[1548]: time="2025-07-07T00:19:57.460593907Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Jul 7 00:19:57.475172 containerd[1548]: time="2025-07-07T00:19:57.462563360Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:57.476165 containerd[1548]: time="2025-07-07T00:19:57.466589185Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.717348115s" Jul 7 00:19:57.476165 containerd[1548]: time="2025-07-07T00:19:57.475649601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Jul 7 00:19:57.477006 containerd[1548]: time="2025-07-07T00:19:57.476964969Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:19:57.480436 containerd[1548]: time="2025-07-07T00:19:57.480381630Z" level=info msg="CreateContainer within sandbox \"fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 7 00:19:57.502142 containerd[1548]: time="2025-07-07T00:19:57.502096125Z" level=info msg="Container 319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:19:57.510215 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3001338531.mount: Deactivated successfully. Jul 7 00:19:57.514120 systemd-networkd[1414]: calid5c00dd35d6: Gained IPv6LL Jul 7 00:19:57.521851 containerd[1548]: time="2025-07-07T00:19:57.521764527Z" level=info msg="CreateContainer within sandbox \"fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe\"" Jul 7 00:19:57.522674 containerd[1548]: time="2025-07-07T00:19:57.522592532Z" level=info msg="StartContainer for \"319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe\"" Jul 7 00:19:57.525784 containerd[1548]: time="2025-07-07T00:19:57.525739992Z" level=info msg="connecting to shim 319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe" address="unix:///run/containerd/s/36dd193ae584e348bd91a93aa32329b0221d2bf7f726d327e4cfcbb95fcd750a" protocol=ttrpc version=3 Jul 7 00:19:57.554204 systemd[1]: Started cri-containerd-319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe.scope - libcontainer container 319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe. Jul 7 00:19:57.648806 containerd[1548]: time="2025-07-07T00:19:57.648749435Z" level=info msg="StartContainer for \"319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe\" returns successfully" Jul 7 00:19:57.650886 containerd[1548]: time="2025-07-07T00:19:57.650835848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 7 00:20:00.603961 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2391026549.mount: Deactivated successfully. 
Jul 7 00:20:00.626228 containerd[1548]: time="2025-07-07T00:20:00.626126984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:20:00.628279 containerd[1548]: time="2025-07-07T00:20:00.627900115Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Jul 7 00:20:00.629756 containerd[1548]: time="2025-07-07T00:20:00.629686405Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:20:00.632323 containerd[1548]: time="2025-07-07T00:20:00.632226101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:20:00.633062 containerd[1548]: time="2025-07-07T00:20:00.633016825Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 2.982135576s" Jul 7 00:20:00.633062 containerd[1548]: time="2025-07-07T00:20:00.633057786Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Jul 7 00:20:00.637568 containerd[1548]: time="2025-07-07T00:20:00.636422806Z" level=info msg="CreateContainer within sandbox \"fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 7 00:20:00.647341 
containerd[1548]: time="2025-07-07T00:20:00.647293992Z" level=info msg="Container 55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:20:00.659663 containerd[1548]: time="2025-07-07T00:20:00.659611906Z" level=info msg="CreateContainer within sandbox \"fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f\"" Jul 7 00:20:00.663515 containerd[1548]: time="2025-07-07T00:20:00.663445649Z" level=info msg="StartContainer for \"55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f\"" Jul 7 00:20:00.666000 containerd[1548]: time="2025-07-07T00:20:00.665940664Z" level=info msg="connecting to shim 55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f" address="unix:///run/containerd/s/36dd193ae584e348bd91a93aa32329b0221d2bf7f726d327e4cfcbb95fcd750a" protocol=ttrpc version=3 Jul 7 00:20:00.704912 systemd[1]: Started cri-containerd-55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f.scope - libcontainer container 55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f. 
Jul 7 00:20:00.774450 containerd[1548]: time="2025-07-07T00:20:00.774378678Z" level=info msg="StartContainer for \"55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f\" returns successfully" Jul 7 00:20:01.742544 containerd[1548]: time="2025-07-07T00:20:01.742416594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b677dccf-swp4r,Uid:06bee497-80f4-43cb-b2d9-67d6f0846b05,Namespace:calico-system,Attempt:0,}" Jul 7 00:20:01.744835 containerd[1548]: time="2025-07-07T00:20:01.744791128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-42w27,Uid:d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00,Namespace:kube-system,Attempt:0,}" Jul 7 00:20:01.970726 systemd-networkd[1414]: caliebc2531796f: Link UP Jul 7 00:20:01.977539 systemd-networkd[1414]: caliebc2531796f: Gained carrier Jul 7 00:20:02.007173 kubelet[2859]: I0707 00:20:02.004462 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6fb7b65647-6f9xm" podStartSLOduration=2.11841311 podStartE2EDuration="7.004440359s" podCreationTimestamp="2025-07-07 00:19:55 +0000 UTC" firstStartedPulling="2025-07-07 00:19:55.748799147 +0000 UTC m=+38.154975795" lastFinishedPulling="2025-07-07 00:20:00.634826436 +0000 UTC m=+43.041003044" observedRunningTime="2025-07-07 00:20:00.993286638 +0000 UTC m=+43.399463286" watchObservedRunningTime="2025-07-07 00:20:02.004440359 +0000 UTC m=+44.410616967" Jul 7 00:20:02.012181 containerd[1548]: 2025-07-07 00:20:01.799 [INFO][4381] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:20:02.012181 containerd[1548]: 2025-07-07 00:20:01.825 [INFO][4381] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0 coredns-7c65d6cfc9- kube-system d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00 851 0 2025-07-07 00:19:22 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a coredns-7c65d6cfc9-42w27 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliebc2531796f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-" Jul 7 00:20:02.012181 containerd[1548]: 2025-07-07 00:20:01.825 [INFO][4381] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.012181 containerd[1548]: 2025-07-07 00:20:01.889 [INFO][4399] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" HandleID="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Workload="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.889 [INFO][4399] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" HandleID="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Workload="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3550), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-1-1-1-1232b7205a", "pod":"coredns-7c65d6cfc9-42w27", "timestamp":"2025-07-07 00:20:01.889277471 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.889 [INFO][4399] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.889 [INFO][4399] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.889 [INFO][4399] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a' Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.905 [INFO][4399] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.913 [INFO][4399] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.924 [INFO][4399] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.927 [INFO][4399] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012710 containerd[1548]: 2025-07-07 00:20:01.936 [INFO][4399] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.936 [INFO][4399] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.939 [INFO][4399] ipam/ipam.go 1764: Creating new 
handle: k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732 Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.945 [INFO][4399] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.954 [INFO][4399] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.99.130/26] block=192.168.99.128/26 handle="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.954 [INFO][4399] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.130/26] handle="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.954 [INFO][4399] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 7 00:20:02.012971 containerd[1548]: 2025-07-07 00:20:01.954 [INFO][4399] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.130/26] IPv6=[] ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" HandleID="k8s-pod-network.7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Workload="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.013143 containerd[1548]: 2025-07-07 00:20:01.959 [INFO][4381] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"coredns-7c65d6cfc9-42w27", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.99.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"caliebc2531796f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:02.013143 containerd[1548]: 2025-07-07 00:20:01.959 [INFO][4381] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.130/32] ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.013143 containerd[1548]: 2025-07-07 00:20:01.959 [INFO][4381] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliebc2531796f ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.013143 containerd[1548]: 2025-07-07 00:20:01.975 [INFO][4381] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.013143 containerd[1548]: 2025-07-07 00:20:01.981 [INFO][4381] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" 
WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732", Pod:"coredns-7c65d6cfc9-42w27", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.99.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliebc2531796f", MAC:"b2:ff:00:67:19:0d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:02.013143 
containerd[1548]: 2025-07-07 00:20:02.005 [INFO][4381] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" Namespace="kube-system" Pod="coredns-7c65d6cfc9-42w27" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--42w27-eth0" Jul 7 00:20:02.061008 containerd[1548]: time="2025-07-07T00:20:02.060793892Z" level=info msg="connecting to shim 7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732" address="unix:///run/containerd/s/8eac3f9a16a271eada35fcedc8b2e34822e57d8bb664d064f2fc7edb5cc33458" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:20:02.100565 systemd-networkd[1414]: cali64484f4b1b5: Link UP Jul 7 00:20:02.103705 systemd-networkd[1414]: cali64484f4b1b5: Gained carrier Jul 7 00:20:02.130821 systemd[1]: Started cri-containerd-7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732.scope - libcontainer container 7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732. 
Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.820 [INFO][4369] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.849 [INFO][4369] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0 calico-kube-controllers-59b677dccf- calico-system 06bee497-80f4-43cb-b2d9-67d6f0846b05 852 0 2025-07-07 00:19:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59b677dccf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a calico-kube-controllers-59b677dccf-swp4r eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali64484f4b1b5 [] [] }} ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.850 [INFO][4369] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.922 [INFO][4406] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" HandleID="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" 
Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.923 [INFO][4406] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" HandleID="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024aff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-1-1232b7205a", "pod":"calico-kube-controllers-59b677dccf-swp4r", "timestamp":"2025-07-07 00:20:01.922740031 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.923 [INFO][4406] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.954 [INFO][4406] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:01.955 [INFO][4406] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a' Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.009 [INFO][4406] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.020 [INFO][4406] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.046 [INFO][4406] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.050 [INFO][4406] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.054 [INFO][4406] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.055 [INFO][4406] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.058 [INFO][4406] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.070 [INFO][4406] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.087 [INFO][4406] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.99.131/26] block=192.168.99.128/26 handle="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.087 [INFO][4406] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.131/26] handle="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.087 [INFO][4406] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:20:02.140717 containerd[1548]: 2025-07-07 00:20:02.087 [INFO][4406] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.131/26] IPv6=[] ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" HandleID="k8s-pod-network.b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.141312 containerd[1548]: 2025-07-07 00:20:02.093 [INFO][4369] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0", GenerateName:"calico-kube-controllers-59b677dccf-", Namespace:"calico-system", SelfLink:"", UID:"06bee497-80f4-43cb-b2d9-67d6f0846b05", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59b677dccf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"calico-kube-controllers-59b677dccf-swp4r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.99.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali64484f4b1b5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:02.141312 containerd[1548]: 2025-07-07 00:20:02.093 [INFO][4369] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.131/32] ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.141312 containerd[1548]: 2025-07-07 00:20:02.093 [INFO][4369] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali64484f4b1b5 ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.141312 containerd[1548]: 2025-07-07 00:20:02.106 [INFO][4369] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.141312 containerd[1548]: 2025-07-07 00:20:02.109 [INFO][4369] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0", GenerateName:"calico-kube-controllers-59b677dccf-", Namespace:"calico-system", SelfLink:"", UID:"06bee497-80f4-43cb-b2d9-67d6f0846b05", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59b677dccf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd", Pod:"calico-kube-controllers-59b677dccf-swp4r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.99.131/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali64484f4b1b5", MAC:"ee:a7:39:9d:a9:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:02.141312 containerd[1548]: 2025-07-07 00:20:02.136 [INFO][4369] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" Namespace="calico-system" Pod="calico-kube-controllers-59b677dccf-swp4r" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--kube--controllers--59b677dccf--swp4r-eth0" Jul 7 00:20:02.176411 containerd[1548]: time="2025-07-07T00:20:02.176352536Z" level=info msg="connecting to shim b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd" address="unix:///run/containerd/s/6dd83b4432a666f414e9dd6624f3646ca1e7df322078b244c3c11ea8208234c7" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:20:02.224158 containerd[1548]: time="2025-07-07T00:20:02.224092099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-42w27,Uid:d39c4f8c-ed03-4a96-9d5f-5df5a6d7eb00,Namespace:kube-system,Attempt:0,} returns sandbox id \"7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732\"" Jul 7 00:20:02.227899 systemd[1]: Started cri-containerd-b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd.scope - libcontainer container b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd. 
Jul 7 00:20:02.229310 containerd[1548]: time="2025-07-07T00:20:02.228548365Z" level=info msg="CreateContainer within sandbox \"7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 7 00:20:02.260583 containerd[1548]: time="2025-07-07T00:20:02.260525875Z" level=info msg="Container 5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:20:02.270738 containerd[1548]: time="2025-07-07T00:20:02.270673935Z" level=info msg="CreateContainer within sandbox \"7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909\"" Jul 7 00:20:02.271922 containerd[1548]: time="2025-07-07T00:20:02.271875302Z" level=info msg="StartContainer for \"5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909\"" Jul 7 00:20:02.274262 containerd[1548]: time="2025-07-07T00:20:02.274225796Z" level=info msg="connecting to shim 5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909" address="unix:///run/containerd/s/8eac3f9a16a271eada35fcedc8b2e34822e57d8bb664d064f2fc7edb5cc33458" protocol=ttrpc version=3 Jul 7 00:20:02.300719 systemd[1]: Started cri-containerd-5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909.scope - libcontainer container 5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909. 
Jul 7 00:20:02.347840 containerd[1548]: time="2025-07-07T00:20:02.347791591Z" level=info msg="StartContainer for \"5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909\" returns successfully" Jul 7 00:20:02.359291 containerd[1548]: time="2025-07-07T00:20:02.359202739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59b677dccf-swp4r,Uid:06bee497-80f4-43cb-b2d9-67d6f0846b05,Namespace:calico-system,Attempt:0,} returns sandbox id \"b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd\"" Jul 7 00:20:02.362776 containerd[1548]: time="2025-07-07T00:20:02.362629479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 7 00:20:02.741902 containerd[1548]: time="2025-07-07T00:20:02.741659923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-7vdfw,Uid:8c8b32cc-c249-4ae4-a5f5-5ca79580b06a,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:20:02.741902 containerd[1548]: time="2025-07-07T00:20:02.741732444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8kl6f,Uid:a5dfbb9f-cbfa-4ab7-b6be-7e804154425e,Namespace:calico-system,Attempt:0,}" Jul 7 00:20:02.934186 systemd-networkd[1414]: caliecacba4ccf2: Link UP Jul 7 00:20:02.934427 systemd-networkd[1414]: caliecacba4ccf2: Gained carrier Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.792 [INFO][4577] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.810 [INFO][4577] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0 csi-node-driver- calico-system a5dfbb9f-cbfa-4ab7-b6be-7e804154425e 749 0 2025-07-07 00:19:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a csi-node-driver-8kl6f eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliecacba4ccf2 [] [] }} ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.811 [INFO][4577] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.848 [INFO][4595] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" HandleID="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Workload="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.848 [INFO][4595] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" HandleID="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Workload="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-1-1232b7205a", "pod":"csi-node-driver-8kl6f", "timestamp":"2025-07-07 00:20:02.848326155 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.848 [INFO][4595] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.848 [INFO][4595] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.848 [INFO][4595] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a' Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.874 [INFO][4595] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.880 [INFO][4595] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.888 [INFO][4595] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.892 [INFO][4595] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.896 [INFO][4595] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.896 [INFO][4595] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.898 [INFO][4595] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893 Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.906 [INFO][4595] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.916 [INFO][4595] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.99.132/26] block=192.168.99.128/26 handle="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.917 [INFO][4595] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.132/26] handle="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.917 [INFO][4595] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 7 00:20:02.961832 containerd[1548]: 2025-07-07 00:20:02.917 [INFO][4595] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.132/26] IPv6=[] ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" HandleID="k8s-pod-network.9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Workload="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.963922 containerd[1548]: 2025-07-07 00:20:02.920 [INFO][4577] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"csi-node-driver-8kl6f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.99.132/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliecacba4ccf2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:02.963922 containerd[1548]: 2025-07-07 00:20:02.920 [INFO][4577] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.132/32] ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.963922 containerd[1548]: 2025-07-07 00:20:02.920 [INFO][4577] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliecacba4ccf2 ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.963922 containerd[1548]: 2025-07-07 00:20:02.934 [INFO][4577] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.963922 containerd[1548]: 2025-07-07 00:20:02.937 [INFO][4577] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0", GenerateName:"csi-node-driver-", 
Namespace:"calico-system", SelfLink:"", UID:"a5dfbb9f-cbfa-4ab7-b6be-7e804154425e", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893", Pod:"csi-node-driver-8kl6f", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.99.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliecacba4ccf2", MAC:"fe:f8:e7:4b:45:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:02.963922 containerd[1548]: 2025-07-07 00:20:02.955 [INFO][4577] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" Namespace="calico-system" Pod="csi-node-driver-8kl6f" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-csi--node--driver--8kl6f-eth0" Jul 7 00:20:02.994802 containerd[1548]: time="2025-07-07T00:20:02.993838496Z" level=info msg="connecting to shim 9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893" address="unix:///run/containerd/s/c1d679d209ca5ad11b765616a5b0126c1b4e6028ae3a5997e655f0dbdf3401fc" 
namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:20:03.036902 systemd[1]: Started cri-containerd-9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893.scope - libcontainer container 9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893. Jul 7 00:20:03.075108 systemd-networkd[1414]: calia958e1aa8da: Link UP Jul 7 00:20:03.077448 systemd-networkd[1414]: calia958e1aa8da: Gained carrier Jul 7 00:20:03.117898 kubelet[2859]: I0707 00:20:03.117835 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-42w27" podStartSLOduration=41.117805424 podStartE2EDuration="41.117805424s" podCreationTimestamp="2025-07-07 00:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:20:03.052083598 +0000 UTC m=+45.458260206" watchObservedRunningTime="2025-07-07 00:20:03.117805424 +0000 UTC m=+45.523982032" Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.792 [INFO][4567] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.824 [INFO][4567] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0 calico-apiserver-d455968b6- calico-apiserver 8c8b32cc-c249-4ae4-a5f5-5ca79580b06a 854 0 2025-07-07 00:19:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d455968b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a calico-apiserver-d455968b6-7vdfw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia958e1aa8da [] [] }} 
ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-" Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.824 [INFO][4567] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.865 [INFO][4600] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.866 [INFO][4600] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab490), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-1-1232b7205a", "pod":"calico-apiserver-d455968b6-7vdfw", "timestamp":"2025-07-07 00:20:02.865769418 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.866 [INFO][4600] ipam/ipam_plugin.go 
353: About to acquire host-wide IPAM lock.
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.917 [INFO][4600] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.918 [INFO][4600] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a'
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.975 [INFO][4600] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:02.990 [INFO][4600] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.005 [INFO][4600] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.013 [INFO][4600] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.021 [INFO][4600] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.022 [INFO][4600] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.033 [INFO][4600] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.044 [INFO][4600] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.063 [INFO][4600] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.99.133/26] block=192.168.99.128/26 handle="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.063 [INFO][4600] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.133/26] handle="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.064 [INFO][4600] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 7 00:20:03.122457 containerd[1548]: 2025-07-07 00:20:03.064 [INFO][4600] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.133/26] IPv6=[] ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0"
Jul 7 00:20:03.124411 containerd[1548]: 2025-07-07 00:20:03.069 [INFO][4567] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0", GenerateName:"calico-apiserver-d455968b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d455968b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"calico-apiserver-d455968b6-7vdfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia958e1aa8da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 7 00:20:03.124411 containerd[1548]: 2025-07-07 00:20:03.070 [INFO][4567] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.133/32] ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0"
Jul 7 00:20:03.124411 containerd[1548]: 2025-07-07 00:20:03.070 [INFO][4567] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia958e1aa8da ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0"
Jul 7 00:20:03.124411 containerd[1548]: 2025-07-07 00:20:03.078 [INFO][4567] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0"
Jul 7 00:20:03.124411 containerd[1548]: 2025-07-07 00:20:03.086 [INFO][4567] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0", GenerateName:"calico-apiserver-d455968b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d455968b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0", Pod:"calico-apiserver-d455968b6-7vdfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia958e1aa8da", MAC:"b6:c8:5b:5b:58:d7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 7 00:20:03.124411 containerd[1548]: 2025-07-07 00:20:03.115 [INFO][4567] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-7vdfw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0"
Jul 7 00:20:03.153117 containerd[1548]: time="2025-07-07T00:20:03.152999551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8kl6f,Uid:a5dfbb9f-cbfa-4ab7-b6be-7e804154425e,Namespace:calico-system,Attempt:0,} returns sandbox id \"9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893\""
Jul 7 00:20:03.191284 containerd[1548]: time="2025-07-07T00:20:03.191143494Z" level=info msg="connecting to shim d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" address="unix:///run/containerd/s/217cb38f65f4bf8f71386f52e9e264ef59302207d2c640a1c39acc85be208eb6" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:20:03.233770 systemd[1]: Started cri-containerd-d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0.scope - libcontainer container d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0.
Jul 7 00:20:03.316245 containerd[1548]: time="2025-07-07T00:20:03.316170668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-7vdfw,Uid:8c8b32cc-c249-4ae4-a5f5-5ca79580b06a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\""
Jul 7 00:20:03.911015 systemd-networkd[1414]: cali64484f4b1b5: Gained IPv6LL
Jul 7 00:20:04.040021 systemd-networkd[1414]: caliebc2531796f: Gained IPv6LL
Jul 7 00:20:04.102717 systemd-networkd[1414]: caliecacba4ccf2: Gained IPv6LL
Jul 7 00:20:04.167260 systemd-networkd[1414]: calia958e1aa8da: Gained IPv6LL
Jul 7 00:20:04.742327 containerd[1548]: time="2025-07-07T00:20:04.741765396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-f5hxb,Uid:3f991e28-7349-433d-8fb6-e8dea1fb5652,Namespace:calico-system,Attempt:0,}"
Jul 7 00:20:04.742823 containerd[1548]: time="2025-07-07T00:20:04.742358680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hkcx5,Uid:c25c4d92-5234-44bc-85a4-f19c76ed4997,Namespace:kube-system,Attempt:0,}"
Jul 7 00:20:04.743154 containerd[1548]: time="2025-07-07T00:20:04.743066684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-d2496,Uid:f13c4783-fce7-4cf1-ab5f-5ba43426f9fb,Namespace:calico-apiserver,Attempt:0,}"
Jul 7 00:20:05.020026 systemd-networkd[1414]: cali6e35759f466: Link UP
Jul 7 00:20:05.021753 systemd-networkd[1414]: cali6e35759f466: Gained carrier
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.817 [INFO][4767] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.849 [INFO][4767] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0 coredns-7c65d6cfc9- kube-system c25c4d92-5234-44bc-85a4-f19c76ed4997 842 0 2025-07-07 00:19:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a coredns-7c65d6cfc9-hkcx5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6e35759f466 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.849 [INFO][4767] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.921 [INFO][4798] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" HandleID="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Workload="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.921 [INFO][4798] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" HandleID="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Workload="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d39a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-1-1-1-1232b7205a", "pod":"coredns-7c65d6cfc9-hkcx5", "timestamp":"2025-07-07 00:20:04.921552242 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.922 [INFO][4798] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.922 [INFO][4798] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.922 [INFO][4798] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a'
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.937 [INFO][4798] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.945 [INFO][4798] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.955 [INFO][4798] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.960 [INFO][4798] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.966 [INFO][4798] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.966 [INFO][4798] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.971 [INFO][4798] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:04.985 [INFO][4798] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:05.002 [INFO][4798] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.99.134/26] block=192.168.99.128/26 handle="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:05.002 [INFO][4798] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.134/26] handle="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:05.003 [INFO][4798] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 7 00:20:05.050761 containerd[1548]: 2025-07-07 00:20:05.005 [INFO][4798] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.134/26] IPv6=[] ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" HandleID="k8s-pod-network.58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Workload="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.052670 containerd[1548]: 2025-07-07 00:20:05.008 [INFO][4767] cni-plugin/k8s.go 418: Populated endpoint ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"c25c4d92-5234-44bc-85a4-f19c76ed4997", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 22, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"coredns-7c65d6cfc9-hkcx5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.99.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6e35759f466", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 7 00:20:05.052670 containerd[1548]: 2025-07-07 00:20:05.009 [INFO][4767] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.134/32] ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.052670 containerd[1548]: 2025-07-07 00:20:05.009 [INFO][4767] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6e35759f466 ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.052670 containerd[1548]: 2025-07-07 00:20:05.023 [INFO][4767] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.052670 containerd[1548]: 2025-07-07 00:20:05.025 [INFO][4767] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"c25c4d92-5234-44bc-85a4-f19c76ed4997", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 22, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3", Pod:"coredns-7c65d6cfc9-hkcx5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.99.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6e35759f466", MAC:"46:e3:45:ca:71:22", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 7 00:20:05.052670 containerd[1548]: 2025-07-07 00:20:05.043 [INFO][4767] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hkcx5" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-coredns--7c65d6cfc9--hkcx5-eth0"
Jul 7 00:20:05.134645 systemd-networkd[1414]: calied033809da1: Link UP
Jul 7 00:20:05.137457 systemd-networkd[1414]: calied033809da1: Gained carrier
Jul 7 00:20:05.144595 containerd[1548]: time="2025-07-07T00:20:05.142081398Z" level=info msg="connecting to shim 58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3" address="unix:///run/containerd/s/bb388429a03d96a43e8705c9507ccff7689b2da49849c0891424191d97d60ef8" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:04.818 [INFO][4776] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:04.845 [INFO][4776] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0 calico-apiserver-d455968b6- calico-apiserver f13c4783-fce7-4cf1-ab5f-5ba43426f9fb 856 0 2025-07-07 00:19:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:d455968b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a calico-apiserver-d455968b6-d2496 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calied033809da1 [] [] }} ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:04.849 [INFO][4776] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:04.932 [INFO][4796] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:04.936 [INFO][4796] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c560), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-1-1232b7205a", "pod":"calico-apiserver-d455968b6-d2496", "timestamp":"2025-07-07 00:20:04.932289025 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:04.936 [INFO][4796] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.003 [INFO][4796] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.003 [INFO][4796] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a'
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.038 [INFO][4796] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.054 [INFO][4796] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.069 [INFO][4796] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.076 [INFO][4796] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.080 [INFO][4796] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.080 [INFO][4796] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.083 [INFO][4796] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.091 [INFO][4796] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.113 [INFO][4796] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.99.135/26] block=192.168.99.128/26 handle="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.113 [INFO][4796] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.135/26] handle="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.113 [INFO][4796] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 7 00:20:05.181374 containerd[1548]: 2025-07-07 00:20:05.114 [INFO][4796] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.135/26] IPv6=[] ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.182010 containerd[1548]: 2025-07-07 00:20:05.125 [INFO][4776] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0", GenerateName:"calico-apiserver-d455968b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d455968b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"calico-apiserver-d455968b6-d2496", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calied033809da1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 7 00:20:05.182010 containerd[1548]: 2025-07-07 00:20:05.125 [INFO][4776] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.135/32] ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.182010 containerd[1548]: 2025-07-07 00:20:05.125 [INFO][4776] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calied033809da1 ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.182010 containerd[1548]: 2025-07-07 00:20:05.135 [INFO][4776] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.182010 containerd[1548]: 2025-07-07 00:20:05.141 [INFO][4776] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0", GenerateName:"calico-apiserver-d455968b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"d455968b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80", Pod:"calico-apiserver-d455968b6-d2496", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calied033809da1", MAC:"b6:63:c4:ed:18:04", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 7 00:20:05.182010 containerd[1548]: 2025-07-07 00:20:05.172 [INFO][4776] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Namespace="calico-apiserver" Pod="calico-apiserver-d455968b6-d2496" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:05.207367 systemd[1]: Started cri-containerd-58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3.scope - libcontainer container 58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3.
Jul 7 00:20:05.245416 containerd[1548]: time="2025-07-07T00:20:05.245364754Z" level=info msg="connecting to shim eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" address="unix:///run/containerd/s/f050644083d6161a5fdb9f5edd36f049653cb6346713755069d125289e82cef3" namespace=k8s.io protocol=ttrpc version=3
Jul 7 00:20:05.295370 systemd-networkd[1414]: calib3413a3f4a1: Link UP
Jul 7 00:20:05.302585 systemd-networkd[1414]: calib3413a3f4a1: Gained carrier
Jul 7 00:20:05.336172 systemd[1]: Started cri-containerd-eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80.scope - libcontainer container eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80.
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:04.809 [INFO][4758] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:04.855 [INFO][4758] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0 goldmane-58fd7646b9- calico-system 3f991e28-7349-433d-8fb6-e8dea1fb5652 857 0 2025-07-07 00:19:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a goldmane-58fd7646b9-f5hxb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib3413a3f4a1 [] [] }} ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:04.855 [INFO][4758] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:04.944 [INFO][4812] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" HandleID="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Workload="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:04.944 [INFO][4812] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" HandleID="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Workload="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400032af70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-1-1232b7205a", "pod":"goldmane-58fd7646b9-f5hxb", "timestamp":"2025-07-07 00:20:04.944129854 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:04.944 [INFO][4812] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.113 [INFO][4812] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.114 [INFO][4812] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a'
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.150 [INFO][4812] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.175 [INFO][4812] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.188 [INFO][4812] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.193 [INFO][4812] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.204 [INFO][4812] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.204 [INFO][4812] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.214 [INFO][4812] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.230 [INFO][4812] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" host="ci-4344-1-1-1-1232b7205a"
Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.254 [INFO][4812] ipam/ipam.go 1256: Successfully
claimed IPs: [192.168.99.136/26] block=192.168.99.128/26 handle="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.254 [INFO][4812] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.136/26] handle="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.254 [INFO][4812] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:20:05.341753 containerd[1548]: 2025-07-07 00:20:05.257 [INFO][4812] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.136/26] IPv6=[] ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" HandleID="k8s-pod-network.4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Workload="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" Jul 7 00:20:05.342403 containerd[1548]: 2025-07-07 00:20:05.267 [INFO][4758] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"3f991e28-7349-433d-8fb6-e8dea1fb5652", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"goldmane-58fd7646b9-f5hxb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.99.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3413a3f4a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:05.342403 containerd[1548]: 2025-07-07 00:20:05.269 [INFO][4758] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.136/32] ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" Jul 7 00:20:05.342403 containerd[1548]: 2025-07-07 00:20:05.269 [INFO][4758] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3413a3f4a1 ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" Jul 7 00:20:05.342403 containerd[1548]: 2025-07-07 00:20:05.304 [INFO][4758] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" Jul 7 00:20:05.342403 containerd[1548]: 2025-07-07 00:20:05.308 [INFO][4758] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"3f991e28-7349-433d-8fb6-e8dea1fb5652", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc", Pod:"goldmane-58fd7646b9-f5hxb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.99.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3413a3f4a1", MAC:"e2:2e:a3:48:07:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:05.342403 containerd[1548]: 2025-07-07 00:20:05.332 [INFO][4758] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" Namespace="calico-system" Pod="goldmane-58fd7646b9-f5hxb" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-goldmane--58fd7646b9--f5hxb-eth0" Jul 7 00:20:05.379319 containerd[1548]: time="2025-07-07T00:20:05.378278601Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hkcx5,Uid:c25c4d92-5234-44bc-85a4-f19c76ed4997,Namespace:kube-system,Attempt:0,} returns sandbox id \"58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3\"" Jul 7 00:20:05.392497 containerd[1548]: time="2025-07-07T00:20:05.392441323Z" level=info msg="CreateContainer within sandbox \"58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 7 00:20:05.407382 containerd[1548]: time="2025-07-07T00:20:05.406102721Z" level=info msg="connecting to shim 4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc" address="unix:///run/containerd/s/0fd236fea6f8b994743e53c393fecd308943cd1f7fd223aa9d924ac84cbd3fdc" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:20:05.423731 containerd[1548]: time="2025-07-07T00:20:05.423676263Z" level=info msg="Container 87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:20:05.447558 containerd[1548]: time="2025-07-07T00:20:05.447354479Z" level=info msg="CreateContainer within sandbox \"58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7\"" Jul 7 00:20:05.450339 containerd[1548]: time="2025-07-07T00:20:05.450284216Z" level=info msg="StartContainer for \"87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7\"" Jul 7 00:20:05.460494 containerd[1548]: time="2025-07-07T00:20:05.460453235Z" level=info msg="connecting to shim 
87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7" address="unix:///run/containerd/s/bb388429a03d96a43e8705c9507ccff7689b2da49849c0891424191d97d60ef8" protocol=ttrpc version=3 Jul 7 00:20:05.493410 kubelet[2859]: I0707 00:20:05.493090 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:20:05.499710 systemd[1]: Started cri-containerd-4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc.scope - libcontainer container 4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc. Jul 7 00:20:05.555723 systemd[1]: Started cri-containerd-87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7.scope - libcontainer container 87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7. Jul 7 00:20:05.594975 containerd[1548]: time="2025-07-07T00:20:05.594742730Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-d455968b6-d2496,Uid:f13c4783-fce7-4cf1-ab5f-5ba43426f9fb,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\"" Jul 7 00:20:05.652200 containerd[1548]: time="2025-07-07T00:20:05.652136781Z" level=info msg="StartContainer for \"87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7\" returns successfully" Jul 7 00:20:05.747398 containerd[1548]: time="2025-07-07T00:20:05.747348650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7665774675-4v9q6,Uid:b9d9ac66-6750-4558-9996-a65b2992f683,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:20:05.798213 containerd[1548]: time="2025-07-07T00:20:05.798154383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-f5hxb,Uid:3f991e28-7349-433d-8fb6-e8dea1fb5652,Namespace:calico-system,Attempt:0,} returns sandbox id \"4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc\"" Jul 7 00:20:06.079768 systemd-networkd[1414]: calic1b81b0d89c: Link UP Jul 7 00:20:06.081697 
systemd-networkd[1414]: calic1b81b0d89c: Gained carrier Jul 7 00:20:06.117962 kubelet[2859]: I0707 00:20:06.117872 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-hkcx5" podStartSLOduration=44.117852142 podStartE2EDuration="44.117852142s" podCreationTimestamp="2025-07-07 00:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:20:06.097948188 +0000 UTC m=+48.504124796" watchObservedRunningTime="2025-07-07 00:20:06.117852142 +0000 UTC m=+48.524028750" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.842 [INFO][5032] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.887 [INFO][5032] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0 calico-apiserver-7665774675- calico-apiserver b9d9ac66-6750-4558-9996-a65b2992f683 847 0 2025-07-07 00:19:37 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7665774675 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a calico-apiserver-7665774675-4v9q6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic1b81b0d89c [] [] }} ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.888 [INFO][5032] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.954 [INFO][5044] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" HandleID="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.954 [INFO][5044] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" HandleID="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d39b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-1-1232b7205a", "pod":"calico-apiserver-7665774675-4v9q6", "timestamp":"2025-07-07 00:20:05.954535325 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.954 [INFO][5044] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.955 [INFO][5044] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.955 [INFO][5044] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a' Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.975 [INFO][5044] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:05.990 [INFO][5044] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.007 [INFO][5044] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.015 [INFO][5044] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.022 [INFO][5044] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.022 [INFO][5044] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.028 [INFO][5044] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.043 [INFO][5044] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.062 [INFO][5044] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.99.137/26] block=192.168.99.128/26 handle="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.062 [INFO][5044] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.137/26] handle="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.062 [INFO][5044] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:20:06.127302 containerd[1548]: 2025-07-07 00:20:06.062 [INFO][5044] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.137/26] IPv6=[] ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" HandleID="k8s-pod-network.45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.129335 containerd[1548]: 2025-07-07 00:20:06.067 [INFO][5032] cni-plugin/k8s.go 418: Populated endpoint ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0", GenerateName:"calico-apiserver-7665774675-", Namespace:"calico-apiserver", SelfLink:"", UID:"b9d9ac66-6750-4558-9996-a65b2992f683", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7665774675", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"calico-apiserver-7665774675-4v9q6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic1b81b0d89c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:06.129335 containerd[1548]: 2025-07-07 00:20:06.068 [INFO][5032] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.137/32] ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.129335 containerd[1548]: 2025-07-07 00:20:06.068 [INFO][5032] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic1b81b0d89c ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.129335 containerd[1548]: 2025-07-07 00:20:06.084 [INFO][5032] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" 
WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.129335 containerd[1548]: 2025-07-07 00:20:06.087 [INFO][5032] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0", GenerateName:"calico-apiserver-7665774675-", Namespace:"calico-apiserver", SelfLink:"", UID:"b9d9ac66-6750-4558-9996-a65b2992f683", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 19, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7665774675", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d", Pod:"calico-apiserver-7665774675-4v9q6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic1b81b0d89c", MAC:"62:e4:8b:bd:76:f1", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:06.129335 containerd[1548]: 2025-07-07 00:20:06.119 [INFO][5032] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-4v9q6" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--4v9q6-eth0" Jul 7 00:20:06.195247 containerd[1548]: time="2025-07-07T00:20:06.194433340Z" level=info msg="connecting to shim 45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d" address="unix:///run/containerd/s/177a6ab064439bb5b35a4f7d8208eb5a8d2428e4723c574feefc3cc9b0c5e394" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:20:06.214630 systemd-networkd[1414]: cali6e35759f466: Gained IPv6LL Jul 7 00:20:06.270012 systemd[1]: Started cri-containerd-45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d.scope - libcontainer container 45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d. 
Jul 7 00:20:06.483125 containerd[1548]: time="2025-07-07T00:20:06.483067991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7665774675-4v9q6,Uid:b9d9ac66-6750-4558-9996-a65b2992f683,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d\"" Jul 7 00:20:06.697244 containerd[1548]: time="2025-07-07T00:20:06.697187497Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:20:06.699284 containerd[1548]: time="2025-07-07T00:20:06.699245068Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Jul 7 00:20:06.700128 containerd[1548]: time="2025-07-07T00:20:06.700097233Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:20:06.703798 containerd[1548]: time="2025-07-07T00:20:06.703757454Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 4.341074815s" Jul 7 00:20:06.703876 containerd[1548]: time="2025-07-07T00:20:06.703801614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Jul 7 00:20:06.704007 containerd[1548]: time="2025-07-07T00:20:06.703975735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:20:06.705352 containerd[1548]: time="2025-07-07T00:20:06.705318103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 7 00:20:06.722884 containerd[1548]: time="2025-07-07T00:20:06.722838603Z" level=info msg="CreateContainer within sandbox \"b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 7 00:20:06.730458 containerd[1548]: time="2025-07-07T00:20:06.730405447Z" level=info msg="Container 5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:20:06.755241 containerd[1548]: time="2025-07-07T00:20:06.755118308Z" level=info msg="CreateContainer within sandbox \"b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\"" Jul 7 00:20:06.758026 containerd[1548]: time="2025-07-07T00:20:06.757911804Z" level=info msg="StartContainer for \"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\"" Jul 7 00:20:06.762253 containerd[1548]: time="2025-07-07T00:20:06.762213069Z" level=info msg="connecting to shim 5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c" address="unix:///run/containerd/s/6dd83b4432a666f414e9dd6624f3646ca1e7df322078b244c3c11ea8208234c7" protocol=ttrpc version=3 Jul 7 00:20:06.790658 systemd-networkd[1414]: calib3413a3f4a1: Gained IPv6LL Jul 7 00:20:06.796820 systemd[1]: Started cri-containerd-5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c.scope - libcontainer container 5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c. 
Jul 7 00:20:06.873851 containerd[1548]: time="2025-07-07T00:20:06.873616866Z" level=info msg="StartContainer for \"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" returns successfully"
Jul 7 00:20:06.960357 systemd-networkd[1414]: vxlan.calico: Link UP
Jul 7 00:20:06.960366 systemd-networkd[1414]: vxlan.calico: Gained carrier
Jul 7 00:20:07.170762 kubelet[2859]: I0707 00:20:07.170688 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-59b677dccf-swp4r" podStartSLOduration=21.826741687 podStartE2EDuration="26.170669038s" podCreationTimestamp="2025-07-07 00:19:41 +0000 UTC" firstStartedPulling="2025-07-07 00:20:02.361219311 +0000 UTC m=+44.767395919" lastFinishedPulling="2025-07-07 00:20:06.705146582 +0000 UTC m=+49.111323270" observedRunningTime="2025-07-07 00:20:07.132207139 +0000 UTC m=+49.538383747" watchObservedRunningTime="2025-07-07 00:20:07.170669038 +0000 UTC m=+49.576845646"
Jul 7 00:20:07.174769 systemd-networkd[1414]: calied033809da1: Gained IPv6LL
Jul 7 00:20:07.235110 containerd[1548]: time="2025-07-07T00:20:07.235054723Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"b8b4c7cd61fad3bd222b1f45b2ef3c34c435bc828a2e6d649bdb82fd4a646a02\" pid:5231 exit_status:1 exited_at:{seconds:1751847607 nanos:234371679}"
Jul 7 00:20:08.134778 systemd-networkd[1414]: calic1b81b0d89c: Gained IPv6LL
Jul 7 00:20:08.138000 containerd[1548]: time="2025-07-07T00:20:08.137915681Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"d13044b0687ad7963db18b8d0e41c5159c218b485d3875a2d2517a529c9c872e\" pid:5305 exited_at:{seconds:1751847608 nanos:134706423}"
Jul 7 00:20:08.399963 containerd[1548]: time="2025-07-07T00:20:08.399786836Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:08.401467 containerd[1548]: time="2025-07-07T00:20:08.401348685Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702"
Jul 7 00:20:08.402675 containerd[1548]: time="2025-07-07T00:20:08.402570252Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:08.406689 containerd[1548]: time="2025-07-07T00:20:08.406600754Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:08.409543 containerd[1548]: time="2025-07-07T00:20:08.408989168Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.703631345s"
Jul 7 00:20:08.409543 containerd[1548]: time="2025-07-07T00:20:08.409039248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\""
Jul 7 00:20:08.410708 containerd[1548]: time="2025-07-07T00:20:08.410668457Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\""
Jul 7 00:20:08.414218 containerd[1548]: time="2025-07-07T00:20:08.414161637Z" level=info msg="CreateContainer within sandbox \"9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Jul 7 00:20:08.429974 containerd[1548]: time="2025-07-07T00:20:08.429904046Z" level=info msg="Container 1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:20:08.461877 containerd[1548]: time="2025-07-07T00:20:08.461730145Z" level=info msg="CreateContainer within sandbox \"9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7\""
Jul 7 00:20:08.465961 containerd[1548]: time="2025-07-07T00:20:08.465579607Z" level=info msg="StartContainer for \"1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7\""
Jul 7 00:20:08.468816 containerd[1548]: time="2025-07-07T00:20:08.468772825Z" level=info msg="connecting to shim 1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7" address="unix:///run/containerd/s/c1d679d209ca5ad11b765616a5b0126c1b4e6028ae3a5997e655f0dbdf3401fc" protocol=ttrpc version=3
Jul 7 00:20:08.497840 systemd[1]: Started cri-containerd-1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7.scope - libcontainer container 1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7.
Jul 7 00:20:08.553920 containerd[1548]: time="2025-07-07T00:20:08.553784583Z" level=info msg="StartContainer for \"1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7\" returns successfully"
Jul 7 00:20:08.838827 systemd-networkd[1414]: vxlan.calico: Gained IPv6LL
Jul 7 00:20:11.578016 containerd[1548]: time="2025-07-07T00:20:11.577896653Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:11.580110 containerd[1548]: time="2025-07-07T00:20:11.580059625Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149"
Jul 7 00:20:11.581962 containerd[1548]: time="2025-07-07T00:20:11.581865955Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:11.585037 containerd[1548]: time="2025-07-07T00:20:11.584990972Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:11.585863 containerd[1548]: time="2025-07-07T00:20:11.585778016Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 3.175068878s"
Jul 7 00:20:11.585863 containerd[1548]: time="2025-07-07T00:20:11.585850336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\""
Jul 7 00:20:11.589538 containerd[1548]: time="2025-07-07T00:20:11.589254395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\""
Jul 7 00:20:11.592644 containerd[1548]: time="2025-07-07T00:20:11.592107291Z" level=info msg="CreateContainer within sandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jul 7 00:20:11.604186 containerd[1548]: time="2025-07-07T00:20:11.604130117Z" level=info msg="Container 4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:20:11.614115 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount938787692.mount: Deactivated successfully.
Jul 7 00:20:11.623219 containerd[1548]: time="2025-07-07T00:20:11.623165742Z" level=info msg="CreateContainer within sandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\""
Jul 7 00:20:11.624646 containerd[1548]: time="2025-07-07T00:20:11.624606230Z" level=info msg="StartContainer for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\""
Jul 7 00:20:11.626545 containerd[1548]: time="2025-07-07T00:20:11.626458920Z" level=info msg="connecting to shim 4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4" address="unix:///run/containerd/s/217cb38f65f4bf8f71386f52e9e264ef59302207d2c640a1c39acc85be208eb6" protocol=ttrpc version=3
Jul 7 00:20:11.656730 systemd[1]: Started cri-containerd-4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4.scope - libcontainer container 4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4.
Jul 7 00:20:11.738963 containerd[1548]: time="2025-07-07T00:20:11.738750058Z" level=info msg="StartContainer for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" returns successfully"
Jul 7 00:20:11.970436 containerd[1548]: time="2025-07-07T00:20:11.970262933Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:11.978190 containerd[1548]: time="2025-07-07T00:20:11.976406327Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77"
Jul 7 00:20:11.979360 containerd[1548]: time="2025-07-07T00:20:11.979323703Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 389.599305ms"
Jul 7 00:20:11.979464 containerd[1548]: time="2025-07-07T00:20:11.979448824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\""
Jul 7 00:20:11.981984 containerd[1548]: time="2025-07-07T00:20:11.981949717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\""
Jul 7 00:20:11.983221 containerd[1548]: time="2025-07-07T00:20:11.983190004Z" level=info msg="CreateContainer within sandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jul 7 00:20:11.998967 containerd[1548]: time="2025-07-07T00:20:11.998916251Z" level=info msg="Container d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:20:12.006123 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3460419736.mount: Deactivated successfully.
Jul 7 00:20:12.012521 containerd[1548]: time="2025-07-07T00:20:12.012455445Z" level=info msg="CreateContainer within sandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\""
Jul 7 00:20:12.015359 containerd[1548]: time="2025-07-07T00:20:12.014613377Z" level=info msg="StartContainer for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\""
Jul 7 00:20:12.017571 containerd[1548]: time="2025-07-07T00:20:12.017525713Z" level=info msg="connecting to shim d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983" address="unix:///run/containerd/s/f050644083d6161a5fdb9f5edd36f049653cb6346713755069d125289e82cef3" protocol=ttrpc version=3
Jul 7 00:20:12.045763 systemd[1]: Started cri-containerd-d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983.scope - libcontainer container d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983.
Jul 7 00:20:12.142443 containerd[1548]: time="2025-07-07T00:20:12.142399275Z" level=info msg="StartContainer for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" returns successfully"
Jul 7 00:20:12.145782 kubelet[2859]: I0707 00:20:12.145714 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-d455968b6-7vdfw" podStartSLOduration=28.877135031999998 podStartE2EDuration="37.145696933s" podCreationTimestamp="2025-07-07 00:19:35 +0000 UTC" firstStartedPulling="2025-07-07 00:20:03.319659969 +0000 UTC m=+45.725836577" lastFinishedPulling="2025-07-07 00:20:11.58822187 +0000 UTC m=+53.994398478" observedRunningTime="2025-07-07 00:20:12.144550127 +0000 UTC m=+54.550726735" watchObservedRunningTime="2025-07-07 00:20:12.145696933 +0000 UTC m=+54.551873541"
Jul 7 00:20:13.145117 kubelet[2859]: I0707 00:20:13.145021 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-d455968b6-d2496" podStartSLOduration=31.763812358 podStartE2EDuration="38.144998751s" podCreationTimestamp="2025-07-07 00:19:35 +0000 UTC" firstStartedPulling="2025-07-07 00:20:05.599134195 +0000 UTC m=+48.005310803" lastFinishedPulling="2025-07-07 00:20:11.980320588 +0000 UTC m=+54.386497196" observedRunningTime="2025-07-07 00:20:13.144692309 +0000 UTC m=+55.550868917" watchObservedRunningTime="2025-07-07 00:20:13.144998751 +0000 UTC m=+55.551175319"
Jul 7 00:20:14.130473 kubelet[2859]: I0707 00:20:14.130430 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jul 7 00:20:15.027131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount226472298.mount: Deactivated successfully.
Jul 7 00:20:15.636636 containerd[1548]: time="2025-07-07T00:20:15.636484154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:15.638996 containerd[1548]: time="2025-07-07T00:20:15.638884167Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790"
Jul 7 00:20:15.640194 containerd[1548]: time="2025-07-07T00:20:15.640112654Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:15.643512 containerd[1548]: time="2025-07-07T00:20:15.643415991Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:15.645042 containerd[1548]: time="2025-07-07T00:20:15.644865399Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 3.662073597s"
Jul 7 00:20:15.645042 containerd[1548]: time="2025-07-07T00:20:15.644914559Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\""
Jul 7 00:20:15.647014 containerd[1548]: time="2025-07-07T00:20:15.646722809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\""
Jul 7 00:20:15.653938 containerd[1548]: time="2025-07-07T00:20:15.653027483Z" level=info msg="CreateContainer within sandbox \"4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Jul 7 00:20:15.665963 containerd[1548]: time="2025-07-07T00:20:15.665892552Z" level=info msg="Container 5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:20:15.672119 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1145382465.mount: Deactivated successfully.
Jul 7 00:20:15.681541 containerd[1548]: time="2025-07-07T00:20:15.681379795Z" level=info msg="CreateContainer within sandbox \"4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\""
Jul 7 00:20:15.685836 containerd[1548]: time="2025-07-07T00:20:15.685780858Z" level=info msg="StartContainer for \"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\""
Jul 7 00:20:15.687481 containerd[1548]: time="2025-07-07T00:20:15.687439587Z" level=info msg="connecting to shim 5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a" address="unix:///run/containerd/s/0fd236fea6f8b994743e53c393fecd308943cd1f7fd223aa9d924ac84cbd3fdc" protocol=ttrpc version=3
Jul 7 00:20:15.708807 systemd[1]: Started cri-containerd-5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a.scope - libcontainer container 5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a.
Jul 7 00:20:15.772209 containerd[1548]: time="2025-07-07T00:20:15.772160721Z" level=info msg="StartContainer for \"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" returns successfully"
Jul 7 00:20:16.044624 containerd[1548]: time="2025-07-07T00:20:16.042778089Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:16.044624 containerd[1548]: time="2025-07-07T00:20:16.043706454Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77"
Jul 7 00:20:16.047838 containerd[1548]: time="2025-07-07T00:20:16.047791476Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 401.021947ms"
Jul 7 00:20:16.048068 containerd[1548]: time="2025-07-07T00:20:16.048045677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\""
Jul 7 00:20:16.049835 containerd[1548]: time="2025-07-07T00:20:16.049788726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\""
Jul 7 00:20:16.051366 containerd[1548]: time="2025-07-07T00:20:16.051326815Z" level=info msg="CreateContainer within sandbox \"45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jul 7 00:20:16.064775 containerd[1548]: time="2025-07-07T00:20:16.064714646Z" level=info msg="Container d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:20:16.069488 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1278144667.mount: Deactivated successfully.
Jul 7 00:20:16.079975 containerd[1548]: time="2025-07-07T00:20:16.079791006Z" level=info msg="CreateContainer within sandbox \"45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb\""
Jul 7 00:20:16.081039 containerd[1548]: time="2025-07-07T00:20:16.080999333Z" level=info msg="StartContainer for \"d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb\""
Jul 7 00:20:16.082946 containerd[1548]: time="2025-07-07T00:20:16.082873023Z" level=info msg="connecting to shim d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb" address="unix:///run/containerd/s/177a6ab064439bb5b35a4f7d8208eb5a8d2428e4723c574feefc3cc9b0c5e394" protocol=ttrpc version=3
Jul 7 00:20:16.113775 systemd[1]: Started cri-containerd-d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb.scope - libcontainer container d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb.
Jul 7 00:20:16.203272 containerd[1548]: time="2025-07-07T00:20:16.203188543Z" level=info msg="StartContainer for \"d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb\" returns successfully"
Jul 7 00:20:16.323042 containerd[1548]: time="2025-07-07T00:20:16.322997941Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"a3dae413047ef5fcdff0ee1dbbc79bf54c4d60088149c1edcbc6488759437783\" pid:5522 exited_at:{seconds:1751847616 nanos:321222451}"
Jul 7 00:20:16.344043 kubelet[2859]: I0707 00:20:16.343218 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-f5hxb" podStartSLOduration=26.501316141 podStartE2EDuration="36.343196048s" podCreationTimestamp="2025-07-07 00:19:40 +0000 UTC" firstStartedPulling="2025-07-07 00:20:05.804687581 +0000 UTC m=+48.210864149" lastFinishedPulling="2025-07-07 00:20:15.646567408 +0000 UTC m=+58.052744056" observedRunningTime="2025-07-07 00:20:16.174606191 +0000 UTC m=+58.580782799" watchObservedRunningTime="2025-07-07 00:20:16.343196048 +0000 UTC m=+58.749372656"
Jul 7 00:20:18.101447 containerd[1548]: time="2025-07-07T00:20:18.101388765Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:18.103413 containerd[1548]: time="2025-07-07T00:20:18.103342776Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366"
Jul 7 00:20:18.106609 containerd[1548]: time="2025-07-07T00:20:18.104682263Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:18.111496 containerd[1548]: time="2025-07-07T00:20:18.111439098Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 7 00:20:18.114547 containerd[1548]: time="2025-07-07T00:20:18.114482234Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 2.064472586s"
Jul 7 00:20:18.114834 containerd[1548]: time="2025-07-07T00:20:18.114713955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\""
Jul 7 00:20:18.118604 containerd[1548]: time="2025-07-07T00:20:18.118564816Z" level=info msg="CreateContainer within sandbox \"9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Jul 7 00:20:18.135022 containerd[1548]: time="2025-07-07T00:20:18.134963542Z" level=info msg="Container bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:20:18.154427 containerd[1548]: time="2025-07-07T00:20:18.154374044Z" level=info msg="CreateContainer within sandbox \"9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1\""
Jul 7 00:20:18.159711 containerd[1548]: time="2025-07-07T00:20:18.159618951Z" level=info msg="StartContainer for \"bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1\""
Jul 7 00:20:18.168025 containerd[1548]: time="2025-07-07T00:20:18.167965555Z" level=info msg="connecting to shim bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1" address="unix:///run/containerd/s/c1d679d209ca5ad11b765616a5b0126c1b4e6028ae3a5997e655f0dbdf3401fc" protocol=ttrpc version=3
Jul 7 00:20:18.171136 kubelet[2859]: I0707 00:20:18.171008 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jul 7 00:20:18.197899 systemd[1]: Started cri-containerd-bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1.scope - libcontainer container bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1.
Jul 7 00:20:18.246387 containerd[1548]: time="2025-07-07T00:20:18.246270087Z" level=info msg="StartContainer for \"bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1\" returns successfully"
Jul 7 00:20:18.875551 kubelet[2859]: I0707 00:20:18.875389 2859 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Jul 7 00:20:18.875551 kubelet[2859]: I0707 00:20:18.875466 2859 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Jul 7 00:20:19.203067 kubelet[2859]: I0707 00:20:19.202080 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7665774675-4v9q6" podStartSLOduration=32.638811109 podStartE2EDuration="42.202054304s" podCreationTimestamp="2025-07-07 00:19:37 +0000 UTC" firstStartedPulling="2025-07-07 00:20:06.485721487 +0000 UTC m=+48.891898095" lastFinishedPulling="2025-07-07 00:20:16.048964682 +0000 UTC m=+58.455141290" observedRunningTime="2025-07-07 00:20:17.177916645 +0000 UTC m=+59.584093253" watchObservedRunningTime="2025-07-07 00:20:19.202054304 +0000 UTC m=+61.608230992"
Jul 7 00:20:19.461213 containerd[1548]: time="2025-07-07T00:20:19.461065338Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"0bdba61ef799341b569f5ca5f13f760369733a940b33cb9ce8b42a951715312d\" pid:5611 exited_at:{seconds:1751847619 nanos:459397569}"
Jul 7 00:20:19.606933 containerd[1548]: time="2025-07-07T00:20:19.606874220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"494010b09027edf3e8cfae9339e923818d0d989acbd91a2fbcf131c669b72e5e\" pid:5631 exited_at:{seconds:1751847619 nanos:605446812}"
Jul 7 00:20:25.509354 containerd[1548]: time="2025-07-07T00:20:25.509288177Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"7a7777210849ea34b222913d1f751b0972bba7f3f6a52ba43be7dcb954f40a59\" pid:5657 exited_at:{seconds:1751847625 nanos:508742575}"
Jul 7 00:20:25.542369 kubelet[2859]: I0707 00:20:25.541041 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8kl6f" podStartSLOduration=29.587748023 podStartE2EDuration="44.541016098s" podCreationTimestamp="2025-07-07 00:19:41 +0000 UTC" firstStartedPulling="2025-07-07 00:20:03.162269645 +0000 UTC m=+45.568446253" lastFinishedPulling="2025-07-07 00:20:18.11553772 +0000 UTC m=+60.521714328" observedRunningTime="2025-07-07 00:20:19.20508892 +0000 UTC m=+61.611265528" watchObservedRunningTime="2025-07-07 00:20:25.541016098 +0000 UTC m=+67.947192866"
Jul 7 00:20:32.322112 kubelet[2859]: I0707 00:20:32.322063 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jul 7 00:20:32.415454 kubelet[2859]: I0707 00:20:32.415390 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jul 7 00:20:32.417643 containerd[1548]: time="2025-07-07T00:20:32.417597022Z" level=info msg="StopContainer for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" with timeout 30 (s)"
Jul 7 00:20:32.419551 containerd[1548]: time="2025-07-07T00:20:32.419467071Z" level=info msg="Stop container \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" with signal terminated"
Jul 7 00:20:32.463770 systemd[1]: cri-containerd-d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983.scope: Deactivated successfully.
Jul 7 00:20:32.464128 systemd[1]: cri-containerd-d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983.scope: Consumed 1.451s CPU time, 43.7M memory peak.
Jul 7 00:20:32.475571 containerd[1548]: time="2025-07-07T00:20:32.475182384Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" id:\"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" pid:5405 exit_status:1 exited_at:{seconds:1751847632 nanos:474562421}"
Jul 7 00:20:32.475571 containerd[1548]: time="2025-07-07T00:20:32.475232944Z" level=info msg="received exit event container_id:\"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" id:\"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" pid:5405 exit_status:1 exited_at:{seconds:1751847632 nanos:474562421}"
Jul 7 00:20:32.491987 systemd[1]: Created slice kubepods-besteffort-pod6c8b8162_5805_4051_b269_0d39b83bfb47.slice - libcontainer container kubepods-besteffort-pod6c8b8162_5805_4051_b269_0d39b83bfb47.slice.
Jul 7 00:20:32.525014 kubelet[2859]: I0707 00:20:32.524297 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6c8b8162-5805-4051-b269-0d39b83bfb47-calico-apiserver-certs\") pod \"calico-apiserver-7665774675-bpzsw\" (UID: \"6c8b8162-5805-4051-b269-0d39b83bfb47\") " pod="calico-apiserver/calico-apiserver-7665774675-bpzsw"
Jul 7 00:20:32.525014 kubelet[2859]: I0707 00:20:32.524572 2859 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89vlc\" (UniqueName: \"kubernetes.io/projected/6c8b8162-5805-4051-b269-0d39b83bfb47-kube-api-access-89vlc\") pod \"calico-apiserver-7665774675-bpzsw\" (UID: \"6c8b8162-5805-4051-b269-0d39b83bfb47\") " pod="calico-apiserver/calico-apiserver-7665774675-bpzsw"
Jul 7 00:20:32.537139 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983-rootfs.mount: Deactivated successfully.
Jul 7 00:20:32.657968 containerd[1548]: time="2025-07-07T00:20:32.657819398Z" level=info msg="StopContainer for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" returns successfully"
Jul 7 00:20:32.659198 containerd[1548]: time="2025-07-07T00:20:32.659155085Z" level=info msg="StopPodSandbox for \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\""
Jul 7 00:20:32.659460 containerd[1548]: time="2025-07-07T00:20:32.659340206Z" level=info msg="Container to stop \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jul 7 00:20:32.672322 systemd[1]: cri-containerd-eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80.scope: Deactivated successfully.
Jul 7 00:20:32.678880 containerd[1548]: time="2025-07-07T00:20:32.678766781Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" id:\"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" pid:4926 exit_status:137 exited_at:{seconds:1751847632 nanos:677159413}"
Jul 7 00:20:32.724615 containerd[1548]: time="2025-07-07T00:20:32.724565445Z" level=info msg="shim disconnected" id=eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80 namespace=k8s.io
Jul 7 00:20:32.724746 containerd[1548]: time="2025-07-07T00:20:32.724612445Z" level=warning msg="cleaning up after shim disconnected" id=eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80 namespace=k8s.io
Jul 7 00:20:32.724746 containerd[1548]: time="2025-07-07T00:20:32.724669126Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jul 7 00:20:32.725983 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80-rootfs.mount: Deactivated successfully.
Jul 7 00:20:32.768445 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80-shm.mount: Deactivated successfully.
Jul 7 00:20:32.775110 containerd[1548]: time="2025-07-07T00:20:32.774960972Z" level=info msg="received exit event sandbox_id:\"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" exit_status:137 exited_at:{seconds:1751847632 nanos:677159413}"
Jul 7 00:20:32.809439 containerd[1548]: time="2025-07-07T00:20:32.809102459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7665774675-bpzsw,Uid:6c8b8162-5805-4051-b269-0d39b83bfb47,Namespace:calico-apiserver,Attempt:0,}"
Jul 7 00:20:32.873561 systemd-networkd[1414]: calied033809da1: Link DOWN
Jul 7 00:20:32.873568 systemd-networkd[1414]: calied033809da1: Lost carrier
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.870 [INFO][5748] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.870 [INFO][5748] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" iface="eth0" netns="/var/run/netns/cni-5e8044ba-0546-86b7-0acd-41d295e5629c"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.872 [INFO][5748] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" iface="eth0" netns="/var/run/netns/cni-5e8044ba-0546-86b7-0acd-41d295e5629c"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.880 [INFO][5748] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" after=9.519847ms iface="eth0" netns="/var/run/netns/cni-5e8044ba-0546-86b7-0acd-41d295e5629c"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.880 [INFO][5748] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.880 [INFO][5748] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.933 [INFO][5766] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.933 [INFO][5766] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:32.934 [INFO][5766] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:33.020 [INFO][5766] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:33.020 [INFO][5766] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0"
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:33.022 [INFO][5766] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 7 00:20:33.028584 containerd[1548]: 2025-07-07 00:20:33.026 [INFO][5748] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80"
Jul 7 00:20:33.031127 containerd[1548]: time="2025-07-07T00:20:33.030779544Z" level=info msg="TearDown network for sandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" successfully"
Jul 7 00:20:33.031127 containerd[1548]: time="2025-07-07T00:20:33.030814864Z" level=info msg="StopPodSandbox for \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" returns successfully"
Jul 7 00:20:33.132703 kubelet[2859]: I0707 00:20:33.132660 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rm8c\" (UniqueName: \"kubernetes.io/projected/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-kube-api-access-9rm8c\") pod \"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb\" (UID: \"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb\") "
Jul 7 00:20:33.132879 kubelet[2859]: I0707 00:20:33.132735 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-calico-apiserver-certs\") pod \"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb\" (UID: \"f13c4783-fce7-4cf1-ab5f-5ba43426f9fb\") "
Jul 7 00:20:33.137149 systemd-networkd[1414]: caliabac0144578: Link UP
Jul 7 00:20:33.138208 systemd-networkd[1414]: caliabac0144578: Gained carrier
Jul 7 00:20:33.144545 kubelet[2859]: I0707 00:20:33.143302 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-kube-api-access-9rm8c" (OuterVolumeSpecName: "kube-api-access-9rm8c") pod "f13c4783-fce7-4cf1-ab5f-5ba43426f9fb" (UID: "f13c4783-fce7-4cf1-ab5f-5ba43426f9fb"). InnerVolumeSpecName "kube-api-access-9rm8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jul 7 00:20:33.145782 kubelet[2859]: I0707 00:20:33.145733 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "f13c4783-fce7-4cf1-ab5f-5ba43426f9fb" (UID: "f13c4783-fce7-4cf1-ab5f-5ba43426f9fb"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:32.887 [INFO][5754] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0 calico-apiserver-7665774675- calico-apiserver 6c8b8162-5805-4051-b269-0d39b83bfb47 1174 0 2025-07-07 00:20:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7665774675 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-1-1232b7205a calico-apiserver-7665774675-bpzsw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliabac0144578 [] [] }} ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-"
Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:32.889 [INFO][5754] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0"
Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07
00:20:32.954 [INFO][5774] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" HandleID="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:32.955 [INFO][5774] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" HandleID="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb9a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-1-1232b7205a", "pod":"calico-apiserver-7665774675-bpzsw", "timestamp":"2025-07-07 00:20:32.953164404 +0000 UTC"}, Hostname:"ci-4344-1-1-1-1232b7205a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:32.955 [INFO][5774] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.022 [INFO][5774] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.023 [INFO][5774] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-1-1232b7205a' Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.045 [INFO][5774] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.051 [INFO][5774] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.061 [INFO][5774] ipam/ipam.go 511: Trying affinity for 192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.067 [INFO][5774] ipam/ipam.go 158: Attempting to load block cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.083 [INFO][5774] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.99.128/26 host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.083 [INFO][5774] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.99.128/26 handle="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.089 [INFO][5774] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.112 [INFO][5774] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.99.128/26 handle="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.126 [INFO][5774] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.99.138/26] block=192.168.99.128/26 handle="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.127 [INFO][5774] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.99.138/26] handle="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" host="ci-4344-1-1-1-1232b7205a" Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.127 [INFO][5774] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:20:33.164636 containerd[1548]: 2025-07-07 00:20:33.127 [INFO][5774] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.99.138/26] IPv6=[] ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" HandleID="k8s-pod-network.9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" Jul 7 00:20:33.166350 containerd[1548]: 2025-07-07 00:20:33.131 [INFO][5754] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0", GenerateName:"calico-apiserver-7665774675-", Namespace:"calico-apiserver", SelfLink:"", UID:"6c8b8162-5805-4051-b269-0d39b83bfb47", ResourceVersion:"1174", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 20, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7665774675", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"", Pod:"calico-apiserver-7665774675-bpzsw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliabac0144578", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:33.166350 containerd[1548]: 2025-07-07 00:20:33.132 [INFO][5754] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.99.138/32] ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" Jul 7 00:20:33.166350 containerd[1548]: 2025-07-07 00:20:33.132 [INFO][5754] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliabac0144578 ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" Jul 7 00:20:33.166350 containerd[1548]: 2025-07-07 00:20:33.138 [INFO][5754] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" 
WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" Jul 7 00:20:33.166350 containerd[1548]: 2025-07-07 00:20:33.139 [INFO][5754] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0", GenerateName:"calico-apiserver-7665774675-", Namespace:"calico-apiserver", SelfLink:"", UID:"6c8b8162-5805-4051-b269-0d39b83bfb47", ResourceVersion:"1174", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 20, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7665774675", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-1-1232b7205a", ContainerID:"9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec", Pod:"calico-apiserver-7665774675-bpzsw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.99.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliabac0144578", MAC:"b2:b0:88:d3:4b:b5", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:20:33.166350 containerd[1548]: 2025-07-07 00:20:33.159 [INFO][5754] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" Namespace="calico-apiserver" Pod="calico-apiserver-7665774675-bpzsw" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--7665774675--bpzsw-eth0" Jul 7 00:20:33.204876 containerd[1548]: time="2025-07-07T00:20:33.204815552Z" level=info msg="connecting to shim 9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec" address="unix:///run/containerd/s/2ec9c821cf389820bfc89f392e8e33b82d534d3dc710744dc168d50e0a33c71d" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:20:33.233105 kubelet[2859]: I0707 00:20:33.233037 2859 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-calico-apiserver-certs\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:20:33.233105 kubelet[2859]: I0707 00:20:33.233061 2859 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rm8c\" (UniqueName: \"kubernetes.io/projected/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb-kube-api-access-9rm8c\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:20:33.234709 systemd[1]: Started cri-containerd-9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec.scope - libcontainer container 9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec. 
Jul 7 00:20:33.240690 kubelet[2859]: I0707 00:20:33.240651 2859 scope.go:117] "RemoveContainer" containerID="d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983" Jul 7 00:20:33.244441 containerd[1548]: time="2025-07-07T00:20:33.244400786Z" level=info msg="RemoveContainer for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\"" Jul 7 00:20:33.253684 containerd[1548]: time="2025-07-07T00:20:33.253639191Z" level=info msg="RemoveContainer for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" returns successfully" Jul 7 00:20:33.254170 kubelet[2859]: I0707 00:20:33.254120 2859 scope.go:117] "RemoveContainer" containerID="d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983" Jul 7 00:20:33.255234 containerd[1548]: time="2025-07-07T00:20:33.255198358Z" level=error msg="ContainerStatus for \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\": not found" Jul 7 00:20:33.255882 kubelet[2859]: E0707 00:20:33.255631 2859 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\": not found" containerID="d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983" Jul 7 00:20:33.257417 kubelet[2859]: I0707 00:20:33.257248 2859 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983"} err="failed to get container status \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\": rpc error: code = NotFound desc = an error occurred when try to find container \"d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983\": not found" Jul 7 00:20:33.257875 systemd[1]: 
Removed slice kubepods-besteffort-podf13c4783_fce7_4cf1_ab5f_5ba43426f9fb.slice - libcontainer container kubepods-besteffort-podf13c4783_fce7_4cf1_ab5f_5ba43426f9fb.slice. Jul 7 00:20:33.258200 systemd[1]: kubepods-besteffort-podf13c4783_fce7_4cf1_ab5f_5ba43426f9fb.slice: Consumed 1.478s CPU time, 44.1M memory peak, 405K read from disk. Jul 7 00:20:33.313971 containerd[1548]: time="2025-07-07T00:20:33.313872284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7665774675-bpzsw,Uid:6c8b8162-5805-4051-b269-0d39b83bfb47,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec\"" Jul 7 00:20:33.319672 containerd[1548]: time="2025-07-07T00:20:33.319635552Z" level=info msg="CreateContainer within sandbox \"9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 7 00:20:33.339537 containerd[1548]: time="2025-07-07T00:20:33.339468009Z" level=info msg="Container b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:20:33.347021 containerd[1548]: time="2025-07-07T00:20:33.346839685Z" level=info msg="CreateContainer within sandbox \"9fe3f904a77ba7575a945de513a8b3cdb021b203bcfcd38244a9214f6d8f98ec\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e\"" Jul 7 00:20:33.348053 containerd[1548]: time="2025-07-07T00:20:33.347798650Z" level=info msg="StartContainer for \"b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e\"" Jul 7 00:20:33.349271 containerd[1548]: time="2025-07-07T00:20:33.349242537Z" level=info msg="connecting to shim b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e" address="unix:///run/containerd/s/2ec9c821cf389820bfc89f392e8e33b82d534d3dc710744dc168d50e0a33c71d" protocol=ttrpc version=3 Jul 7 
00:20:33.372439 systemd[1]: Started cri-containerd-b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e.scope - libcontainer container b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e. Jul 7 00:20:33.428443 containerd[1548]: time="2025-07-07T00:20:33.428378923Z" level=info msg="StartContainer for \"b8ed5202eee70c05d7ec498b1518f7a6ce91b7cae4c37130c01a9df8e0e88c0e\" returns successfully" Jul 7 00:20:33.542947 systemd[1]: run-netns-cni\x2d5e8044ba\x2d0546\x2d86b7\x2d0acd\x2d41d295e5629c.mount: Deactivated successfully. Jul 7 00:20:33.544867 systemd[1]: var-lib-kubelet-pods-f13c4783\x2dfce7\x2d4cf1\x2dab5f\x2d5ba43426f9fb-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9rm8c.mount: Deactivated successfully. Jul 7 00:20:33.544946 systemd[1]: var-lib-kubelet-pods-f13c4783\x2dfce7\x2d4cf1\x2dab5f\x2d5ba43426f9fb-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 7 00:20:33.745599 kubelet[2859]: I0707 00:20:33.745338 2859 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13c4783-fce7-4cf1-ab5f-5ba43426f9fb" path="/var/lib/kubelet/pods/f13c4783-fce7-4cf1-ab5f-5ba43426f9fb/volumes" Jul 7 00:20:34.375229 systemd-networkd[1414]: caliabac0144578: Gained IPv6LL Jul 7 00:20:36.252031 kubelet[2859]: I0707 00:20:36.251996 2859 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:20:36.326622 kubelet[2859]: I0707 00:20:36.326551 2859 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7665774675-bpzsw" podStartSLOduration=4.326530262 podStartE2EDuration="4.326530262s" podCreationTimestamp="2025-07-07 00:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:20:34.261938462 +0000 UTC m=+76.668115070" watchObservedRunningTime="2025-07-07 00:20:36.326530262 +0000 UTC m=+78.732706870" Jul 7 
00:20:36.390188 containerd[1548]: time="2025-07-07T00:20:36.389967677Z" level=info msg="StopContainer for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" with timeout 30 (s)" Jul 7 00:20:36.391354 containerd[1548]: time="2025-07-07T00:20:36.391112289Z" level=info msg="Stop container \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" with signal terminated" Jul 7 00:20:36.450666 systemd[1]: cri-containerd-4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4.scope: Deactivated successfully. Jul 7 00:20:36.451482 systemd[1]: cri-containerd-4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4.scope: Consumed 1.729s CPU time, 61.3M memory peak. Jul 7 00:20:36.457338 containerd[1548]: time="2025-07-07T00:20:36.457291934Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" id:\"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" pid:5370 exit_status:1 exited_at:{seconds:1751847636 nanos:456821969}" Jul 7 00:20:36.457677 containerd[1548]: time="2025-07-07T00:20:36.457447495Z" level=info msg="received exit event container_id:\"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" id:\"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" pid:5370 exit_status:1 exited_at:{seconds:1751847636 nanos:456821969}" Jul 7 00:20:36.498107 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4-rootfs.mount: Deactivated successfully. 
Jul 7 00:20:36.522576 containerd[1548]: time="2025-07-07T00:20:36.522405966Z" level=info msg="StopContainer for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" returns successfully" Jul 7 00:20:36.523236 containerd[1548]: time="2025-07-07T00:20:36.523196015Z" level=info msg="StopPodSandbox for \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\"" Jul 7 00:20:36.523305 containerd[1548]: time="2025-07-07T00:20:36.523261736Z" level=info msg="Container to stop \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 7 00:20:36.578555 containerd[1548]: time="2025-07-07T00:20:36.578483340Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" id:\"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" pid:4714 exit_status:137 exited_at:{seconds:1751847636 nanos:578128736}" Jul 7 00:20:36.579688 systemd[1]: cri-containerd-d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0.scope: Deactivated successfully. Jul 7 00:20:36.580045 systemd[1]: cri-containerd-d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0.scope: Consumed 33ms CPU time, 3.8M memory peak, 1.7M read from disk. Jul 7 00:20:36.633523 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0-rootfs.mount: Deactivated successfully. 
Jul 7 00:20:36.636917 containerd[1548]: time="2025-07-07T00:20:36.636870259Z" level=info msg="shim disconnected" id=d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0 namespace=k8s.io Jul 7 00:20:36.638160 containerd[1548]: time="2025-07-07T00:20:36.637207863Z" level=warning msg="cleaning up after shim disconnected" id=d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0 namespace=k8s.io Jul 7 00:20:36.638160 containerd[1548]: time="2025-07-07T00:20:36.637313304Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 7 00:20:36.664040 containerd[1548]: time="2025-07-07T00:20:36.663961796Z" level=info msg="received exit event sandbox_id:\"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" exit_status:137 exited_at:{seconds:1751847636 nanos:578128736}" Jul 7 00:20:36.669377 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0-shm.mount: Deactivated successfully. Jul 7 00:20:36.746843 systemd-networkd[1414]: calia958e1aa8da: Link DOWN Jul 7 00:20:36.746849 systemd-networkd[1414]: calia958e1aa8da: Lost carrier Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.741 [INFO][5962] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.743 [INFO][5962] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" iface="eth0" netns="/var/run/netns/cni-3a8b185c-031f-b703-36b9-661b8579a7b6" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.744 [INFO][5962] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" iface="eth0" netns="/var/run/netns/cni-3a8b185c-031f-b703-36b9-661b8579a7b6" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.753 [INFO][5962] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" after=9.662825ms iface="eth0" netns="/var/run/netns/cni-3a8b185c-031f-b703-36b9-661b8579a7b6" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.753 [INFO][5962] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.753 [INFO][5962] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.796 [INFO][5970] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.796 [INFO][5970] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.796 [INFO][5970] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.854 [INFO][5970] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.854 [INFO][5970] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.858 [INFO][5970] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:20:36.864861 containerd[1548]: 2025-07-07 00:20:36.862 [INFO][5962] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:20:36.871728 containerd[1548]: time="2025-07-07T00:20:36.871637189Z" level=info msg="TearDown network for sandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" successfully" Jul 7 00:20:36.871728 containerd[1548]: time="2025-07-07T00:20:36.871677070Z" level=info msg="StopPodSandbox for \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" returns successfully" Jul 7 00:20:36.874861 systemd[1]: run-netns-cni\x2d3a8b185c\x2d031f\x2db703\x2d36b9\x2d661b8579a7b6.mount: Deactivated successfully. 
Jul 7 00:20:36.965856 kubelet[2859]: I0707 00:20:36.965785 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g92ws\" (UniqueName: \"kubernetes.io/projected/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-kube-api-access-g92ws\") pod \"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a\" (UID: \"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a\") " Jul 7 00:20:36.967111 kubelet[2859]: I0707 00:20:36.966592 2859 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-calico-apiserver-certs\") pod \"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a\" (UID: \"8c8b32cc-c249-4ae4-a5f5-5ca79580b06a\") " Jul 7 00:20:36.974193 kubelet[2859]: I0707 00:20:36.974146 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-kube-api-access-g92ws" (OuterVolumeSpecName: "kube-api-access-g92ws") pod "8c8b32cc-c249-4ae4-a5f5-5ca79580b06a" (UID: "8c8b32cc-c249-4ae4-a5f5-5ca79580b06a"). InnerVolumeSpecName "kube-api-access-g92ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 7 00:20:36.974897 systemd[1]: var-lib-kubelet-pods-8c8b32cc\x2dc249\x2d4ae4\x2da5f5\x2d5ca79580b06a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dg92ws.mount: Deactivated successfully. Jul 7 00:20:36.977076 kubelet[2859]: I0707 00:20:36.976276 2859 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "8c8b32cc-c249-4ae4-a5f5-5ca79580b06a" (UID: "8c8b32cc-c249-4ae4-a5f5-5ca79580b06a"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 7 00:20:37.067748 kubelet[2859]: I0707 00:20:37.067685 2859 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-calico-apiserver-certs\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:20:37.067748 kubelet[2859]: I0707 00:20:37.067724 2859 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g92ws\" (UniqueName: \"kubernetes.io/projected/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a-kube-api-access-g92ws\") on node \"ci-4344-1-1-1-1232b7205a\" DevicePath \"\"" Jul 7 00:20:37.260194 kubelet[2859]: I0707 00:20:37.258811 2859 scope.go:117] "RemoveContainer" containerID="4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4" Jul 7 00:20:37.262118 containerd[1548]: time="2025-07-07T00:20:37.262066959Z" level=info msg="RemoveContainer for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\"" Jul 7 00:20:37.270091 systemd[1]: Removed slice kubepods-besteffort-pod8c8b32cc_c249_4ae4_a5f5_5ca79580b06a.slice - libcontainer container kubepods-besteffort-pod8c8b32cc_c249_4ae4_a5f5_5ca79580b06a.slice. Jul 7 00:20:37.270200 systemd[1]: kubepods-besteffort-pod8c8b32cc_c249_4ae4_a5f5_5ca79580b06a.slice: Consumed 1.762s CPU time, 62.2M memory peak, 1.7M read from disk. 
Jul 7 00:20:37.278650 containerd[1548]: time="2025-07-07T00:20:37.278580738Z" level=info msg="RemoveContainer for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" returns successfully" Jul 7 00:20:37.283170 kubelet[2859]: I0707 00:20:37.283127 2859 scope.go:117] "RemoveContainer" containerID="4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4" Jul 7 00:20:37.283481 containerd[1548]: time="2025-07-07T00:20:37.283431391Z" level=error msg="ContainerStatus for \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\": not found" Jul 7 00:20:37.283692 kubelet[2859]: E0707 00:20:37.283663 2859 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\": not found" containerID="4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4" Jul 7 00:20:37.283733 kubelet[2859]: I0707 00:20:37.283696 2859 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4"} err="failed to get container status \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\": rpc error: code = NotFound desc = an error occurred when try to find container \"4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4\": not found" Jul 7 00:20:37.500746 systemd[1]: var-lib-kubelet-pods-8c8b32cc\x2dc249\x2d4ae4\x2da5f5\x2d5ca79580b06a-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Jul 7 00:20:37.746007 kubelet[2859]: I0707 00:20:37.745959 2859 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8b32cc-c249-4ae4-a5f5-5ca79580b06a" path="/var/lib/kubelet/pods/8c8b32cc-c249-4ae4-a5f5-5ca79580b06a/volumes" Jul 7 00:20:45.315930 containerd[1548]: time="2025-07-07T00:20:45.315798465Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"ad4213421046fd91b3e54f848816a8f2c3dc8476696c7375cf779bf4a4b468bb\" pid:6002 exited_at:{seconds:1751847645 nanos:315315500}" Jul 7 00:20:49.472840 containerd[1548]: time="2025-07-07T00:20:49.472765592Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"b12ae262ef6aec7617637be7bc5c552738def1026ff98b8c4b3712bee41874fd\" pid:6031 exited_at:{seconds:1751847649 nanos:472178386}" Jul 7 00:20:49.575931 containerd[1548]: time="2025-07-07T00:20:49.575879967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"adc488cfa2e459aca425de7e6314c1d2829b2a12092971b90bac3ccd9626b734\" pid:6053 exited_at:{seconds:1751847649 nanos:575366602}" Jul 7 00:20:55.515321 containerd[1548]: time="2025-07-07T00:20:55.515276262Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"b30cab840e04d34adab89abd2f351c020a5fee6ed18cdc149d807acd3af0fe18\" pid:6079 exited_at:{seconds:1751847655 nanos:514932019}" Jul 7 00:21:09.287066 containerd[1548]: time="2025-07-07T00:21:09.286890077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"d8b582e582f1ec36c572f71f64b57134d20f850606e0bca67ab185014c4abd89\" pid:6107 exited_at:{seconds:1751847669 nanos:286572514}" Jul 7 00:21:17.720224 containerd[1548]: 
time="2025-07-07T00:21:17.720144973Z" level=info msg="StopPodSandbox for \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\"" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.770 [WARNING][6127] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.771 [INFO][6127] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.771 [INFO][6127] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" iface="eth0" netns="" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.771 [INFO][6127] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.771 [INFO][6127] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.793 [INFO][6136] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.793 [INFO][6136] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.793 [INFO][6136] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.810 [WARNING][6136] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.810 [INFO][6136] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.812 [INFO][6136] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:21:17.816279 containerd[1548]: 2025-07-07 00:21:17.814 [INFO][6127] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.816737 containerd[1548]: time="2025-07-07T00:21:17.816321034Z" level=info msg="TearDown network for sandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" successfully" Jul 7 00:21:17.816737 containerd[1548]: time="2025-07-07T00:21:17.816363154Z" level=info msg="StopPodSandbox for \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" returns successfully" Jul 7 00:21:17.817528 containerd[1548]: time="2025-07-07T00:21:17.817477803Z" level=info msg="RemovePodSandbox for \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\"" Jul 7 00:21:17.817590 containerd[1548]: time="2025-07-07T00:21:17.817541444Z" level=info msg="Forcibly stopping sandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\"" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.878 [WARNING][6150] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.878 [INFO][6150] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.879 [INFO][6150] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" iface="eth0" netns="" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.879 [INFO][6150] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.879 [INFO][6150] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.906 [INFO][6157] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.906 [INFO][6157] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.907 [INFO][6157] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.917 [WARNING][6157] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.917 [INFO][6157] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" HandleID="k8s-pod-network.eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--d2496-eth0" Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.920 [INFO][6157] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:21:17.925571 containerd[1548]: 2025-07-07 00:21:17.922 [INFO][6150] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80" Jul 7 00:21:17.925571 containerd[1548]: time="2025-07-07T00:21:17.924214630Z" level=info msg="TearDown network for sandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" successfully" Jul 7 00:21:17.929054 containerd[1548]: time="2025-07-07T00:21:17.928978268Z" level=info msg="Ensure that sandbox eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80 in task-service has been cleanup successfully" Jul 7 00:21:17.933901 containerd[1548]: time="2025-07-07T00:21:17.933858428Z" level=info msg="RemovePodSandbox \"eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80\" returns successfully" Jul 7 00:21:17.934784 containerd[1548]: time="2025-07-07T00:21:17.934739515Z" level=info msg="StopPodSandbox for \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\"" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:17.983 [WARNING][6171] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the 
clean up ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:17.984 [INFO][6171] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:17.984 [INFO][6171] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" iface="eth0" netns="" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:17.984 [INFO][6171] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:17.984 [INFO][6171] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.008 [INFO][6178] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.008 [INFO][6178] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.009 [INFO][6178] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.023 [WARNING][6178] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.023 [INFO][6178] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.025 [INFO][6178] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:21:18.030552 containerd[1548]: 2025-07-07 00:21:18.027 [INFO][6171] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.031301 containerd[1548]: time="2025-07-07T00:21:18.030578651Z" level=info msg="TearDown network for sandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" successfully" Jul 7 00:21:18.031301 containerd[1548]: time="2025-07-07T00:21:18.030629732Z" level=info msg="StopPodSandbox for \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" returns successfully" Jul 7 00:21:18.031370 containerd[1548]: time="2025-07-07T00:21:18.031324618Z" level=info msg="RemovePodSandbox for \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\"" Jul 7 00:21:18.031370 containerd[1548]: time="2025-07-07T00:21:18.031362578Z" level=info msg="Forcibly stopping sandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\"" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.077 [WARNING][6192] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" WorkloadEndpoint="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.077 [INFO][6192] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.077 [INFO][6192] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" iface="eth0" netns="" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.077 [INFO][6192] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.077 [INFO][6192] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.101 [INFO][6199] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.102 [INFO][6199] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.102 [INFO][6199] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.114 [WARNING][6199] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.114 [INFO][6199] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" HandleID="k8s-pod-network.d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Workload="ci--4344--1--1--1--1232b7205a-k8s-calico--apiserver--d455968b6--7vdfw-eth0" Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.117 [INFO][6199] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:21:18.121238 containerd[1548]: 2025-07-07 00:21:18.119 [INFO][6192] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0" Jul 7 00:21:18.121672 containerd[1548]: time="2025-07-07T00:21:18.121286143Z" level=info msg="TearDown network for sandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" successfully" Jul 7 00:21:18.123569 containerd[1548]: time="2025-07-07T00:21:18.123513681Z" level=info msg="Ensure that sandbox d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0 in task-service has been cleanup successfully" Jul 7 00:21:18.128928 containerd[1548]: time="2025-07-07T00:21:18.128494361Z" level=info msg="RemovePodSandbox \"d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0\" returns successfully" Jul 7 00:21:19.460366 containerd[1548]: time="2025-07-07T00:21:19.460122362Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"59382c8c9bbb26d696bce7c00f20675bfdde297255520d20108d23ee7770f7e1\" pid:6217 exited_at:{seconds:1751847679 nanos:459260035}" Jul 7 00:21:19.574374 
containerd[1548]: time="2025-07-07T00:21:19.574293438Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"33b0a36801229d9730bcc17653ac87eeae415436fd46bc37b8b24e6e4c648496\" pid:6239 exited_at:{seconds:1751847679 nanos:573713233}" Jul 7 00:21:25.523421 containerd[1548]: time="2025-07-07T00:21:25.523358830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"391735681c50df1a0d1024c740a6057b591e7c25905af280e6b8b265864d0f3b\" pid:6263 exited_at:{seconds:1751847685 nanos:522488104}" Jul 7 00:21:45.207004 containerd[1548]: time="2025-07-07T00:21:45.206930199Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"d3435fabb2a0a90cf8b4cf335b50023da34a8b709586d0ee054b5d98adcf5472\" pid:6313 exited_at:{seconds:1751847705 nanos:204907865}" Jul 7 00:21:49.459570 containerd[1548]: time="2025-07-07T00:21:49.459423381Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"81714479dae267116130e1a8b1ff1f4717784d4cea9ebd3dbb23dea22c95ebdc\" pid:6336 exited_at:{seconds:1751847709 nanos:459196180}" Jul 7 00:21:49.577840 containerd[1548]: time="2025-07-07T00:21:49.577758876Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"44d50f2e988076778fcb98b60b44d903a7b1b57e074a2e9f87078268ecb0a3a0\" pid:6357 exited_at:{seconds:1751847709 nanos:577021070}" Jul 7 00:21:55.515447 containerd[1548]: time="2025-07-07T00:21:55.515386535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"a2dc57779b6199924947cde27a373358114966151edab4bf790cf3b26d748ddf\" pid:6382 
exited_at:{seconds:1751847715 nanos:514975452}" Jul 7 00:22:09.250597 containerd[1548]: time="2025-07-07T00:22:09.250408702Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"335bb9877f6728bd4d04403fdfef889e95c2695780f84291e6b349036cdde0fc\" pid:6411 exited_at:{seconds:1751847729 nanos:249926859}" Jul 7 00:22:19.463763 containerd[1548]: time="2025-07-07T00:22:19.463694444Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"15172d13e9ce29de2b8ba4500f3cd8152349dc6515bb01c2bb434d20832e55bd\" pid:6436 exited_at:{seconds:1751847739 nanos:462460876}" Jul 7 00:22:19.572908 containerd[1548]: time="2025-07-07T00:22:19.572849037Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"09f9b09284e3da4b35eaa3a330b77571d43aec8aa06b8147521d991778b826c6\" pid:6458 exited_at:{seconds:1751847739 nanos:572151033}" Jul 7 00:22:25.514979 containerd[1548]: time="2025-07-07T00:22:25.514909267Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"3de3f12bc9acf0b40969f5b9e25a71a2cb8cb3e10bc4fdae1ac8e579a18370de\" pid:6483 exited_at:{seconds:1751847745 nanos:514287424}" Jul 7 00:22:45.208285 containerd[1548]: time="2025-07-07T00:22:45.208226572Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"365ce98cad0a1b7bcfbcfce3cc92436eabdab74c3a5add6d3100b336cb47f431\" pid:6509 exited_at:{seconds:1751847765 nanos:206815244}" Jul 7 00:22:49.465153 containerd[1548]: time="2025-07-07T00:22:49.464909429Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" 
id:\"d5233639d3a0573d6ebbb600cf4c3d89f691bd90a7f73fc83e4428165bd51eee\" pid:6536 exited_at:{seconds:1751847769 nanos:464205465}" Jul 7 00:22:49.577758 containerd[1548]: time="2025-07-07T00:22:49.577716195Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"41b6bb6dbe0264e0eb1384ba6029377b36d23cb2f2e3ce2970c435bd7986b25f\" pid:6558 exited_at:{seconds:1751847769 nanos:577297313}" Jul 7 00:22:55.514719 containerd[1548]: time="2025-07-07T00:22:55.514655049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"869a526237f9e2d463812d2c0318796c773cf8da19fddc50afcfacb2f299e1ae\" pid:6584 exited_at:{seconds:1751847775 nanos:514046525}" Jul 7 00:23:09.244736 containerd[1548]: time="2025-07-07T00:23:09.244682605Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"b271aceb9fbd517926a0f61b26b74ea318fa17aa1949559ce02137fa4ad3592b\" pid:6610 exited_at:{seconds:1751847789 nanos:244202963}" Jul 7 00:23:19.462384 containerd[1548]: time="2025-07-07T00:23:19.462313048Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"ee83693c4614f8808fab2cc60c04f663c412c37da09841dbf4ec49c83c47d5bd\" pid:6659 exited_at:{seconds:1751847799 nanos:461439764}" Jul 7 00:23:19.567400 containerd[1548]: time="2025-07-07T00:23:19.567306941Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"3046f2d1d46caa610f3a881667d964bdb872903bac3a8d8d5b406a98d95c75e6\" pid:6681 exited_at:{seconds:1751847799 nanos:566738498}" Jul 7 00:23:25.515602 containerd[1548]: time="2025-07-07T00:23:25.515551295Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"d6f0e56afeed54ef5b78fd512a8d8b13fbfdaf82da9d3c87cdf88237fed4e7ca\" pid:6704 exited_at:{seconds:1751847805 nanos:515224453}" Jul 7 00:23:45.205231 containerd[1548]: time="2025-07-07T00:23:45.204991916Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"f9ee9bbb9bb995b6769d300199dea5179cd4135353d36e041e3a0451c4218d6e\" pid:6729 exited_at:{seconds:1751847825 nanos:204688155}" Jul 7 00:23:49.456795 containerd[1548]: time="2025-07-07T00:23:49.456735695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"23cdceefb94bcd921dca1dffe0e30d892fe446d373b819723f8750ca22e2caf7\" pid:6750 exited_at:{seconds:1751847829 nanos:456244893}" Jul 7 00:23:49.568365 containerd[1548]: time="2025-07-07T00:23:49.568243604Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"7e0c3bb9c3fc2e01d4c3696df6f15f4b553d60e983f7e654076799f113a5cfdf\" pid:6772 exited_at:{seconds:1751847829 nanos:567746122}" Jul 7 00:23:55.516672 containerd[1548]: time="2025-07-07T00:23:55.516580463Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"596031ea1c5d5fab0e73b6eb47b277d7c0edec13219952015ca86532a68992f0\" pid:6797 exited_at:{seconds:1751847835 nanos:515716459}" Jul 7 00:24:09.245250 containerd[1548]: time="2025-07-07T00:24:09.245188343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"7e9af39f7e795866bc0dac44134d9cdabbc75cdce5931a39e7cc1a5faa639b6a\" pid:6822 exited_at:{seconds:1751847849 nanos:244755900}" Jul 7 00:24:09.875838 systemd[1]: Started 
sshd@10-91.107.203.174:22-103.23.198.220:39010.service - OpenSSH per-connection server daemon (103.23.198.220:39010). Jul 7 00:24:11.041000 sshd[6833]: Invalid user odoo15 from 103.23.198.220 port 39010 Jul 7 00:24:11.260055 sshd[6833]: Received disconnect from 103.23.198.220 port 39010:11: Bye Bye [preauth] Jul 7 00:24:11.260055 sshd[6833]: Disconnected from invalid user odoo15 103.23.198.220 port 39010 [preauth] Jul 7 00:24:11.263719 systemd[1]: sshd@10-91.107.203.174:22-103.23.198.220:39010.service: Deactivated successfully. Jul 7 00:24:12.409831 containerd[1548]: time="2025-07-07T00:24:12.409634599Z" level=warning msg="container event discarded" container=ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be type=CONTAINER_CREATED_EVENT Jul 7 00:24:12.421180 containerd[1548]: time="2025-07-07T00:24:12.421095139Z" level=warning msg="container event discarded" container=ff16dc30947317e889b1f660521b783fe0688e942925a366bd71f1c9293583be type=CONTAINER_STARTED_EVENT Jul 7 00:24:12.436625 containerd[1548]: time="2025-07-07T00:24:12.436473939Z" level=warning msg="container event discarded" container=a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604 type=CONTAINER_CREATED_EVENT Jul 7 00:24:12.436625 containerd[1548]: time="2025-07-07T00:24:12.436596019Z" level=warning msg="container event discarded" container=a39607a2fae23d067618fb30492d936941afa7ee70321575d7a54354d9762604 type=CONTAINER_STARTED_EVENT Jul 7 00:24:12.451080 containerd[1548]: time="2025-07-07T00:24:12.450991374Z" level=warning msg="container event discarded" container=d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c type=CONTAINER_CREATED_EVENT Jul 7 00:24:12.451080 containerd[1548]: time="2025-07-07T00:24:12.451058055Z" level=warning msg="container event discarded" container=d28c46c6273a031f36829172e18c789caa80563b1b91e8f9310122b886156e8c type=CONTAINER_STARTED_EVENT Jul 7 00:24:12.451080 containerd[1548]: time="2025-07-07T00:24:12.451075655Z" level=warning 
msg="container event discarded" container=026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c type=CONTAINER_CREATED_EVENT
Jul 7 00:24:12.464724 containerd[1548]: time="2025-07-07T00:24:12.464633325Z" level=warning msg="container event discarded" container=9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:12.487326 containerd[1548]: time="2025-07-07T00:24:12.487214962Z" level=warning msg="container event discarded" container=e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd type=CONTAINER_CREATED_EVENT
Jul 7 00:24:12.557858 containerd[1548]: time="2025-07-07T00:24:12.557763009Z" level=warning msg="container event discarded" container=026caa3cbfb40d9f93e906135ae21e2e168a3077a0a9a4041a09757daaaa704c type=CONTAINER_STARTED_EVENT
Jul 7 00:24:12.582431 containerd[1548]: time="2025-07-07T00:24:12.582328937Z" level=warning msg="container event discarded" container=9e349d99aad19fab0491b30799d4d96703f488b5faa14262e2764555aceb97b8 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:12.632898 containerd[1548]: time="2025-07-07T00:24:12.632778839Z" level=warning msg="container event discarded" container=e758906844fbd49d501758cae788f076f71ece82aca590d755add8e8f0570bcd type=CONTAINER_STARTED_EVENT
Jul 7 00:24:15.220074 systemd[1]: Started sshd@11-91.107.203.174:22-139.178.89.65:47396.service - OpenSSH per-connection server daemon (139.178.89.65:47396).
Jul 7 00:24:16.353231 sshd[6841]: Accepted publickey for core from 139.178.89.65 port 47396 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:16.355635 sshd-session[6841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:16.365169 systemd-logind[1498]: New session 8 of user core.
Jul 7 00:24:16.371175 systemd[1]: Started session-8.scope - Session 8 of User core.
Jul 7 00:24:17.229539 sshd[6843]: Connection closed by 139.178.89.65 port 47396
Jul 7 00:24:17.228674 sshd-session[6841]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:17.234763 systemd[1]: sshd@11-91.107.203.174:22-139.178.89.65:47396.service: Deactivated successfully.
Jul 7 00:24:17.239361 systemd[1]: session-8.scope: Deactivated successfully.
Jul 7 00:24:17.241538 systemd-logind[1498]: Session 8 logged out. Waiting for processes to exit.
Jul 7 00:24:17.244785 systemd-logind[1498]: Removed session 8.
Jul 7 00:24:19.470810 containerd[1548]: time="2025-07-07T00:24:19.470699770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"0f5caee0cd3d96d24b70cd5f3f0b7701ef2cc37d9dce2f8b84ba67cdb76740a9\" pid:6869 exited_at:{seconds:1751847859 nanos:470130407}"
Jul 7 00:24:19.603957 containerd[1548]: time="2025-07-07T00:24:19.603896699Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"94fa42044d0e296d65cb7bac508158878cb06ac4c4c098eacb1a6ea598e7da42\" pid:6890 exited_at:{seconds:1751847859 nanos:602839494}"
Jul 7 00:24:22.422424 systemd[1]: Started sshd@12-91.107.203.174:22-139.178.89.65:42070.service - OpenSSH per-connection server daemon (139.178.89.65:42070).
Jul 7 00:24:23.314669 containerd[1548]: time="2025-07-07T00:24:23.314354639Z" level=warning msg="container event discarded" container=d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:23.314669 containerd[1548]: time="2025-07-07T00:24:23.314449200Z" level=warning msg="container event discarded" container=d83936188a6abd12eb5318f64994fdbcc1a376fd86f3a739ad6b4bdb7ed9e9f5 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:23.344917 containerd[1548]: time="2025-07-07T00:24:23.344796196Z" level=warning msg="container event discarded" container=a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c type=CONTAINER_CREATED_EVENT
Jul 7 00:24:23.431286 containerd[1548]: time="2025-07-07T00:24:23.431202363Z" level=warning msg="container event discarded" container=a1cfa8ff888aed79ac4fd9b96e43b1a27a623a57423eef1f94ffb98dc46ae34c type=CONTAINER_STARTED_EVENT
Jul 7 00:24:23.470812 containerd[1548]: time="2025-07-07T00:24:23.470739927Z" level=warning msg="container event discarded" container=36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a type=CONTAINER_CREATED_EVENT
Jul 7 00:24:23.470812 containerd[1548]: time="2025-07-07T00:24:23.470802327Z" level=warning msg="container event discarded" container=36f8433403bd11d833053a2193aebf56088eb3f34900457d2ecaf0c292c0697a type=CONTAINER_STARTED_EVENT
Jul 7 00:24:23.534689 sshd[6901]: Accepted publickey for core from 139.178.89.65 port 42070 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:23.537460 sshd-session[6901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:23.543421 systemd-logind[1498]: New session 9 of user core.
Jul 7 00:24:23.551922 systemd[1]: Started session-9.scope - Session 9 of User core.
Jul 7 00:24:24.375379 sshd[6904]: Connection closed by 139.178.89.65 port 42070
Jul 7 00:24:24.375819 sshd-session[6901]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:24.380532 systemd[1]: sshd@12-91.107.203.174:22-139.178.89.65:42070.service: Deactivated successfully.
Jul 7 00:24:24.383207 systemd[1]: session-9.scope: Deactivated successfully.
Jul 7 00:24:24.388892 systemd-logind[1498]: Session 9 logged out. Waiting for processes to exit.
Jul 7 00:24:24.390308 systemd-logind[1498]: Removed session 9.
Jul 7 00:24:24.571841 systemd[1]: Started sshd@13-91.107.203.174:22-139.178.89.65:42076.service - OpenSSH per-connection server daemon (139.178.89.65:42076).
Jul 7 00:24:25.511800 containerd[1548]: time="2025-07-07T00:24:25.511751541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"e9ea2e45ba8f9863b35646d469684d36414f998af9b8f03b36150df0de70c251\" pid:6933 exited_at:{seconds:1751847865 nanos:511257938}"
Jul 7 00:24:25.682934 sshd[6919]: Accepted publickey for core from 139.178.89.65 port 42076 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:25.685759 sshd-session[6919]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:25.691557 systemd-logind[1498]: New session 10 of user core.
Jul 7 00:24:25.695724 systemd[1]: Started session-10.scope - Session 10 of User core.
Jul 7 00:24:25.810245 containerd[1548]: time="2025-07-07T00:24:25.810088840Z" level=warning msg="container event discarded" container=c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b type=CONTAINER_CREATED_EVENT
Jul 7 00:24:25.894726 containerd[1548]: time="2025-07-07T00:24:25.894621436Z" level=warning msg="container event discarded" container=c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b type=CONTAINER_STARTED_EVENT
Jul 7 00:24:26.558838 sshd[6945]: Connection closed by 139.178.89.65 port 42076
Jul 7 00:24:26.560445 sshd-session[6919]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:26.566387 systemd[1]: sshd@13-91.107.203.174:22-139.178.89.65:42076.service: Deactivated successfully.
Jul 7 00:24:26.568951 systemd[1]: session-10.scope: Deactivated successfully.
Jul 7 00:24:26.570586 systemd-logind[1498]: Session 10 logged out. Waiting for processes to exit.
Jul 7 00:24:26.573192 systemd-logind[1498]: Removed session 10.
Jul 7 00:24:26.749043 systemd[1]: Started sshd@14-91.107.203.174:22-139.178.89.65:42090.service - OpenSSH per-connection server daemon (139.178.89.65:42090).
Jul 7 00:24:27.861548 sshd[6955]: Accepted publickey for core from 139.178.89.65 port 42090 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:27.864749 sshd-session[6955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:27.873708 systemd-logind[1498]: New session 11 of user core.
Jul 7 00:24:27.879762 systemd[1]: Started session-11.scope - Session 11 of User core.
Jul 7 00:24:28.721464 sshd[6957]: Connection closed by 139.178.89.65 port 42090
Jul 7 00:24:28.722370 sshd-session[6955]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:28.728322 systemd[1]: sshd@14-91.107.203.174:22-139.178.89.65:42090.service: Deactivated successfully.
Jul 7 00:24:28.730964 systemd[1]: session-11.scope: Deactivated successfully.
Jul 7 00:24:28.732630 systemd-logind[1498]: Session 11 logged out. Waiting for processes to exit.
Jul 7 00:24:28.734741 systemd-logind[1498]: Removed session 11.
Jul 7 00:24:33.914809 systemd[1]: Started sshd@15-91.107.203.174:22-139.178.89.65:58530.service - OpenSSH per-connection server daemon (139.178.89.65:58530).
Jul 7 00:24:34.992291 sshd[6973]: Accepted publickey for core from 139.178.89.65 port 58530 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:34.993794 sshd-session[6973]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:35.000280 systemd-logind[1498]: New session 12 of user core.
Jul 7 00:24:35.004711 systemd[1]: Started session-12.scope - Session 12 of User core.
Jul 7 00:24:35.818049 sshd[6975]: Connection closed by 139.178.89.65 port 58530
Jul 7 00:24:35.819141 sshd-session[6973]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:35.826660 systemd-logind[1498]: Session 12 logged out. Waiting for processes to exit.
Jul 7 00:24:35.827302 systemd[1]: sshd@15-91.107.203.174:22-139.178.89.65:58530.service: Deactivated successfully.
Jul 7 00:24:35.831935 systemd[1]: session-12.scope: Deactivated successfully.
Jul 7 00:24:35.837254 systemd-logind[1498]: Removed session 12.
Jul 7 00:24:41.009264 systemd[1]: Started sshd@16-91.107.203.174:22-139.178.89.65:37174.service - OpenSSH per-connection server daemon (139.178.89.65:37174).
Jul 7 00:24:41.412177 containerd[1548]: time="2025-07-07T00:24:41.412022706Z" level=warning msg="container event discarded" container=245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce type=CONTAINER_CREATED_EVENT
Jul 7 00:24:41.412177 containerd[1548]: time="2025-07-07T00:24:41.412131186Z" level=warning msg="container event discarded" container=245712a953c856043a5cb225ec1ae0bc4893b9a1e7b89726b022a090e329a8ce type=CONTAINER_STARTED_EVENT
Jul 7 00:24:41.559074 containerd[1548]: time="2025-07-07T00:24:41.558890698Z" level=warning msg="container event discarded" container=e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:41.559074 containerd[1548]: time="2025-07-07T00:24:41.558955898Z" level=warning msg="container event discarded" container=e1fde83f752ad3c8ddb84d4ef444d8df742b4ca5672c0d2315af7acd9065bf71 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:42.109705 sshd[6988]: Accepted publickey for core from 139.178.89.65 port 37174 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:42.111721 sshd-session[6988]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:42.121209 systemd-logind[1498]: New session 13 of user core.
Jul 7 00:24:42.130121 systemd[1]: Started session-13.scope - Session 13 of User core.
Jul 7 00:24:42.934566 sshd[6990]: Connection closed by 139.178.89.65 port 37174
Jul 7 00:24:42.935764 sshd-session[6988]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:42.942575 systemd-logind[1498]: Session 13 logged out. Waiting for processes to exit.
Jul 7 00:24:42.942858 systemd[1]: sshd@16-91.107.203.174:22-139.178.89.65:37174.service: Deactivated successfully.
Jul 7 00:24:42.946180 systemd[1]: session-13.scope: Deactivated successfully.
Jul 7 00:24:42.949105 systemd-logind[1498]: Removed session 13.
Jul 7 00:24:43.127466 systemd[1]: Started sshd@17-91.107.203.174:22-139.178.89.65:37178.service - OpenSSH per-connection server daemon (139.178.89.65:37178).
Jul 7 00:24:44.087352 containerd[1548]: time="2025-07-07T00:24:44.087238403Z" level=warning msg="container event discarded" container=3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:44.166716 containerd[1548]: time="2025-07-07T00:24:44.166619089Z" level=warning msg="container event discarded" container=3784859ea5a4780ad64484f8a06bb374e2cf0349a48069ebca02c3bde12e0518 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:44.224867 sshd[7003]: Accepted publickey for core from 139.178.89.65 port 37178 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:44.227005 sshd-session[7003]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:44.232075 systemd-logind[1498]: New session 14 of user core.
Jul 7 00:24:44.240832 systemd[1]: Started session-14.scope - Session 14 of User core.
Jul 7 00:24:45.200732 sshd[7005]: Connection closed by 139.178.89.65 port 37178
Jul 7 00:24:45.201687 sshd-session[7003]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:45.207958 systemd[1]: sshd@17-91.107.203.174:22-139.178.89.65:37178.service: Deactivated successfully.
Jul 7 00:24:45.214157 systemd[1]: session-14.scope: Deactivated successfully.
Jul 7 00:24:45.218370 containerd[1548]: time="2025-07-07T00:24:45.218244710Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"1384405f381a9f344dd379a8f2b4e18e6e4286b7a3ea1c4c800c82d4196bdda4\" pid:7023 exited_at:{seconds:1751847885 nanos:216849383}"
Jul 7 00:24:45.219251 systemd-logind[1498]: Session 14 logged out. Waiting for processes to exit.
Jul 7 00:24:45.222739 systemd-logind[1498]: Removed session 14.
Jul 7 00:24:45.392330 systemd[1]: Started sshd@18-91.107.203.174:22-139.178.89.65:37184.service - OpenSSH per-connection server daemon (139.178.89.65:37184).
Jul 7 00:24:45.475231 containerd[1548]: time="2025-07-07T00:24:45.475019823Z" level=warning msg="container event discarded" container=3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:45.559435 containerd[1548]: time="2025-07-07T00:24:45.559354615Z" level=warning msg="container event discarded" container=3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:45.722186 containerd[1548]: time="2025-07-07T00:24:45.722106607Z" level=warning msg="container event discarded" container=3599a6bfae7fbd3b28463d1e522da30d9ef49f2be211086ecef712e7273888a5 type=CONTAINER_STOPPED_EVENT
Jul 7 00:24:46.496486 sshd[7036]: Accepted publickey for core from 139.178.89.65 port 37184 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:46.499036 sshd-session[7036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:46.506738 systemd-logind[1498]: New session 15 of user core.
Jul 7 00:24:46.513823 systemd[1]: Started session-15.scope - Session 15 of User core.
Jul 7 00:24:48.348826 containerd[1548]: time="2025-07-07T00:24:48.348732874Z" level=warning msg="container event discarded" container=a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:48.433081 containerd[1548]: time="2025-07-07T00:24:48.433005065Z" level=warning msg="container event discarded" container=a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:49.119811 containerd[1548]: time="2025-07-07T00:24:49.119069770Z" level=warning msg="container event discarded" container=a6d5bc24eb27286ebd150eafad15a22c873b971ef5eb9b45aaa76e1bd53cbef2 type=CONTAINER_STOPPED_EVENT
Jul 7 00:24:49.573316 containerd[1548]: time="2025-07-07T00:24:49.573261170Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"b9c6df5b8d3e94d48b421f205f573f55f89d37e6e755d95be3f5fadd094dc559\" pid:7079 exited_at:{seconds:1751847889 nanos:570544636}"
Jul 7 00:24:49.704179 containerd[1548]: time="2025-07-07T00:24:49.704072118Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"4b504e56a41f12c71cd1b5592b73913fca1a8840c6bd224851fd585087c28e0b\" pid:7099 exited_at:{seconds:1751847889 nanos:702115868}"
Jul 7 00:24:49.790431 sshd[7038]: Connection closed by 139.178.89.65 port 37184
Jul 7 00:24:49.791477 sshd-session[7036]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:49.800293 systemd[1]: sshd@18-91.107.203.174:22-139.178.89.65:37184.service: Deactivated successfully.
Jul 7 00:24:49.805899 systemd[1]: session-15.scope: Deactivated successfully.
Jul 7 00:24:49.806328 systemd[1]: session-15.scope: Consumed 562ms CPU time, 84.7M memory peak.
Jul 7 00:24:49.807419 systemd-logind[1498]: Session 15 logged out. Waiting for processes to exit.
Jul 7 00:24:49.812163 systemd-logind[1498]: Removed session 15.
Jul 7 00:24:49.984799 systemd[1]: Started sshd@19-91.107.203.174:22-139.178.89.65:46764.service - OpenSSH per-connection server daemon (139.178.89.65:46764).
Jul 7 00:24:51.104961 sshd[7120]: Accepted publickey for core from 139.178.89.65 port 46764 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:51.108470 sshd-session[7120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:51.114704 systemd-logind[1498]: New session 16 of user core.
Jul 7 00:24:51.121817 systemd[1]: Started session-16.scope - Session 16 of User core.
Jul 7 00:24:52.065603 sshd[7122]: Connection closed by 139.178.89.65 port 46764
Jul 7 00:24:52.065830 sshd-session[7120]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:52.071267 systemd[1]: sshd@19-91.107.203.174:22-139.178.89.65:46764.service: Deactivated successfully.
Jul 7 00:24:52.073938 systemd[1]: session-16.scope: Deactivated successfully.
Jul 7 00:24:52.076866 systemd-logind[1498]: Session 16 logged out. Waiting for processes to exit.
Jul 7 00:24:52.081000 systemd-logind[1498]: Removed session 16.
Jul 7 00:24:52.255700 systemd[1]: Started sshd@20-91.107.203.174:22-139.178.89.65:46772.service - OpenSSH per-connection server daemon (139.178.89.65:46772).
Jul 7 00:24:53.353490 sshd[7132]: Accepted publickey for core from 139.178.89.65 port 46772 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:24:53.355420 sshd-session[7132]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:24:53.361496 systemd-logind[1498]: New session 17 of user core.
Jul 7 00:24:53.363891 systemd[1]: Started session-17.scope - Session 17 of User core.
Jul 7 00:24:54.172839 sshd[7134]: Connection closed by 139.178.89.65 port 46772
Jul 7 00:24:54.173903 sshd-session[7132]: pam_unix(sshd:session): session closed for user core
Jul 7 00:24:54.180411 systemd-logind[1498]: Session 17 logged out. Waiting for processes to exit.
Jul 7 00:24:54.180494 systemd[1]: sshd@20-91.107.203.174:22-139.178.89.65:46772.service: Deactivated successfully.
Jul 7 00:24:54.187444 systemd[1]: session-17.scope: Deactivated successfully.
Jul 7 00:24:54.190260 systemd-logind[1498]: Removed session 17.
Jul 7 00:24:54.208451 containerd[1548]: time="2025-07-07T00:24:54.208348809Z" level=warning msg="container event discarded" container=fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212 type=CONTAINER_CREATED_EVENT
Jul 7 00:24:54.331482 containerd[1548]: time="2025-07-07T00:24:54.331372082Z" level=warning msg="container event discarded" container=fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212 type=CONTAINER_STARTED_EVENT
Jul 7 00:24:55.503953 containerd[1548]: time="2025-07-07T00:24:55.503775195Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc3137f8f740c8b5bc74358c98a34d6eb02d5d0dadbf9cc4a4e51721c3614212\" id:\"26c31fdd20039a0f3c98b10e92c423b6faddeedc2f12ec89a52cc7de9c0fc99b\" pid:7161 exited_at:{seconds:1751847895 nanos:503341992}"
Jul 7 00:24:55.755046 containerd[1548]: time="2025-07-07T00:24:55.754843616Z" level=warning msg="container event discarded" container=fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b type=CONTAINER_CREATED_EVENT
Jul 7 00:24:55.755046 containerd[1548]: time="2025-07-07T00:24:55.754913016Z" level=warning msg="container event discarded" container=fa4f0ade81c9d7616aad182845a20558b0e3c8961ae3c17c338fdd801450997b type=CONTAINER_STARTED_EVENT
Jul 7 00:24:57.531770 containerd[1548]: time="2025-07-07T00:24:57.531682909Z" level=warning msg="container event discarded" container=319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe type=CONTAINER_CREATED_EVENT
Jul 7 00:24:57.657465 containerd[1548]: time="2025-07-07T00:24:57.657336955Z" level=warning msg="container event discarded" container=319ffe0cd337b3ec7b020fc499c41459fdc8df35ede1e379249e8957c68c4cfe type=CONTAINER_STARTED_EVENT
Jul 7 00:24:59.362271 systemd[1]: Started sshd@21-91.107.203.174:22-139.178.89.65:46778.service - OpenSSH per-connection server daemon (139.178.89.65:46778).
Jul 7 00:25:00.461079 sshd[7173]: Accepted publickey for core from 139.178.89.65 port 46778 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:25:00.464156 sshd-session[7173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:25:00.472066 systemd-logind[1498]: New session 18 of user core.
Jul 7 00:25:00.478704 systemd[1]: Started session-18.scope - Session 18 of User core.
Jul 7 00:25:00.668230 containerd[1548]: time="2025-07-07T00:25:00.668141058Z" level=warning msg="container event discarded" container=55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f type=CONTAINER_CREATED_EVENT
Jul 7 00:25:00.783591 containerd[1548]: time="2025-07-07T00:25:00.783382218Z" level=warning msg="container event discarded" container=55a010096275dbcf46f2628a48bf2865075fac307011b879e504c5215156723f type=CONTAINER_STARTED_EVENT
Jul 7 00:25:01.315971 sshd[7175]: Connection closed by 139.178.89.65 port 46778
Jul 7 00:25:01.318331 sshd-session[7173]: pam_unix(sshd:session): session closed for user core
Jul 7 00:25:01.323620 systemd[1]: sshd@21-91.107.203.174:22-139.178.89.65:46778.service: Deactivated successfully.
Jul 7 00:25:01.330278 systemd[1]: session-18.scope: Deactivated successfully.
Jul 7 00:25:01.333580 systemd-logind[1498]: Session 18 logged out. Waiting for processes to exit.
Jul 7 00:25:01.336581 systemd-logind[1498]: Removed session 18.
Jul 7 00:25:02.234860 containerd[1548]: time="2025-07-07T00:25:02.234748873Z" level=warning msg="container event discarded" container=7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:02.234860 containerd[1548]: time="2025-07-07T00:25:02.234850793Z" level=warning msg="container event discarded" container=7b402fdd1b54835e2da11a811feaae5a47776f4be6e09416ef63fa27963e2732 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:02.280050 containerd[1548]: time="2025-07-07T00:25:02.279959256Z" level=warning msg="container event discarded" container=5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:02.356333 containerd[1548]: time="2025-07-07T00:25:02.356246396Z" level=warning msg="container event discarded" container=5a620a409915851993a8585896fba7e534d82ca70e2d9f5d9ff8832c7742e909 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:02.369744 containerd[1548]: time="2025-07-07T00:25:02.369640418Z" level=warning msg="container event discarded" container=b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd type=CONTAINER_CREATED_EVENT
Jul 7 00:25:02.369744 containerd[1548]: time="2025-07-07T00:25:02.369715618Z" level=warning msg="container event discarded" container=b43b725db274dd22987232b1433c3bdf4322e010ff0983fcf6edc86d3570e4bd type=CONTAINER_STARTED_EVENT
Jul 7 00:25:03.163403 containerd[1548]: time="2025-07-07T00:25:03.163314326Z" level=warning msg="container event discarded" container=9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:03.163403 containerd[1548]: time="2025-07-07T00:25:03.163368326Z" level=warning msg="container event discarded" container=9d2bf5e38c60612d905e7fe2db41dc532e6ef4a5f7bacb0882b2373f5aba2893 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:03.326701 containerd[1548]: time="2025-07-07T00:25:03.326623364Z" level=warning msg="container event discarded" container=d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:03.326701 containerd[1548]: time="2025-07-07T00:25:03.326688244Z" level=warning msg="container event discarded" container=d3d59d26d6af0dfdbb2379cc19ce5695aede9e989dfd89f9e5c1f8a91b823eb0 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:05.388588 containerd[1548]: time="2025-07-07T00:25:05.388478204Z" level=warning msg="container event discarded" container=58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:05.388588 containerd[1548]: time="2025-07-07T00:25:05.388571484Z" level=warning msg="container event discarded" container=58d9f6ca76dbf521d43d3833980d990408e488d1cd5d73152b9d431b009b82e3 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:05.455883 containerd[1548]: time="2025-07-07T00:25:05.455773631Z" level=warning msg="container event discarded" container=87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:05.605204 containerd[1548]: time="2025-07-07T00:25:05.605131398Z" level=warning msg="container event discarded" container=eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:05.605204 containerd[1548]: time="2025-07-07T00:25:05.605189239Z" level=warning msg="container event discarded" container=eaa5093eea4cf856c3c998fef44e2b27cd852d7404c5c0f1613469e1623ebd80 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:05.660585 containerd[1548]: time="2025-07-07T00:25:05.660382855Z" level=warning msg="container event discarded" container=87bd7b4d1af77bdab9d33bcbc0b25174d69cd4ff8fd298f582223434e3e1b0a7 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:05.809082 containerd[1548]: time="2025-07-07T00:25:05.808965457Z" level=warning msg="container event discarded" container=4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc type=CONTAINER_CREATED_EVENT
Jul 7 00:25:05.809082 containerd[1548]: time="2025-07-07T00:25:05.809056857Z" level=warning msg="container event discarded" container=4f578cab05ae39f5067c525273f6b7289998b0b114f97ded888a5c0ed6f1c0cc type=CONTAINER_STARTED_EVENT
Jul 7 00:25:06.493872 containerd[1548]: time="2025-07-07T00:25:06.493778455Z" level=warning msg="container event discarded" container=45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d type=CONTAINER_CREATED_EVENT
Jul 7 00:25:06.493872 containerd[1548]: time="2025-07-07T00:25:06.493841296Z" level=warning msg="container event discarded" container=45c6c26097dad8a1b5211a915eb16080e55d930660299b73c154cb946c9b302d type=CONTAINER_STARTED_EVENT
Jul 7 00:25:06.511758 systemd[1]: Started sshd@22-91.107.203.174:22-139.178.89.65:58496.service - OpenSSH per-connection server daemon (139.178.89.65:58496).
Jul 7 00:25:06.765090 containerd[1548]: time="2025-07-07T00:25:06.764490373Z" level=warning msg="container event discarded" container=5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c type=CONTAINER_CREATED_EVENT
Jul 7 00:25:06.882431 containerd[1548]: time="2025-07-07T00:25:06.882348301Z" level=warning msg="container event discarded" container=5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c type=CONTAINER_STARTED_EVENT
Jul 7 00:25:07.639864 sshd[7187]: Accepted publickey for core from 139.178.89.65 port 58496 ssh2: RSA SHA256:xsTbxm6TqKKecdMV2y7CU8EmXnfmdc/OYMcYiabApH8
Jul 7 00:25:07.643192 sshd-session[7187]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:25:07.649938 systemd-logind[1498]: New session 19 of user core.
Jul 7 00:25:07.661849 systemd[1]: Started session-19.scope - Session 19 of User core.
Jul 7 00:25:08.457703 containerd[1548]: time="2025-07-07T00:25:08.457620046Z" level=warning msg="container event discarded" container=1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:08.481575 sshd[7189]: Connection closed by 139.178.89.65 port 58496
Jul 7 00:25:08.481444 sshd-session[7187]: pam_unix(sshd:session): session closed for user core
Jul 7 00:25:08.489195 systemd[1]: sshd@22-91.107.203.174:22-139.178.89.65:58496.service: Deactivated successfully.
Jul 7 00:25:08.493104 systemd[1]: session-19.scope: Deactivated successfully.
Jul 7 00:25:08.494292 systemd-logind[1498]: Session 19 logged out. Waiting for processes to exit.
Jul 7 00:25:08.496437 systemd-logind[1498]: Removed session 19.
Jul 7 00:25:08.562274 containerd[1548]: time="2025-07-07T00:25:08.562136430Z" level=warning msg="container event discarded" container=1800e16d56298b5ae4fea1fdc8c63dbbad87376ebbf0b08feba0c162fec4b0c7 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:09.246181 containerd[1548]: time="2025-07-07T00:25:09.246110271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"a6ebc49e83855933222d0712c6c10d00bb93718a5166e7efbf32e9c99551ebff\" pid:7212 exited_at:{seconds:1751847909 nanos:245374145}"
Jul 7 00:25:11.631238 containerd[1548]: time="2025-07-07T00:25:11.631111504Z" level=warning msg="container event discarded" container=4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:11.742601 containerd[1548]: time="2025-07-07T00:25:11.742524933Z" level=warning msg="container event discarded" container=4a034095c12767f4101690468cf2320963f6c273a50671619a6aa7402711a3a4 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:12.022187 containerd[1548]: time="2025-07-07T00:25:12.021982813Z" level=warning msg="container event discarded" container=d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:12.149635 containerd[1548]: time="2025-07-07T00:25:12.149469240Z" level=warning msg="container event discarded" container=d6148c3d090948fc3755fddf5379706c89861692efb35a1663cd4addd40b7983 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:15.690953 containerd[1548]: time="2025-07-07T00:25:15.690810535Z" level=warning msg="container event discarded" container=5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a type=CONTAINER_CREATED_EVENT
Jul 7 00:25:15.780633 containerd[1548]: time="2025-07-07T00:25:15.780526796Z" level=warning msg="container event discarded" container=5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a type=CONTAINER_STARTED_EVENT
Jul 7 00:25:16.089223 containerd[1548]: time="2025-07-07T00:25:16.089126952Z" level=warning msg="container event discarded" container=d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb type=CONTAINER_CREATED_EVENT
Jul 7 00:25:16.209357 containerd[1548]: time="2025-07-07T00:25:16.209235676Z" level=warning msg="container event discarded" container=d9d81cbcac5e07f41f450c60a964da43762211fd73789d6b95f52ae5b24f0fdb type=CONTAINER_STARTED_EVENT
Jul 7 00:25:18.161873 containerd[1548]: time="2025-07-07T00:25:18.161769826Z" level=warning msg="container event discarded" container=bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1 type=CONTAINER_CREATED_EVENT
Jul 7 00:25:18.255454 containerd[1548]: time="2025-07-07T00:25:18.255329072Z" level=warning msg="container event discarded" container=bb34a49ecb9edcb68352ca216a748beb2aca64471ecad5a2617529d621a28bc1 type=CONTAINER_STARTED_EVENT
Jul 7 00:25:19.457319 containerd[1548]: time="2025-07-07T00:25:19.457275554Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5d56f10bb09fd9249f0ccf7957e1613287909fd440b737b6163adc575bf6368c\" id:\"81261ed795cb9260f6d1ac83f269d30157dfc2a7cce1439e2b1fe43223b5da76\" pid:7235 exited_at:{seconds:1751847919 nanos:456823950}"
Jul 7 00:25:19.566046 containerd[1548]: time="2025-07-07T00:25:19.565898108Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e74c589b3d605e62d5ea48312002a9aa8931a98263a42702e918bf8ff963f5a\" id:\"da2e23c632884c97615df83e8ca7191a5295b64c726fdf99a49d55e76524d1cd\" pid:7257 exited_at:{seconds:1751847919 nanos:565551025}"
Jul 7 00:25:23.722122 systemd[1]: cri-containerd-c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b.scope: Deactivated successfully.
Jul 7 00:25:23.722873 systemd[1]: cri-containerd-c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b.scope: Consumed 26.390s CPU time, 104.9M memory peak, 4.6M read from disk.
Jul 7 00:25:23.727533 containerd[1548]: time="2025-07-07T00:25:23.726565396Z" level=info msg="received exit event container_id:\"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\" id:\"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\" pid:3174 exit_status:1 exited_at:{seconds:1751847923 nanos:726245554}"
Jul 7 00:25:23.727533 containerd[1548]: time="2025-07-07T00:25:23.726796078Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\" id:\"c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b\" pid:3174 exit_status:1 exited_at:{seconds:1751847923 nanos:726245554}"
Jul 7 00:25:23.761726 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c15d5cdd1143f9f16f1b414480a07a046d83f9439e2a2812dec6af9eed0f174b-rootfs.mount: Deactivated successfully.