Mar 3 13:34:48.857177 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Mar 3 10:59:45 -00 2026
Mar 3 13:34:48.857197 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=51ade538e3d3c371f07ae1ec6fa9803fff0566ec060cf4b56dc685fc36d0e01c
Mar 3 13:34:48.857204 kernel: BIOS-provided physical RAM map:
Mar 3 13:34:48.857209 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 3 13:34:48.857216 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 3 13:34:48.857221 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 3 13:34:48.857226 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 3 13:34:48.857231 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 3 13:34:48.857236 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 3 13:34:48.857241 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 3 13:34:48.857245 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 3 13:34:48.857250 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 3 13:34:48.857255 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 3 13:34:48.857262 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 3 13:34:48.857267 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 3 13:34:48.857272 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 3 13:34:48.857277 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 3 13:34:48.857282 kernel: NX (Execute Disable) protection: active
Mar 3 13:34:48.857289 kernel: APIC: Static calls initialized
Mar 3 13:34:48.857294 kernel: e820: update [mem 0x7dfab018-0x7dfb4a57] usable ==> usable
Mar 3 13:34:48.857299 kernel: e820: update [mem 0x7df6f018-0x7dfaa657] usable ==> usable
Mar 3 13:34:48.857304 kernel: e820: update [mem 0x7dc01018-0x7dc3c657] usable ==> usable
Mar 3 13:34:48.857309 kernel: extended physical RAM map:
Mar 3 13:34:48.857314 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 3 13:34:48.857319 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007dc01017] usable
Mar 3 13:34:48.857324 kernel: reserve setup_data: [mem 0x000000007dc01018-0x000000007dc3c657] usable
Mar 3 13:34:48.857328 kernel: reserve setup_data: [mem 0x000000007dc3c658-0x000000007df6f017] usable
Mar 3 13:34:48.857333 kernel: reserve setup_data: [mem 0x000000007df6f018-0x000000007dfaa657] usable
Mar 3 13:34:48.857338 kernel: reserve setup_data: [mem 0x000000007dfaa658-0x000000007dfab017] usable
Mar 3 13:34:48.857348 kernel: reserve setup_data: [mem 0x000000007dfab018-0x000000007dfb4a57] usable
Mar 3 13:34:48.857353 kernel: reserve setup_data: [mem 0x000000007dfb4a58-0x000000007ed3efff] usable
Mar 3 13:34:48.857360 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 3 13:34:48.857368 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 3 13:34:48.857375 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 3 13:34:48.857383 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 3 13:34:48.857388 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 3 13:34:48.857393 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 3 13:34:48.857398 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 3 13:34:48.857403 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 3 13:34:48.857408 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 3 13:34:48.857418 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 3 13:34:48.857424 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 3 13:34:48.857429 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 3 13:34:48.857434 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 3 13:34:48.857439 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e01b198 RNG=0x7fb73018
Mar 3 13:34:48.857447 kernel: random: crng init done
Mar 3 13:34:48.857452 kernel: efi: Remove mem137: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 3 13:34:48.857457 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 3 13:34:48.857462 kernel: secureboot: Secure boot disabled
Mar 3 13:34:48.857467 kernel: SMBIOS 3.0.0 present.
Mar 3 13:34:48.857473 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 3 13:34:48.857478 kernel: DMI: Memory slots populated: 1/1
Mar 3 13:34:48.857483 kernel: Hypervisor detected: KVM
Mar 3 13:34:48.857488 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 3 13:34:48.857493 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 3 13:34:48.857498 kernel: kvm-clock: using sched offset of 13679349008 cycles
Mar 3 13:34:48.857506 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 3 13:34:48.857511 kernel: tsc: Detected 2399.996 MHz processor
Mar 3 13:34:48.857517 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 3 13:34:48.857522 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 3 13:34:48.857528 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 3 13:34:48.857533 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 3 13:34:48.857539 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 3 13:34:48.857544 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 3 13:34:48.857549 kernel: Using GB pages for direct mapping
Mar 3 13:34:48.857556 kernel: ACPI: Early table checksum verification disabled
Mar 3 13:34:48.857562 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 3 13:34:48.857567 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 3 13:34:48.857572 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:48.857578 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:48.857583 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 3 13:34:48.857588 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:48.857593 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:48.857599 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:48.857606 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:48.857612 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 3 13:34:48.857617 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 3 13:34:48.857622 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 3 13:34:48.857628 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 3 13:34:48.857633 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 3 13:34:48.857638 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 3 13:34:48.857643 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 3 13:34:48.857649 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 3 13:34:48.857656 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 3 13:34:48.857662 kernel: No NUMA configuration found
Mar 3 13:34:48.857667 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 3 13:34:48.857672 kernel: NODE_DATA(0) allocated [mem 0x179ff8dc0-0x179ffffff]
Mar 3 13:34:48.857678 kernel: Zone ranges:
Mar 3 13:34:48.857683 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 3 13:34:48.857688 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 3 13:34:48.857693 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Mar 3 13:34:48.857699 kernel: Device empty
Mar 3 13:34:48.857706 kernel: Movable zone start for each node
Mar 3 13:34:48.857711 kernel: Early memory node ranges
Mar 3 13:34:48.857716 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 3 13:34:48.857722 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 3 13:34:48.857727 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 3 13:34:48.857732 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 3 13:34:48.857738 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 3 13:34:48.857743 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 3 13:34:48.857748 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 3 13:34:48.857753 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 3 13:34:48.857761 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 3 13:34:48.857766 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 3 13:34:48.857772 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 3 13:34:48.857777 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 3 13:34:48.857782 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 3 13:34:48.857788 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 3 13:34:48.857793 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 3 13:34:48.857798 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 3 13:34:48.857804 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 3 13:34:48.857811 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 3 13:34:48.857817 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 3 13:34:48.857822 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 3 13:34:48.857827 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 3 13:34:48.857832 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 3 13:34:48.857858 kernel: CPU topo: Max. logical packages: 1
Mar 3 13:34:48.857864 kernel: CPU topo: Max. logical dies: 1
Mar 3 13:34:48.857878 kernel: CPU topo: Max. dies per package: 1
Mar 3 13:34:48.857883 kernel: CPU topo: Max. threads per core: 1
Mar 3 13:34:48.857889 kernel: CPU topo: Num. cores per package: 2
Mar 3 13:34:48.857894 kernel: CPU topo: Num. threads per package: 2
Mar 3 13:34:48.857900 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 3 13:34:48.857907 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 3 13:34:48.857913 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 3 13:34:48.857918 kernel: Booting paravirtualized kernel on KVM
Mar 3 13:34:48.857924 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 3 13:34:48.857930 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 3 13:34:48.857938 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 3 13:34:48.857943 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 3 13:34:48.857948 kernel: pcpu-alloc: [0] 0 1
Mar 3 13:34:48.857954 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 3 13:34:48.857960 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=51ade538e3d3c371f07ae1ec6fa9803fff0566ec060cf4b56dc685fc36d0e01c
Mar 3 13:34:48.857966 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 3 13:34:48.857972 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 3 13:34:48.857977 kernel: Fallback order for Node 0: 0
Mar 3 13:34:48.857985 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792
Mar 3 13:34:48.857990 kernel: Policy zone: Normal
Mar 3 13:34:48.857996 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 3 13:34:48.858001 kernel: software IO TLB: area num 2.
Mar 3 13:34:48.858007 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 3 13:34:48.858012 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 3 13:34:48.858018 kernel: ftrace: allocated 157 pages with 5 groups
Mar 3 13:34:48.858023 kernel: Dynamic Preempt: voluntary
Mar 3 13:34:48.858029 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 3 13:34:48.858038 kernel: rcu: RCU event tracing is enabled.
Mar 3 13:34:48.858043 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 3 13:34:48.858049 kernel: Trampoline variant of Tasks RCU enabled.
Mar 3 13:34:48.858055 kernel: Rude variant of Tasks RCU enabled.
Mar 3 13:34:48.858060 kernel: Tracing variant of Tasks RCU enabled.
Mar 3 13:34:48.858066 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 3 13:34:48.858071 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 3 13:34:48.858077 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 13:34:48.858083 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 13:34:48.858090 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 13:34:48.858096 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 3 13:34:48.858102 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 3 13:34:48.858107 kernel: Console: colour dummy device 80x25
Mar 3 13:34:48.858113 kernel: printk: legacy console [tty0] enabled
Mar 3 13:34:48.858118 kernel: printk: legacy console [ttyS0] enabled
Mar 3 13:34:48.858124 kernel: ACPI: Core revision 20240827
Mar 3 13:34:48.858129 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 3 13:34:48.858135 kernel: APIC: Switch to symmetric I/O mode setup
Mar 3 13:34:48.858143 kernel: x2apic enabled
Mar 3 13:34:48.858148 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 3 13:34:48.858154 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 3 13:34:48.858160 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229833f6470, max_idle_ns: 440795327230 ns
Mar 3 13:34:48.858165 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399996)
Mar 3 13:34:48.858171 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 3 13:34:48.858177 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 3 13:34:48.858182 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 3 13:34:48.858188 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 3 13:34:48.858195 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 3 13:34:48.858201 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 3 13:34:48.858206 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 3 13:34:48.858212 kernel: active return thunk: srso_alias_return_thunk
Mar 3 13:34:48.858218 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 3 13:34:48.858223 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 3 13:34:48.858229 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 3 13:34:48.858234 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 3 13:34:48.858240 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 3 13:34:48.858248 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 3 13:34:48.858253 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 3 13:34:48.858259 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 3 13:34:48.858264 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 3 13:34:48.858270 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 3 13:34:48.858276 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 3 13:34:48.858281 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 3 13:34:48.858287 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 3 13:34:48.858292 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 3 13:34:48.858300 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 3 13:34:48.858305 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 3 13:34:48.858311 kernel: Freeing SMP alternatives memory: 32K
Mar 3 13:34:48.858316 kernel: pid_max: default: 32768 minimum: 301
Mar 3 13:34:48.858322 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 3 13:34:48.858327 kernel: landlock: Up and running.
Mar 3 13:34:48.858333 kernel: SELinux: Initializing.
Mar 3 13:34:48.858338 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 3 13:34:48.858344 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 3 13:34:48.858352 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 3 13:34:48.858361 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 3 13:34:48.858369 kernel: ... version:                0
Mar 3 13:34:48.858377 kernel: ... bit width:              48
Mar 3 13:34:48.858382 kernel: ... generic registers:      6
Mar 3 13:34:48.858388 kernel: ... value mask:             0000ffffffffffff
Mar 3 13:34:48.858394 kernel: ... max period:             00007fffffffffff
Mar 3 13:34:48.858399 kernel: ... fixed-purpose events:   0
Mar 3 13:34:48.858405 kernel: ... event mask:             000000000000003f
Mar 3 13:34:48.858413 kernel: signal: max sigframe size: 3376
Mar 3 13:34:48.858418 kernel: rcu: Hierarchical SRCU implementation.
Mar 3 13:34:48.858424 kernel: rcu: Max phase no-delay instances is 400.
Mar 3 13:34:48.858429 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 3 13:34:48.858435 kernel: smp: Bringing up secondary CPUs ...
Mar 3 13:34:48.858440 kernel: smpboot: x86: Booting SMP configuration:
Mar 3 13:34:48.858446 kernel: .... node #0, CPUs: #1
Mar 3 13:34:48.858451 kernel: smp: Brought up 1 node, 2 CPUs
Mar 3 13:34:48.858457 kernel: smpboot: Total of 2 processors activated (9599.98 BogoMIPS)
Mar 3 13:34:48.858465 kernel: Memory: 3848512K/4091168K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46200K init, 2560K bss, 237020K reserved, 0K cma-reserved)
Mar 3 13:34:48.858470 kernel: devtmpfs: initialized
Mar 3 13:34:48.858476 kernel: x86/mm: Memory block size: 128MB
Mar 3 13:34:48.858483 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 3 13:34:48.858489 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 3 13:34:48.858495 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 3 13:34:48.858500 kernel: pinctrl core: initialized pinctrl subsystem
Mar 3 13:34:48.858506 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 3 13:34:48.858512 kernel: audit: initializing netlink subsys (disabled)
Mar 3 13:34:48.858519 kernel: audit: type=2000 audit(1772544887.196:1): state=initialized audit_enabled=0 res=1
Mar 3 13:34:48.858525 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 3 13:34:48.858530 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 3 13:34:48.858536 kernel: cpuidle: using governor menu
Mar 3 13:34:48.858542 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 3 13:34:48.858547 kernel: dca service started, version 1.12.1
Mar 3 13:34:48.858553 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 3 13:34:48.858558 kernel: PCI: Using configuration type 1 for base access
Mar 3 13:34:48.858564 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 3 13:34:48.858572 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 3 13:34:48.858578 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 3 13:34:48.858584 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 3 13:34:48.858589 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 3 13:34:48.858595 kernel: ACPI: Added _OSI(Module Device)
Mar 3 13:34:48.858600 kernel: ACPI: Added _OSI(Processor Device)
Mar 3 13:34:48.858606 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 3 13:34:48.858612 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 3 13:34:48.858617 kernel: ACPI: Interpreter enabled
Mar 3 13:34:48.858625 kernel: ACPI: PM: (supports S0 S5)
Mar 3 13:34:48.858630 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 3 13:34:48.858636 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 3 13:34:48.858642 kernel: PCI: Using E820 reservations for host bridge windows
Mar 3 13:34:48.858647 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 3 13:34:48.858653 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 3 13:34:48.861073 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 3 13:34:48.861238 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 3 13:34:48.861364 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 3 13:34:48.861373 kernel: PCI host bridge to bus 0000:00
Mar 3 13:34:48.861478 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 3 13:34:48.861570 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 3 13:34:48.861691 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 3 13:34:48.861803 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 3 13:34:48.863279 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 3 13:34:48.863406 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 3 13:34:48.863508 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 3 13:34:48.863624 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 3 13:34:48.863738 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Mar 3 13:34:48.864932 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Mar 3 13:34:48.865080 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 3 13:34:48.865194 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff]
Mar 3 13:34:48.865294 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 3 13:34:48.865392 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 3 13:34:48.865501 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.865602 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff]
Mar 3 13:34:48.865699 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 3 13:34:48.869482 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 3 13:34:48.869598 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 3 13:34:48.869706 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.869804 kernel: pci 0000:00:02.1: BAR 0 [mem 0x81388000-0x81388fff]
Mar 3 13:34:48.869939 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 3 13:34:48.870039 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 3 13:34:48.870143 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.870245 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff]
Mar 3 13:34:48.870341 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 3 13:34:48.870452 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 3 13:34:48.870560 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 3 13:34:48.870666 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.870763 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff]
Mar 3 13:34:48.870908 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 3 13:34:48.871009 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 3 13:34:48.871116 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.871213 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff]
Mar 3 13:34:48.871310 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 3 13:34:48.871406 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 3 13:34:48.871501 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 3 13:34:48.871638 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.871739 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff]
Mar 3 13:34:48.871901 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 3 13:34:48.872017 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 3 13:34:48.872122 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 3 13:34:48.872239 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.872338 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff]
Mar 3 13:34:48.872459 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 3 13:34:48.872567 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 3 13:34:48.872669 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 3 13:34:48.872776 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.874409 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff]
Mar 3 13:34:48.874518 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 3 13:34:48.874629 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 3 13:34:48.874732 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 3 13:34:48.874898 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:48.875003 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff]
Mar 3 13:34:48.875100 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 3 13:34:48.875198 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 3 13:34:48.875305 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 3 13:34:48.875427 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 3 13:34:48.875530 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 3 13:34:48.875648 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 3 13:34:48.875747 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f]
Mar 3 13:34:48.877910 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff]
Mar 3 13:34:48.878032 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 3 13:34:48.878134 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f]
Mar 3 13:34:48.878246 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 3 13:34:48.878352 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff]
Mar 3 13:34:48.878453 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 3 13:34:48.878554 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 3 13:34:48.878652 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 3 13:34:48.878760 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 3 13:34:48.878891 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit]
Mar 3 13:34:48.878990 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 3 13:34:48.879102 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 3 13:34:48.879205 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff]
Mar 3 13:34:48.879305 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 3 13:34:48.879403 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 3 13:34:48.879512 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 3 13:34:48.879613 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 3 13:34:48.879718 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 3 13:34:48.881078 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 3 13:34:48.881229 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff]
Mar 3 13:34:48.881356 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 3 13:34:48.881476 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 3 13:34:48.881591 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 3 13:34:48.881693 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff]
Mar 3 13:34:48.881795 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 3 13:34:48.881936 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 3 13:34:48.881945 kernel: acpiphp: Slot [0] registered
Mar 3 13:34:48.882053 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 3 13:34:48.882155 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff]
Mar 3 13:34:48.882255 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 3 13:34:48.882356 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 3 13:34:48.882454 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 3 13:34:48.882465 kernel: acpiphp: Slot [0-2] registered
Mar 3 13:34:48.882593 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 3 13:34:48.882604 kernel: acpiphp: Slot [0-3] registered
Mar 3 13:34:48.882708 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 3 13:34:48.882719 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 3 13:34:48.882738 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 3 13:34:48.882746 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 3 13:34:48.882752 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 3 13:34:48.882760 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 3 13:34:48.882766 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 3 13:34:48.882772 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 3 13:34:48.882778 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 3 13:34:48.882783 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 3 13:34:48.882789 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 3 13:34:48.882795 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 3 13:34:48.882801 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 3 13:34:48.882807 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 3 13:34:48.882815 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 3 13:34:48.882821 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 3 13:34:48.882829 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 3 13:34:48.884879 kernel: iommu: Default domain type: Translated
Mar 3 13:34:48.884891 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 3 13:34:48.884897 kernel: efivars: Registered efivars operations
Mar 3 13:34:48.884906 kernel: PCI: Using ACPI for IRQ routing
Mar 3 13:34:48.884911 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 3 13:34:48.884918 kernel: e820: reserve RAM buffer [mem 0x7dc01018-0x7fffffff]
Mar 3 13:34:48.884924 kernel: e820: reserve RAM buffer [mem 0x7df6f018-0x7fffffff]
Mar 3 13:34:48.884930 kernel: e820: reserve RAM buffer [mem 0x7dfab018-0x7fffffff]
Mar 3 13:34:48.884936 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 3 13:34:48.884941 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 3 13:34:48.884948 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 3 13:34:48.884955 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 3 13:34:48.885081 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 3 13:34:48.885183 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 3 13:34:48.885281 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 3 13:34:48.885288 kernel: vgaarb: loaded
Mar 3 13:34:48.885294 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 3 13:34:48.885300 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 3 13:34:48.885306 kernel: clocksource: Switched to clocksource kvm-clock
Mar 3 13:34:48.885312 kernel: VFS: Disk quotas dquot_6.6.0
Mar 3 13:34:48.885321 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 3 13:34:48.885327 kernel: pnp: PnP ACPI init
Mar 3 13:34:48.885435 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Mar 3 13:34:48.885444 kernel: pnp: PnP ACPI: found 5 devices
Mar 3 13:34:48.885450 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 3 13:34:48.885456 kernel: NET: Registered PF_INET protocol family
Mar 3 13:34:48.885462 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 3 13:34:48.885468 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 3 13:34:48.885476 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 3 13:34:48.885482 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 3 13:34:48.885488 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 3 13:34:48.885494 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 3 13:34:48.885500 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 3 13:34:48.885506 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 3 13:34:48.885511 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 3 13:34:48.885517 kernel: NET: Registered PF_XDP protocol family
Mar 3 13:34:48.886888 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Mar 3 13:34:48.887019 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Mar 3 13:34:48.887122 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 3 13:34:48.887223 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 3 13:34:48.887323 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 3 13:34:48.887421 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Mar 3 13:34:48.887519 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Mar 3 13:34:48.887651 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Mar 3 13:34:48.887779 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned
Mar 3
13:34:48.892895 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Mar 3 13:34:48.893011 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Mar 3 13:34:48.893110 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 3 13:34:48.893218 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Mar 3 13:34:48.893336 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Mar 3 13:34:48.893437 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Mar 3 13:34:48.893534 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Mar 3 13:34:48.893630 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 3 13:34:48.893729 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Mar 3 13:34:48.893832 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 3 13:34:48.893955 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Mar 3 13:34:48.894052 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Mar 3 13:34:48.894148 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 3 13:34:48.894247 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Mar 3 13:34:48.894362 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Mar 3 13:34:48.894472 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 3 13:34:48.894577 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Mar 3 13:34:48.894675 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Mar 3 13:34:48.894776 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Mar 3 13:34:48.894896 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Mar 3 13:34:48.894994 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 3 13:34:48.895092 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Mar 3 13:34:48.895190 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Mar 3 
13:34:48.895292 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Mar 3 13:34:48.895389 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 3 13:34:48.895489 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Mar 3 13:34:48.895586 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Mar 3 13:34:48.895682 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Mar 3 13:34:48.895788 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 3 13:34:48.895917 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 3 13:34:48.896009 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 3 13:34:48.896098 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 3 13:34:48.896192 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Mar 3 13:34:48.896282 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Mar 3 13:34:48.896371 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Mar 3 13:34:48.896475 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Mar 3 13:34:48.896571 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 3 13:34:48.896677 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Mar 3 13:34:48.896784 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Mar 3 13:34:48.896951 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 3 13:34:48.897098 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 3 13:34:48.897202 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff] Mar 3 13:34:48.897299 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 3 13:34:48.897401 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Mar 3 13:34:48.897499 kernel: pci_bus 0000:06: resource 2 [mem 
0xc060400000-0xc0604fffff 64bit pref] Mar 3 13:34:48.897599 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Mar 3 13:34:48.897693 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Mar 3 13:34:48.897787 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 3 13:34:48.897908 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Mar 3 13:34:48.898004 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Mar 3 13:34:48.898102 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 3 13:34:48.898209 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Mar 3 13:34:48.898303 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Mar 3 13:34:48.898397 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 3 13:34:48.898405 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 3 13:34:48.898411 kernel: PCI: CLS 0 bytes, default 64 Mar 3 13:34:48.898418 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 3 13:34:48.898424 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 3 13:34:48.898430 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229833f6470, max_idle_ns: 440795327230 ns Mar 3 13:34:48.898438 kernel: Initialise system trusted keyrings Mar 3 13:34:48.898445 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 3 13:34:48.898451 kernel: Key type asymmetric registered Mar 3 13:34:48.898456 kernel: Asymmetric key parser 'x509' registered Mar 3 13:34:48.898462 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 3 13:34:48.898468 kernel: io scheduler mq-deadline registered Mar 3 13:34:48.898474 kernel: io scheduler kyber registered Mar 3 13:34:48.898480 kernel: io scheduler bfq registered Mar 3 13:34:48.898582 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 3 13:34:48.898683 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 
24 Mar 3 13:34:48.898782 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 3 13:34:48.898921 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 3 13:34:48.899023 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 3 13:34:48.899123 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 3 13:34:48.899222 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 3 13:34:48.899320 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 3 13:34:48.899418 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 3 13:34:48.899519 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 3 13:34:48.899617 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 3 13:34:48.899718 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 3 13:34:48.899816 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 3 13:34:48.903022 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 3 13:34:48.903137 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 3 13:34:48.903236 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 3 13:34:48.903249 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 3 13:34:48.903349 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 3 13:34:48.903447 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 3 13:34:48.903454 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 3 13:34:48.903460 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 3 13:34:48.903466 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 3 13:34:48.903473 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 3 13:34:48.903481 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 3 13:34:48.903487 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 3 13:34:48.903493 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 3 13:34:48.903499 kernel: input: AT Translated Set 2 keyboard as 
/devices/platform/i8042/serio0/input/input0 Mar 3 13:34:48.903606 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 3 13:34:48.903700 kernel: rtc_cmos 00:03: registered as rtc0 Mar 3 13:34:48.903793 kernel: rtc_cmos 00:03: setting system clock to 2026-03-03T13:34:48 UTC (1772544888) Mar 3 13:34:48.903918 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 3 13:34:48.903930 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. Mar 3 13:34:48.903937 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 3 13:34:48.903943 kernel: efifb: probing for efifb Mar 3 13:34:48.903949 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Mar 3 13:34:48.903955 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 3 13:34:48.903961 kernel: efifb: scrolling: redraw Mar 3 13:34:48.903967 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 3 13:34:48.903973 kernel: Console: switching to colour frame buffer device 160x50 Mar 3 13:34:48.903979 kernel: fb0: EFI VGA frame buffer device Mar 3 13:34:48.903987 kernel: pstore: Using crash dump compression: deflate Mar 3 13:34:48.903993 kernel: pstore: Registered efi_pstore as persistent store backend Mar 3 13:34:48.903999 kernel: NET: Registered PF_INET6 protocol family Mar 3 13:34:48.904005 kernel: Segment Routing with IPv6 Mar 3 13:34:48.904011 kernel: In-situ OAM (IOAM) with IPv6 Mar 3 13:34:48.904017 kernel: NET: Registered PF_PACKET protocol family Mar 3 13:34:48.904023 kernel: Key type dns_resolver registered Mar 3 13:34:48.904029 kernel: IPI shorthand broadcast: enabled Mar 3 13:34:48.904034 kernel: sched_clock: Marking stable (2879006395, 278615544)->(3197815347, -40193408) Mar 3 13:34:48.904043 kernel: registered taskstats version 1 Mar 3 13:34:48.904048 kernel: Loading compiled-in X.509 certificates Mar 3 13:34:48.904054 kernel: Loaded X.509 cert 'Kinvolk GmbH: 
Module signing key for 6.12.74-flatcar: bf135b2a3d3664cc6742f4e1848867384c1e52f1' Mar 3 13:34:48.904060 kernel: Demotion targets for Node 0: null Mar 3 13:34:48.904066 kernel: Key type .fscrypt registered Mar 3 13:34:48.904072 kernel: Key type fscrypt-provisioning registered Mar 3 13:34:48.904078 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 3 13:34:48.904084 kernel: ima: Allocated hash algorithm: sha1 Mar 3 13:34:48.904092 kernel: ima: No architecture policies found Mar 3 13:34:48.904098 kernel: clk: Disabling unused clocks Mar 3 13:34:48.904104 kernel: Warning: unable to open an initial console. Mar 3 13:34:48.904111 kernel: Freeing unused kernel image (initmem) memory: 46200K Mar 3 13:34:48.904117 kernel: Write protecting the kernel read-only data: 40960k Mar 3 13:34:48.904123 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 3 13:34:48.904129 kernel: Run /init as init process Mar 3 13:34:48.904135 kernel: with arguments: Mar 3 13:34:48.904141 kernel: /init Mar 3 13:34:48.904147 kernel: with environment: Mar 3 13:34:48.904155 kernel: HOME=/ Mar 3 13:34:48.904161 kernel: TERM=linux Mar 3 13:34:48.904168 systemd[1]: Successfully made /usr/ read-only. Mar 3 13:34:48.904177 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 3 13:34:48.904184 systemd[1]: Detected virtualization kvm. Mar 3 13:34:48.904190 systemd[1]: Detected architecture x86-64. Mar 3 13:34:48.904196 systemd[1]: Running in initrd. Mar 3 13:34:48.904204 systemd[1]: No hostname configured, using default hostname. Mar 3 13:34:48.904211 systemd[1]: Hostname set to . Mar 3 13:34:48.904217 systemd[1]: Initializing machine ID from VM UUID. 
Mar 3 13:34:48.904223 systemd[1]: Queued start job for default target initrd.target. Mar 3 13:34:48.904229 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 3 13:34:48.904236 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 3 13:34:48.904243 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 3 13:34:48.904249 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 3 13:34:48.904257 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 3 13:34:48.904264 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 3 13:34:48.904271 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 3 13:34:48.904278 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 3 13:34:48.904284 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 3 13:34:48.904290 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 3 13:34:48.904297 systemd[1]: Reached target paths.target - Path Units. Mar 3 13:34:48.904305 systemd[1]: Reached target slices.target - Slice Units. Mar 3 13:34:48.904311 systemd[1]: Reached target swap.target - Swaps. Mar 3 13:34:48.904317 systemd[1]: Reached target timers.target - Timer Units. Mar 3 13:34:48.904324 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 3 13:34:48.904330 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 3 13:34:48.904336 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 3 13:34:48.904342 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Mar 3 13:34:48.904349 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 3 13:34:48.904357 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 3 13:34:48.904363 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 3 13:34:48.904369 systemd[1]: Reached target sockets.target - Socket Units. Mar 3 13:34:48.904376 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 3 13:34:48.904382 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 3 13:34:48.904388 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 3 13:34:48.904395 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 3 13:34:48.904401 systemd[1]: Starting systemd-fsck-usr.service... Mar 3 13:34:48.904407 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 3 13:34:48.904416 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 3 13:34:48.904422 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:34:48.904428 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 3 13:34:48.904435 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 3 13:34:48.904444 systemd[1]: Finished systemd-fsck-usr.service. Mar 3 13:34:48.904450 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 3 13:34:48.904476 systemd-journald[198]: Collecting audit messages is disabled. Mar 3 13:34:48.904492 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:48.904501 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Mar 3 13:34:48.904507 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 3 13:34:48.904514 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 3 13:34:48.904520 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 3 13:34:48.904526 kernel: Bridge firewalling registered Mar 3 13:34:48.904532 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 3 13:34:48.904539 systemd-journald[198]: Journal started Mar 3 13:34:48.904559 systemd-journald[198]: Runtime Journal (/run/log/journal/54244fe9e698483f881a5272dd7c4c6d) is 8M, max 76.1M, 68.1M free. Mar 3 13:34:48.851191 systemd-modules-load[200]: Inserted module 'overlay' Mar 3 13:34:48.895804 systemd-modules-load[200]: Inserted module 'br_netfilter' Mar 3 13:34:48.914355 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 3 13:34:48.917883 systemd[1]: Started systemd-journald.service - Journal Service. Mar 3 13:34:48.925957 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 3 13:34:48.927414 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 3 13:34:48.931960 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 3 13:34:48.933995 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 3 13:34:48.934981 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 3 13:34:48.944653 systemd-tmpfiles[234]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 3 13:34:48.950146 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 3 13:34:48.951006 dracut-cmdline[233]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=51ade538e3d3c371f07ae1ec6fa9803fff0566ec060cf4b56dc685fc36d0e01c Mar 3 13:34:48.953984 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 3 13:34:48.992597 systemd-resolved[252]: Positive Trust Anchors: Mar 3 13:34:48.993201 systemd-resolved[252]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 3 13:34:48.993224 systemd-resolved[252]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 3 13:34:48.996864 systemd-resolved[252]: Defaulting to hostname 'linux'. Mar 3 13:34:48.997821 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 3 13:34:48.999718 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 3 13:34:49.036879 kernel: SCSI subsystem initialized Mar 3 13:34:49.045880 kernel: Loading iSCSI transport class v2.0-870. 
Mar 3 13:34:49.054895 kernel: iscsi: registered transport (tcp) Mar 3 13:34:49.072964 kernel: iscsi: registered transport (qla4xxx) Mar 3 13:34:49.073043 kernel: QLogic iSCSI HBA Driver Mar 3 13:34:49.095798 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 3 13:34:49.110138 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 3 13:34:49.113566 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 3 13:34:49.155039 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 3 13:34:49.157236 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 3 13:34:49.201886 kernel: raid6: avx512x4 gen() 40234 MB/s Mar 3 13:34:49.220901 kernel: raid6: avx512x2 gen() 42949 MB/s Mar 3 13:34:49.238898 kernel: raid6: avx512x1 gen() 36385 MB/s Mar 3 13:34:49.256893 kernel: raid6: avx2x4 gen() 42117 MB/s Mar 3 13:34:49.274918 kernel: raid6: avx2x2 gen() 45602 MB/s Mar 3 13:34:49.294024 kernel: raid6: avx2x1 gen() 36008 MB/s Mar 3 13:34:49.294092 kernel: raid6: using algorithm avx2x2 gen() 45602 MB/s Mar 3 13:34:49.314127 kernel: raid6: .... xor() 35768 MB/s, rmw enabled Mar 3 13:34:49.314210 kernel: raid6: using avx512x2 recovery algorithm Mar 3 13:34:49.330879 kernel: xor: automatically using best checksumming function avx Mar 3 13:34:49.445872 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 3 13:34:49.452758 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 3 13:34:49.454642 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 3 13:34:49.480717 systemd-udevd[448]: Using default interface naming scheme 'v255'. Mar 3 13:34:49.486230 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 3 13:34:49.488791 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 3 13:34:49.507464 dracut-pre-trigger[456]: rd.md=0: removing MD RAID activation Mar 3 13:34:49.531045 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 3 13:34:49.532989 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 3 13:34:49.610389 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 3 13:34:49.612357 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 3 13:34:49.671865 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Mar 3 13:34:49.695064 kernel: ACPI: bus type USB registered Mar 3 13:34:49.695118 kernel: usbcore: registered new interface driver usbfs Mar 3 13:34:49.696827 kernel: usbcore: registered new interface driver hub Mar 3 13:34:49.698856 kernel: usbcore: registered new device driver usb Mar 3 13:34:49.711883 kernel: cryptd: max_cpu_qlen set to 1000 Mar 3 13:34:49.732613 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 13:34:49.732909 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:49.734256 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:34:49.738049 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:34:49.751298 kernel: scsi host0: Virtio SCSI HBA Mar 3 13:34:49.751730 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 3 13:34:49.751754 kernel: libata version 3.00 loaded. Mar 3 13:34:49.762192 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 13:34:49.762346 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:49.767101 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Mar 3 13:34:49.767239 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 3 13:34:49.787324 kernel: AES CTR mode by8 optimization enabled Mar 3 13:34:49.787373 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 3 13:34:49.796461 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 3 13:34:49.800565 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 3 13:34:49.802324 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:49.806986 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 3 13:34:49.811400 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 3 13:34:49.811585 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 3 13:34:49.811864 kernel: sd 0:0:0:0: Power-on or device reset occurred Mar 3 13:34:49.818915 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Mar 3 13:34:49.819113 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 3 13:34:49.820914 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Mar 3 13:34:49.824873 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 3 13:34:49.825038 kernel: hub 1-0:1.0: USB hub found Mar 3 13:34:49.825178 kernel: ahci 0000:00:1f.2: version 3.0 Mar 3 13:34:49.826862 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 3 13:34:49.830900 kernel: hub 1-0:1.0: 4 ports detected Mar 3 13:34:49.838240 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 3 13:34:49.838315 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 3 13:34:49.840098 kernel: hub 2-0:1.0: USB hub found Mar 3 13:34:49.840336 kernel: GPT:17805311 != 160006143 Mar 3 13:34:49.841913 kernel: hub 2-0:1.0: 4 ports detected Mar 3 13:34:49.842113 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 3 13:34:49.846915 kernel: GPT:17805311 != 160006143 Mar 3 13:34:49.853148 kernel: GPT: Use GNU Parted to correct GPT errors. 
Mar 3 13:34:49.853185 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Mar 3 13:34:49.853415 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 3 13:34:49.853429 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Mar 3 13:34:49.861636 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 3 13:34:49.861877 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 3 13:34:49.871875 kernel: scsi host1: ahci Mar 3 13:34:49.874532 kernel: scsi host2: ahci Mar 3 13:34:49.874697 kernel: scsi host3: ahci Mar 3 13:34:49.877324 kernel: scsi host4: ahci Mar 3 13:34:49.877491 kernel: scsi host5: ahci Mar 3 13:34:49.877615 kernel: scsi host6: ahci Mar 3 13:34:49.879021 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 51 lpm-pol 1 Mar 3 13:34:49.885168 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 51 lpm-pol 1 Mar 3 13:34:49.885195 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 51 lpm-pol 1 Mar 3 13:34:49.888546 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 51 lpm-pol 1 Mar 3 13:34:49.893670 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 51 lpm-pol 1 Mar 3 13:34:49.893697 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 51 lpm-pol 1 Mar 3 13:34:49.915205 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 3 13:34:49.924525 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Mar 3 13:34:49.940188 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 3 13:34:49.946049 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 3 13:34:49.946410 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Mar 3 13:34:49.948659 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 3 13:34:49.966436 disk-uuid[646]: Primary Header is updated.
Mar 3 13:34:49.966436 disk-uuid[646]: Secondary Entries is updated.
Mar 3 13:34:49.966436 disk-uuid[646]: Secondary Header is updated.
Mar 3 13:34:49.977874 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 3 13:34:49.992862 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 3 13:34:50.080916 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 3 13:34:50.203875 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 3 13:34:50.210862 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 3 13:34:50.210919 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 3 13:34:50.219424 kernel: ata1.00: LPM support broken, forcing max_power
Mar 3 13:34:50.219448 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 3 13:34:50.219458 kernel: ata1.00: applying bridge limits
Mar 3 13:34:50.224894 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 3 13:34:50.229112 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 3 13:34:50.229175 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 3 13:34:50.238278 kernel: ata1.00: LPM support broken, forcing max_power
Mar 3 13:34:50.238350 kernel: ata1.00: configured for UDMA/100
Mar 3 13:34:50.240866 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 3 13:34:50.240889 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 3 13:34:50.271471 kernel: usbcore: registered new interface driver usbhid
Mar 3 13:34:50.271521 kernel: usbhid: USB HID core driver
Mar 3 13:34:50.277894 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Mar 3 13:34:50.281881 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 3 13:34:50.297270 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 3 13:34:50.297579 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 3 13:34:50.324888 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 3 13:34:50.683464 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 3 13:34:50.685192 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 3 13:34:50.686159 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 13:34:50.686618 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 3 13:34:50.689005 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 3 13:34:50.723446 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 3 13:34:50.997036 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 3 13:34:50.999287 disk-uuid[647]: The operation has completed successfully.
Mar 3 13:34:51.060162 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 3 13:34:51.060258 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 3 13:34:51.088878 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 3 13:34:51.110759 sh[679]: Success
Mar 3 13:34:51.134107 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 3 13:34:51.134187 kernel: device-mapper: uevent: version 1.0.3
Mar 3 13:34:51.135929 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 3 13:34:51.147891 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 3 13:34:51.189901 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 3 13:34:51.192029 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 3 13:34:51.200303 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
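The verity-setup lines above show dm-verity bringing up the read-only /usr device with sha256 block hashing ("verity: sha256 using shash"). As a rough illustration of the underlying idea only (not the kernel's actual hash-tree code; block size and helper names are assumptions), each fixed-size data block is hashed and checked against a recorded digest, so a single flipped byte is detected:

```python
import hashlib

BLOCK_SIZE = 4096  # assumed data block size, dm-verity's common default

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block, like the bottom layer of a verity hash tree."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]

def verify(data: bytes, expected: list[str]) -> bool:
    """Reject the image if any block's hash deviates from the recorded one."""
    return block_hashes(data) == expected

# A pristine image verifies; the same image with one flipped byte does not.
image = bytes(2 * BLOCK_SIZE)
recorded = block_hashes(image)
tampered = b"\x01" + image[1:]
```

The real device uses a Merkle tree rooted in the `verity.usrhash=` value from the kernel command line rather than a flat hash list, but the per-block comparison is the same principle.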
Mar 3 13:34:51.212887 kernel: BTRFS: device fsid f550cb98-648e-4600-9237-4b15eb09827b devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (691)
Mar 3 13:34:51.212927 kernel: BTRFS info (device dm-0): first mount of filesystem f550cb98-648e-4600-9237-4b15eb09827b
Mar 3 13:34:51.217555 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:51.229904 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations
Mar 3 13:34:51.229935 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 3 13:34:51.229948 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 3 13:34:51.234514 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 3 13:34:51.235625 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 3 13:34:51.236262 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 3 13:34:51.237068 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 3 13:34:51.241949 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 3 13:34:51.268895 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (724)
Mar 3 13:34:51.274386 kernel: BTRFS info (device sda6): first mount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:51.274418 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:51.280864 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 13:34:51.280890 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 13:34:51.284192 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 13:34:51.290902 kernel: BTRFS info (device sda6): last unmount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:51.291473 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 3 13:34:51.293965 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 3 13:34:51.349115 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 3 13:34:51.351955 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 3 13:34:51.385312 ignition[795]: Ignition 2.22.0
Mar 3 13:34:51.385957 ignition[795]: Stage: fetch-offline
Mar 3 13:34:51.386369 ignition[795]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:51.386723 ignition[795]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:51.387150 ignition[795]: parsed url from cmdline: ""
Mar 3 13:34:51.387154 ignition[795]: no config URL provided
Mar 3 13:34:51.387159 ignition[795]: reading system config file "/usr/lib/ignition/user.ign"
Mar 3 13:34:51.387168 ignition[795]: no config at "/usr/lib/ignition/user.ign"
Mar 3 13:34:51.387173 ignition[795]: failed to fetch config: resource requires networking
Mar 3 13:34:51.387280 ignition[795]: Ignition finished successfully
Mar 3 13:34:51.390906 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 3 13:34:51.397258 systemd-networkd[862]: lo: Link UP
Mar 3 13:34:51.397799 systemd-networkd[862]: lo: Gained carrier
Mar 3 13:34:51.400186 systemd-networkd[862]: Enumeration completed
Mar 3 13:34:51.400249 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 3 13:34:51.400732 systemd[1]: Reached target network.target - Network.
Mar 3 13:34:51.401101 systemd-networkd[862]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:51.401106 systemd-networkd[862]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 13:34:51.402855 systemd-networkd[862]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:51.402859 systemd-networkd[862]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 13:34:51.403580 systemd-networkd[862]: eth0: Link UP
Mar 3 13:34:51.403724 systemd-networkd[862]: eth1: Link UP
Mar 3 13:34:51.403898 systemd-networkd[862]: eth0: Gained carrier
Mar 3 13:34:51.403906 systemd-networkd[862]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:51.405951 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 3 13:34:51.407979 systemd-networkd[862]: eth1: Gained carrier
Mar 3 13:34:51.407987 systemd-networkd[862]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:51.428792 ignition[870]: Ignition 2.22.0
Mar 3 13:34:51.428866 ignition[870]: Stage: fetch
Mar 3 13:34:51.428974 ignition[870]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:51.428982 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:51.429039 ignition[870]: parsed url from cmdline: ""
Mar 3 13:34:51.429042 ignition[870]: no config URL provided
Mar 3 13:34:51.429046 ignition[870]: reading system config file "/usr/lib/ignition/user.ign"
Mar 3 13:34:51.429053 ignition[870]: no config at "/usr/lib/ignition/user.ign"
Mar 3 13:34:51.429074 ignition[870]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 3 13:34:51.429210 ignition[870]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 3 13:34:51.452921 systemd-networkd[862]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 3 13:34:51.470888 systemd-networkd[862]: eth0: DHCPv4 address 95.217.157.231/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 3 13:34:51.629475 ignition[870]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 3 13:34:51.634135 ignition[870]: GET result: OK
Mar 3 13:34:51.634641 ignition[870]: parsing config with SHA512: 55e5a9512ef4f45fca285f5edb4ca62d9b7d963a8204791d318db815f9662a45e1f176419171681b92f0453d78769a87da16abbbd678ef17ad24fa5e3c14e8f1
Mar 3 13:34:51.639140 unknown[870]: fetched base config from "system"
Mar 3 13:34:51.639164 unknown[870]: fetched base config from "system"
Mar 3 13:34:51.639541 ignition[870]: fetch: fetch complete
Mar 3 13:34:51.639173 unknown[870]: fetched user config from "hetzner"
Mar 3 13:34:51.639550 ignition[870]: fetch: fetch passed
Mar 3 13:34:51.639617 ignition[870]: Ignition finished successfully
Mar 3 13:34:51.643783 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 3 13:34:51.646102 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 3 13:34:51.694561 ignition[877]: Ignition 2.22.0
Mar 3 13:34:51.694574 ignition[877]: Stage: kargs
Mar 3 13:34:51.694696 ignition[877]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:51.694704 ignition[877]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:51.695297 ignition[877]: kargs: kargs passed
Mar 3 13:34:51.695336 ignition[877]: Ignition finished successfully
Mar 3 13:34:51.698495 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 3 13:34:51.700037 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 3 13:34:51.728192 ignition[884]: Ignition 2.22.0
Mar 3 13:34:51.728209 ignition[884]: Stage: disks
Mar 3 13:34:51.728318 ignition[884]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:51.728330 ignition[884]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:51.728888 ignition[884]: disks: disks passed
Mar 3 13:34:51.730519 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 3 13:34:51.728928 ignition[884]: Ignition finished successfully
Mar 3 13:34:51.731340 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
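The fetch stage above fails its first GET with "network is unreachable", then succeeds on attempt #2 once DHCP has configured the interfaces. A minimal sketch of that retry-until-the-network-is-up pattern (not Ignition's actual Go implementation; the stub below stands in for the GET against the Hetzner metadata endpoint):

```python
import time

def fetch_with_retry(fetch, attempts=3, delay_s=0.0):
    """Call a fetch callable until it succeeds, retrying on network
    errors the way the log shows attempt #1 / attempt #2; re-raise
    the last error once attempts are exhausted."""
    last_exc = None
    for _ in range(attempts):
        try:
            return fetch()
        except OSError as exc:  # e.g. "connect: network is unreachable"
            last_exc = exc
            time.sleep(delay_s)
    raise last_exc

# Stub for GET http://169.254.169.254/hetzner/v1/userdata:
# fails once (pre-DHCP), then succeeds.
calls = {"n": 0}
def stub_fetch():
    calls["n"] += 1
    if calls["n"] == 1:
        raise OSError("connect: network is unreachable")
    return "result: OK"
```

Note the log also records the SHA512 fingerprint of the fetched config before parsing; real Ignition additionally merges the "system" base config with the "hetzner" user config, which this sketch does not model.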
Mar 3 13:34:51.731894 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 3 13:34:51.732503 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 3 13:34:51.733117 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 3 13:34:51.733721 systemd[1]: Reached target basic.target - Basic System.
Mar 3 13:34:51.735263 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 3 13:34:51.761222 systemd-fsck[893]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 3 13:34:51.764388 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 3 13:34:51.766154 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 3 13:34:51.872895 kernel: EXT4-fs (sda9): mounted filesystem f0c751de-febc-4e57-b330-c926d38ed5ec r/w with ordered data mode. Quota mode: none.
Mar 3 13:34:51.873406 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 3 13:34:51.874333 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 3 13:34:51.876323 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 3 13:34:51.878913 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 3 13:34:51.880111 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 3 13:34:51.880525 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 3 13:34:51.880552 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 3 13:34:51.891654 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 3 13:34:51.894149 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 3 13:34:51.907655 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (901)
Mar 3 13:34:51.907684 kernel: BTRFS info (device sda6): first mount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:51.911858 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:51.922180 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 13:34:51.922207 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 13:34:51.922217 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 13:34:51.926215 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 3 13:34:51.946667 coreos-metadata[903]: Mar 03 13:34:51.946 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 3 13:34:51.948162 coreos-metadata[903]: Mar 03 13:34:51.948 INFO Fetch successful
Mar 3 13:34:51.949373 coreos-metadata[903]: Mar 03 13:34:51.949 INFO wrote hostname ci-4459-2-4-7-599052a073 to /sysroot/etc/hostname
Mar 3 13:34:51.951205 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 3 13:34:51.954377 initrd-setup-root[929]: cut: /sysroot/etc/passwd: No such file or directory
Mar 3 13:34:51.959826 initrd-setup-root[936]: cut: /sysroot/etc/group: No such file or directory
Mar 3 13:34:51.964697 initrd-setup-root[943]: cut: /sysroot/etc/shadow: No such file or directory
Mar 3 13:34:51.969755 initrd-setup-root[950]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 3 13:34:52.062880 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 3 13:34:52.064576 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 3 13:34:52.066110 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 3 13:34:52.081886 kernel: BTRFS info (device sda6): last unmount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:52.096493 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
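The coreos-metadata lines above fetch the hostname from the metadata service and persist it into the target root ("wrote hostname ... to /sysroot/etc/hostname"). A minimal sketch of that write step only, under stated assumptions (the fetch is stubbed out, and a temporary directory stands in for /sysroot):

```python
import pathlib
import tempfile

def write_hostname(sysroot: str, hostname: str) -> pathlib.Path:
    """Persist a fetched hostname to <sysroot>/etc/hostname,
    mirroring what the flatcar-metadata-hostname agent logs."""
    path = pathlib.Path(sysroot, "etc", "hostname")
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(hostname + "\n")
    return path

# Use a temp dir in place of /sysroot; the hostname value is the one from the log.
with tempfile.TemporaryDirectory() as root:
    written = write_hostname(root, "ci-4459-2-4-7-599052a073")
    content = written.read_text()
```

The real agent runs before initrd-setup-root so that the hostname file exists by the time the root filesystem setup copies account databases into place.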
Mar 3 13:34:52.106610 ignition[1019]: INFO : Ignition 2.22.0
Mar 3 13:34:52.106610 ignition[1019]: INFO : Stage: mount
Mar 3 13:34:52.107585 ignition[1019]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:52.107585 ignition[1019]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:52.107585 ignition[1019]: INFO : mount: mount passed
Mar 3 13:34:52.107585 ignition[1019]: INFO : Ignition finished successfully
Mar 3 13:34:52.108741 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 3 13:34:52.110924 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 3 13:34:52.210662 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 3 13:34:52.212411 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 3 13:34:52.232875 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1029)
Mar 3 13:34:52.238998 kernel: BTRFS info (device sda6): first mount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:52.239036 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:52.245298 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 13:34:52.245333 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 13:34:52.245343 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 13:34:52.248909 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 3 13:34:52.277427 ignition[1045]: INFO : Ignition 2.22.0
Mar 3 13:34:52.277427 ignition[1045]: INFO : Stage: files
Mar 3 13:34:52.278323 ignition[1045]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:52.278323 ignition[1045]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:52.278323 ignition[1045]: DEBUG : files: compiled without relabeling support, skipping
Mar 3 13:34:52.279280 ignition[1045]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 3 13:34:52.279280 ignition[1045]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 3 13:34:52.281963 ignition[1045]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 3 13:34:52.282462 ignition[1045]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 3 13:34:52.283284 unknown[1045]: wrote ssh authorized keys file for user: core
Mar 3 13:34:52.283808 ignition[1045]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 3 13:34:52.285056 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 3 13:34:52.285647 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 3 13:34:52.504034 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 3 13:34:52.700182 systemd-networkd[862]: eth1: Gained IPv6LL
Mar 3 13:34:52.816606 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 3 13:34:52.816606 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 3 13:34:52.816606 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 3 13:34:52.818739 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 3 13:34:52.818739 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 3 13:34:52.818739 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 3 13:34:52.818739 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 3 13:34:52.818739 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 3 13:34:52.818739 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 3 13:34:52.820995 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 3 13:34:52.820995 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 3 13:34:52.820995 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 3 13:34:52.822506 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 3 13:34:52.822506 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 3 13:34:52.822506 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1
Mar 3 13:34:53.084049 systemd-networkd[862]: eth0: Gained IPv6LL
Mar 3 13:34:53.486157 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 3 13:34:56.680148 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 3 13:34:56.680148 ignition[1045]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 3 13:34:56.683703 ignition[1045]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 3 13:34:56.688296 ignition[1045]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 3 13:34:56.688296 ignition[1045]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 3 13:34:56.688296 ignition[1045]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 3 13:34:56.690808 ignition[1045]: INFO : files: files passed
Mar 3 13:34:56.690808 ignition[1045]: INFO : Ignition finished successfully
Mar 3 13:34:56.692479 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 3 13:34:56.696100 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 3 13:34:56.700526 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 3 13:34:56.715534 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 3 13:34:56.715722 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 3 13:34:56.721820 initrd-setup-root-after-ignition[1075]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 13:34:56.721820 initrd-setup-root-after-ignition[1075]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 13:34:56.725365 initrd-setup-root-after-ignition[1079]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 13:34:56.727785 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 3 13:34:56.729799 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 3 13:34:56.732750 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 3 13:34:56.784891 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 3 13:34:56.784998 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 3 13:34:56.786635 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 3 13:34:56.787099 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 3 13:34:56.787939 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 3 13:34:56.788948 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 3 13:34:56.810288 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 3 13:34:56.812979 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 3 13:34:56.829952 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 3 13:34:56.830893 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 13:34:56.831328 systemd[1]: Stopped target timers.target - Timer Units.
Mar 3 13:34:56.831754 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 3 13:34:56.831862 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 3 13:34:56.832782 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 3 13:34:56.833555 systemd[1]: Stopped target basic.target - Basic System.
Mar 3 13:34:56.834303 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 3 13:34:56.835050 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 3 13:34:56.835748 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 3 13:34:56.836472 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 3 13:34:56.837213 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 3 13:34:56.837954 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 3 13:34:56.838668 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 3 13:34:56.839483 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 3 13:34:56.840192 systemd[1]: Stopped target swap.target - Swaps.
Mar 3 13:34:56.840905 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 3 13:34:56.841015 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 3 13:34:56.842040 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 3 13:34:56.842820 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 3 13:34:56.843464 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 3 13:34:56.843550 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 3 13:34:56.844216 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 3 13:34:56.844322 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 3 13:34:56.845299 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 3 13:34:56.845405 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 3 13:34:56.846083 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 3 13:34:56.846181 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 3 13:34:56.846860 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 3 13:34:56.846981 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 3 13:34:56.849973 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 3 13:34:56.852987 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 3 13:34:56.853725 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 3 13:34:56.853823 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 3 13:34:56.854982 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 3 13:34:56.855052 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 3 13:34:56.859515 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 3 13:34:56.859606 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 3 13:34:56.877050 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 3 13:34:56.878767 ignition[1099]: INFO : Ignition 2.22.0
Mar 3 13:34:56.878767 ignition[1099]: INFO : Stage: umount
Mar 3 13:34:56.878767 ignition[1099]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:56.878767 ignition[1099]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:56.878767 ignition[1099]: INFO : umount: umount passed
Mar 3 13:34:56.878767 ignition[1099]: INFO : Ignition finished successfully
Mar 3 13:34:56.880081 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 3 13:34:56.880192 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 3 13:34:56.883007 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 3 13:34:56.883092 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 3 13:34:56.884062 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 3 13:34:56.884145 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 3 13:34:56.885039 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 3 13:34:56.885079 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 3 13:34:56.885666 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 3 13:34:56.885705 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 3 13:34:56.886340 systemd[1]: Stopped target network.target - Network.
Mar 3 13:34:56.886979 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 3 13:34:56.887019 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 3 13:34:56.887735 systemd[1]: Stopped target paths.target - Path Units.
Mar 3 13:34:56.888496 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 3 13:34:56.890900 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 3 13:34:56.891633 systemd[1]: Stopped target slices.target - Slice Units.
Mar 3 13:34:56.892310 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 3 13:34:56.892983 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 3 13:34:56.893018 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 3 13:34:56.893683 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 3 13:34:56.893722 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 3 13:34:56.894303 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 3 13:34:56.894349 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 3 13:34:56.894969 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 3 13:34:56.895007 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 3 13:34:56.895598 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 3 13:34:56.895636 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 3 13:34:56.896338 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 3 13:34:56.896928 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 3 13:34:56.903629 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 3 13:34:56.903744 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 3 13:34:56.906628 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 3 13:34:56.906946 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 3 13:34:56.906989 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 3 13:34:56.908230 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 3 13:34:56.910806 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 3 13:34:56.910950 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 3 13:34:56.912642 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 3 13:34:56.912810 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 3 13:34:56.913421 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 3 13:34:56.913451 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 3 13:34:56.914963 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 3 13:34:56.915314 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 3 13:34:56.915354 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 3 13:34:56.916947 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 3 13:34:56.916989 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 3 13:34:56.919008 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 3 13:34:56.919072 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 3 13:34:56.919497 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 3 13:34:56.922031 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 3 13:34:56.928158 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 3 13:34:56.928306 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 3 13:34:56.929055 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 3 13:34:56.929118 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 3 13:34:56.929737 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 3 13:34:56.929768 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 3 13:34:56.930505 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 3 13:34:56.930549 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 3 13:34:56.934499 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 3 13:34:56.934544 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 3 13:34:56.936689 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 3 13:34:56.936743 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 3 13:34:56.941109 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 3 13:34:56.941507 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 3 13:34:56.941559 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 3 13:34:56.942078 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 3 13:34:56.942118 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 3 13:34:56.942555 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 3 13:34:56.942590 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 3 13:34:56.943091 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 3 13:34:56.943126 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 3 13:34:56.943925 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 3 13:34:56.943962 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 3 13:34:56.945406 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 3 13:34:56.945495 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 3 13:34:56.961649 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 3 13:34:56.962166 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 3 13:34:56.962944 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 3 13:34:56.964752 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 3 13:34:56.980122 systemd[1]: Switching root.
Mar 3 13:34:57.015778 systemd-journald[198]: Journal stopped
Mar 3 13:34:58.244301 systemd-journald[198]: Received SIGTERM from PID 1 (systemd).
Mar 3 13:34:58.244373 kernel: SELinux: policy capability network_peer_controls=1
Mar 3 13:34:58.244385 kernel: SELinux: policy capability open_perms=1
Mar 3 13:34:58.244394 kernel: SELinux: policy capability extended_socket_class=1
Mar 3 13:34:58.244405 kernel: SELinux: policy capability always_check_network=0
Mar 3 13:34:58.244414 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 3 13:34:58.244423 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 3 13:34:58.244432 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 3 13:34:58.244440 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 3 13:34:58.244451 kernel: SELinux: policy capability userspace_initial_context=0
Mar 3 13:34:58.244460 kernel: audit: type=1403 audit(1772544897.225:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 3 13:34:58.244470 systemd[1]: Successfully loaded SELinux policy in 81.181ms.
Mar 3 13:34:58.244497 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.308ms.
Mar 3 13:34:58.244509 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 3 13:34:58.244519 systemd[1]: Detected virtualization kvm.
Mar 3 13:34:58.244527 systemd[1]: Detected architecture x86-64.
Mar 3 13:34:58.244536 systemd[1]: Detected first boot.
Mar 3 13:34:58.244545 systemd[1]: Hostname set to .
Mar 3 13:34:58.244554 systemd[1]: Initializing machine ID from VM UUID.
Mar 3 13:34:58.244568 zram_generator::config[1143]: No configuration found.
Mar 3 13:34:58.244580 kernel: Guest personality initialized and is inactive
Mar 3 13:34:58.244589 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 3 13:34:58.244598 kernel: Initialized host personality
Mar 3 13:34:58.244606 kernel: NET: Registered PF_VSOCK protocol family
Mar 3 13:34:58.244614 systemd[1]: Populated /etc with preset unit settings.
Mar 3 13:34:58.244624 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 3 13:34:58.244633 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 3 13:34:58.244642 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 3 13:34:58.244652 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 3 13:34:58.244663 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 3 13:34:58.244674 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 3 13:34:58.244687 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 3 13:34:58.244696 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 3 13:34:58.244706 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 3 13:34:58.244715 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 3 13:34:58.244724 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 3 13:34:58.244732 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 3 13:34:58.244743 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 3 13:34:58.244752 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 3 13:34:58.244761 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 3 13:34:58.244771 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 3 13:34:58.244780 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 3 13:34:58.244789 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 3 13:34:58.244801 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 3 13:34:58.244809 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 3 13:34:58.244818 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 3 13:34:58.244827 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 3 13:34:58.247400 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 3 13:34:58.247415 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 3 13:34:58.247425 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 3 13:34:58.247434 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 13:34:58.247443 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 3 13:34:58.247457 systemd[1]: Reached target slices.target - Slice Units.
Mar 3 13:34:58.247466 systemd[1]: Reached target swap.target - Swaps.
Mar 3 13:34:58.247477 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 3 13:34:58.247496 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 3 13:34:58.247512 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 3 13:34:58.247524 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 3 13:34:58.247538 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 3 13:34:58.247547 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 3 13:34:58.247556 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 3 13:34:58.247569 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 3 13:34:58.247579 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 3 13:34:58.247588 systemd[1]: Mounting media.mount - External Media Directory...
Mar 3 13:34:58.247597 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:34:58.247606 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 3 13:34:58.247615 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 3 13:34:58.247624 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 3 13:34:58.247634 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 3 13:34:58.247649 systemd[1]: Reached target machines.target - Containers.
Mar 3 13:34:58.247660 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 3 13:34:58.247670 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:34:58.247679 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 3 13:34:58.247688 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 3 13:34:58.247697 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 13:34:58.247705 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 3 13:34:58.247714 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 13:34:58.247723 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 3 13:34:58.247734 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 13:34:58.247743 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 3 13:34:58.247752 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 3 13:34:58.247761 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 3 13:34:58.247771 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 3 13:34:58.247781 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 3 13:34:58.247790 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:34:58.247801 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 3 13:34:58.247810 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 3 13:34:58.247819 kernel: loop: module loaded
Mar 3 13:34:58.247828 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 3 13:34:58.248282 kernel: fuse: init (API version 7.41)
Mar 3 13:34:58.248293 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 3 13:34:58.248303 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 3 13:34:58.248312 kernel: ACPI: bus type drm_connector registered
Mar 3 13:34:58.248320 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 3 13:34:58.248329 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 3 13:34:58.248338 systemd[1]: Stopped verity-setup.service.
Mar 3 13:34:58.248348 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:34:58.248358 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 3 13:34:58.248367 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 3 13:34:58.248376 systemd[1]: Mounted media.mount - External Media Directory.
Mar 3 13:34:58.248386 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 3 13:34:58.248418 systemd-journald[1217]: Collecting audit messages is disabled.
Mar 3 13:34:58.248442 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 3 13:34:58.248452 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 3 13:34:58.248463 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 3 13:34:58.248472 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 3 13:34:58.248481 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 3 13:34:58.248491 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 13:34:58.248500 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 13:34:58.248508 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 3 13:34:58.248518 systemd-journald[1217]: Journal started
Mar 3 13:34:58.248535 systemd-journald[1217]: Runtime Journal (/run/log/journal/54244fe9e698483f881a5272dd7c4c6d) is 8M, max 76.1M, 68.1M free.
Mar 3 13:34:57.865315 systemd[1]: Queued start job for default target multi-user.target.
Mar 3 13:34:57.885211 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 3 13:34:58.253656 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 3 13:34:57.886189 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 3 13:34:58.252602 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 3 13:34:58.252805 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 3 13:34:58.253475 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 13:34:58.253911 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 13:34:58.255347 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 3 13:34:58.255536 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 3 13:34:58.257323 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 13:34:58.257498 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 13:34:58.258882 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 3 13:34:58.259564 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 3 13:34:58.261261 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 3 13:34:58.267763 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 3 13:34:58.274514 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 3 13:34:58.277914 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 3 13:34:58.280950 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 3 13:34:58.281865 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 3 13:34:58.281897 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 3 13:34:58.283382 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 3 13:34:58.290129 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 3 13:34:58.290584 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:34:58.292993 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 3 13:34:58.294944 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 3 13:34:58.295318 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 13:34:58.296361 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 3 13:34:58.296948 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 13:34:58.299971 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 3 13:34:58.301990 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 3 13:34:58.307962 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 3 13:34:58.309796 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 3 13:34:58.311224 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 3 13:34:58.333142 systemd-journald[1217]: Time spent on flushing to /var/log/journal/54244fe9e698483f881a5272dd7c4c6d is 28.580ms for 1243 entries.
Mar 3 13:34:58.333142 systemd-journald[1217]: System Journal (/var/log/journal/54244fe9e698483f881a5272dd7c4c6d) is 8M, max 584.8M, 576.8M free.
Mar 3 13:34:58.403256 systemd-journald[1217]: Received client request to flush runtime journal.
Mar 3 13:34:58.403297 kernel: loop0: detected capacity change from 0 to 110984
Mar 3 13:34:58.403319 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 3 13:34:58.341540 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 3 13:34:58.342108 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 3 13:34:58.344395 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 3 13:34:58.375126 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 3 13:34:58.385244 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 3 13:34:58.399985 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Mar 3 13:34:58.399996 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Mar 3 13:34:58.405558 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 3 13:34:58.406297 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 3 13:34:58.408854 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 3 13:34:58.418464 kernel: loop1: detected capacity change from 0 to 128560
Mar 3 13:34:58.417809 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 3 13:34:58.451167 kernel: loop2: detected capacity change from 0 to 8
Mar 3 13:34:58.461750 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 3 13:34:58.466074 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 3 13:34:58.468894 kernel: loop3: detected capacity change from 0 to 228704
Mar 3 13:34:58.492216 systemd-tmpfiles[1293]: ACLs are not supported, ignoring.
Mar 3 13:34:58.492235 systemd-tmpfiles[1293]: ACLs are not supported, ignoring.
Mar 3 13:34:58.498707 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 3 13:34:58.516880 kernel: loop4: detected capacity change from 0 to 110984
Mar 3 13:34:58.530868 kernel: loop5: detected capacity change from 0 to 128560
Mar 3 13:34:58.549867 kernel: loop6: detected capacity change from 0 to 8
Mar 3 13:34:58.554216 kernel: loop7: detected capacity change from 0 to 228704
Mar 3 13:34:58.571416 (sd-merge)[1298]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 3 13:34:58.572985 (sd-merge)[1298]: Merged extensions into '/usr'.
Mar 3 13:34:58.578563 systemd[1]: Reload requested from client PID 1268 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 3 13:34:58.578681 systemd[1]: Reloading...
Mar 3 13:34:58.677919 zram_generator::config[1324]: No configuration found.
Mar 3 13:34:58.817398 ldconfig[1263]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 3 13:34:58.895578 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 3 13:34:58.895689 systemd[1]: Reloading finished in 315 ms.
Mar 3 13:34:58.928079 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 3 13:34:58.928894 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 3 13:34:58.929679 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 3 13:34:58.941122 systemd[1]: Starting ensure-sysext.service...
Mar 3 13:34:58.942440 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 3 13:34:58.945952 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 3 13:34:58.972459 systemd[1]: Reload requested from client PID 1368 ('systemctl') (unit ensure-sysext.service)...
Mar 3 13:34:58.972567 systemd[1]: Reloading...
Mar 3 13:34:58.991937 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 3 13:34:58.993320 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 3 13:34:58.995102 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 3 13:34:58.995389 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 3 13:34:58.996224 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 3 13:34:58.996475 systemd-tmpfiles[1369]: ACLs are not supported, ignoring.
Mar 3 13:34:58.996569 systemd-tmpfiles[1369]: ACLs are not supported, ignoring.
Mar 3 13:34:59.003016 systemd-tmpfiles[1369]: Detected autofs mount point /boot during canonicalization of boot.
Mar 3 13:34:59.003101 systemd-tmpfiles[1369]: Skipping /boot
Mar 3 13:34:59.014624 systemd-tmpfiles[1369]: Detected autofs mount point /boot during canonicalization of boot.
Mar 3 13:34:59.014988 systemd-tmpfiles[1369]: Skipping /boot
Mar 3 13:34:59.016120 systemd-udevd[1370]: Using default interface naming scheme 'v255'.
Mar 3 13:34:59.049913 zram_generator::config[1397]: No configuration found.
Mar 3 13:34:59.244399 systemd[1]: Reloading finished in 271 ms.
Mar 3 13:34:59.253963 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 3 13:34:59.255452 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 3 13:34:59.280859 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Mar 3 13:34:59.285852 kernel: mousedev: PS/2 mouse device common for all mice
Mar 3 13:34:59.296465 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 3 13:34:59.297593 systemd[1]: Finished ensure-sysext.service.
Mar 3 13:34:59.299599 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:34:59.301385 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 3 13:34:59.306278 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 3 13:34:59.306730 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:34:59.308108 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 13:34:59.313406 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 3 13:34:59.316379 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 13:34:59.317912 kernel: ACPI: button: Power Button [PWRF]
Mar 3 13:34:59.318664 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 13:34:59.320039 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:34:59.320088 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:34:59.321587 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 3 13:34:59.326062 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 3 13:34:59.334076 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 3 13:34:59.345765 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 3 13:34:59.348033 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 3 13:34:59.348918 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:34:59.357499 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 13:34:59.358919 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 13:34:59.365421 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 13:34:59.365616 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 13:34:59.366308 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 13:34:59.367069 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 13:34:59.370439 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 13:34:59.370504 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 13:34:59.380064 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 3 13:34:59.417033 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 3 13:34:59.417735 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 3 13:34:59.429898 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 3 13:34:59.449885 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 3 13:34:59.453189 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 3 13:34:59.474587 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 3 13:34:59.484170 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Mar 3 13:34:59.484414 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 3 13:34:59.486605 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 3 13:34:59.502057 augenrules[1539]: No rules
Mar 3 13:34:59.488502 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 3 13:34:59.490314 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 3 13:34:59.491727 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 3 13:34:59.492936 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 3 13:34:59.494862 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 3 13:34:59.497666 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 3 13:34:59.497702 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:34:59.497791 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:34:59.499956 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 13:34:59.502586 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 13:34:59.503617 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 13:34:59.505043 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:34:59.505075 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:34:59.505095 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 3 13:34:59.505105 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:34:59.523859 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Mar 3 13:34:59.530072 kernel: Console: switching to colour dummy device 80x25
Mar 3 13:34:59.531888 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Mar 3 13:34:59.536314 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 13:34:59.536522 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 13:34:59.536922 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 13:34:59.537267 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 13:34:59.538569 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 13:34:59.540807 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 13:34:59.541036 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 13:34:59.551558 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 3 13:34:59.551629 kernel: [drm] features: -context_init
Mar 3 13:34:59.553953 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 13:34:59.555914 kernel: EDAC MC: Ver: 3.0.0
Mar 3 13:34:59.562773 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 3 13:34:59.565448 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 3 13:34:59.572900 kernel: [drm] number of scanouts: 1
Mar 3 13:34:59.574859 kernel: [drm] number of cap sets: 0
Mar 3 13:34:59.576853 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Mar 3 13:34:59.584153 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 3 13:34:59.584210 kernel: Console: switching to colour frame buffer device 160x50
Mar 3 13:34:59.589861 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 3 13:34:59.598172 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 3 13:34:59.613364 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 3 13:34:59.613583 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 3 13:34:59.618201 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 3 13:34:59.621713 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 3 13:34:59.642068 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 3 13:34:59.642995 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 3 13:34:59.645624 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 3 13:34:59.720504 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 3 13:34:59.758141 systemd-networkd[1492]: lo: Link UP
Mar 3 13:34:59.758156 systemd-networkd[1492]: lo: Gained carrier
Mar 3 13:34:59.765560 systemd-networkd[1492]: Enumeration completed
Mar 3 13:34:59.765959 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 3 13:34:59.770006 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 3 13:34:59.771609 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 3 13:34:59.775211 systemd-networkd[1492]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:59.775218 systemd-networkd[1492]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 13:34:59.775788 systemd-networkd[1492]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:59.775792 systemd-networkd[1492]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 13:34:59.776203 systemd-networkd[1492]: eth0: Link UP
Mar 3 13:34:59.776341 systemd-networkd[1492]: eth0: Gained carrier
Mar 3 13:34:59.776352 systemd-networkd[1492]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:59.782033 systemd-networkd[1492]: eth1: Link UP
Mar 3 13:34:59.782955 systemd-networkd[1492]: eth1: Gained carrier
Mar 3 13:34:59.783016 systemd-networkd[1492]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:59.800720 systemd-resolved[1494]: Positive Trust Anchors:
Mar 3 13:34:59.800760 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 3 13:34:59.801461 systemd-resolved[1494]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 3 13:34:59.801515 systemd-resolved[1494]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 3 13:34:59.807256 systemd-resolved[1494]: Using system hostname 'ci-4459-2-4-7-599052a073'.
Mar 3 13:34:59.809558 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 3 13:34:59.809712 systemd[1]: Reached target network.target - Network.
Mar 3 13:34:59.809768 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 3 13:34:59.811928 systemd-networkd[1492]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 3 13:34:59.812797 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 3 13:34:59.814490 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 3 13:34:59.814641 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 3 13:34:59.814724 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 3 13:34:59.814783 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Mar 3 13:34:59.814854 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 3 13:34:59.814920 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 3 13:34:59.814940 systemd[1]: Reached target paths.target - Path Units.
Mar 3 13:34:59.814983 systemd[1]: Reached target time-set.target - System Time Set.
Mar 3 13:34:59.815171 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 3 13:34:59.815612 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 3 13:34:59.816900 systemd[1]: Reached target timers.target - Timer Units.
Mar 3 13:34:59.818462 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 3 13:34:59.822170 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 3 13:34:59.826819 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 3 13:34:59.828535 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 3 13:34:59.828896 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 3 13:34:59.834897 systemd-networkd[1492]: eth0: DHCPv4 address 95.217.157.231/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 3 13:34:59.835761 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 3 13:34:59.836517 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 3 13:34:59.837798 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 3 13:34:59.837937 systemd-timesyncd[1495]: Network configuration changed, trying to establish connection.
Mar 3 13:34:59.840370 systemd[1]: Reached target sockets.target - Socket Units.
Mar 3 13:34:59.842162 systemd[1]: Reached target basic.target - Basic System.
Mar 3 13:34:59.843904 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 3 13:34:59.843932 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 3 13:34:59.844946 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 3 13:34:59.848278 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 3 13:34:59.851942 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 3 13:34:59.856051 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 3 13:34:59.859920 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 3 13:34:59.863026 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 3 13:34:59.864704 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 3 13:34:59.867472 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Mar 3 13:34:59.878509 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 3 13:34:59.880550 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 3 13:34:59.883402 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Mar 3 13:34:59.887022 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 3 13:34:59.889970 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 3 13:34:59.899604 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing passwd entry cache
Mar 3 13:34:59.899608 oslogin_cache_refresh[1593]: Refreshing passwd entry cache
Mar 3 13:34:59.901409 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 3 13:34:59.903749 jq[1591]: false
Mar 3 13:34:59.903357 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 3 13:34:59.904093 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting users, quitting
Mar 3 13:34:59.904093 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 3 13:34:59.904093 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Refreshing group entry cache
Mar 3 13:34:59.902580 oslogin_cache_refresh[1593]: Failure getting users, quitting
Mar 3 13:34:59.902595 oslogin_cache_refresh[1593]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 3 13:34:59.902632 oslogin_cache_refresh[1593]: Refreshing group entry cache
Mar 3 13:34:59.906199 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 3 13:34:59.909736 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Failure getting groups, quitting
Mar 3 13:34:59.909736 google_oslogin_nss_cache[1593]: oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 3 13:34:59.904996 oslogin_cache_refresh[1593]: Failure getting groups, quitting
Mar 3 13:34:59.909091 systemd[1]: Starting update-engine.service - Update Engine...
Mar 3 13:34:59.905006 oslogin_cache_refresh[1593]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 3 13:34:59.915011 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 3 13:34:59.924573 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 3 13:34:59.928528 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 3 13:34:59.929865 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 3 13:34:59.930283 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Mar 3 13:34:59.931009 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Mar 3 13:34:59.940846 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 3 13:34:59.941403 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 3 13:34:59.945927 coreos-metadata[1588]: Mar 03 13:34:59.945 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Mar 3 13:34:59.947994 coreos-metadata[1588]: Mar 03 13:34:59.947 INFO Fetch successful
Mar 3 13:34:59.947994 coreos-metadata[1588]: Mar 03 13:34:59.947 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 3 13:34:59.949768 coreos-metadata[1588]: Mar 03 13:34:59.948 INFO Fetch successful
Mar 3 13:34:59.950691 extend-filesystems[1592]: Found /dev/sda6
Mar 3 13:34:59.960599 extend-filesystems[1592]: Found /dev/sda9
Mar 3 13:34:59.969245 extend-filesystems[1592]: Checking size of /dev/sda9
Mar 3 13:34:59.974654 jq[1606]: true
Mar 3 13:34:59.987884 update_engine[1605]: I20260303 13:34:59.984225 1605 main.cc:92] Flatcar Update Engine starting
Mar 3 13:34:59.993192 dbus-daemon[1589]: [system] SELinux support is enabled
Mar 3 13:34:59.993367 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 3 13:34:59.997412 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 3 13:34:59.997435 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 3 13:35:00.000061 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 3 13:35:00.000077 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 3 13:35:00.004865 jq[1627]: true
Mar 3 13:35:00.013757 (ntainerd)[1629]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 3 13:35:00.014275 systemd[1]: motdgen.service: Deactivated successfully.
Mar 3 13:35:00.014495 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 3 13:35:00.020391 systemd[1]: Started update-engine.service - Update Engine.
Mar 3 13:35:00.031529 tar[1609]: linux-amd64/LICENSE
Mar 3 13:35:00.031529 tar[1609]: linux-amd64/helm
Mar 3 13:35:00.031798 update_engine[1605]: I20260303 13:35:00.026577 1605 update_check_scheduler.cc:74] Next update check in 6m33s
Mar 3 13:35:00.035882 extend-filesystems[1592]: Resized partition /dev/sda9
Mar 3 13:35:00.046410 extend-filesystems[1644]: resize2fs 1.47.3 (8-Jul-2025)
Mar 3 13:35:00.062345 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks
Mar 3 13:35:00.059093 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 3 13:35:00.094622 systemd-logind[1601]: New seat seat0.
Mar 3 13:35:00.101456 systemd-logind[1601]: Watching system buttons on /dev/input/event3 (Power Button)
Mar 3 13:35:00.101485 systemd-logind[1601]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 3 13:35:00.101715 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 3 13:35:00.162123 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 3 13:35:00.168699 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 3 13:35:00.226358 bash[1663]: Updated "/home/core/.ssh/authorized_keys"
Mar 3 13:35:00.231587 containerd[1629]: time="2026-03-03T13:35:00Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 3 13:35:00.231587 containerd[1629]: time="2026-03-03T13:35:00.230946923Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 3 13:35:00.231711 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 3 13:35:00.236602 containerd[1629]: time="2026-03-03T13:35:00.236571820Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.66µs"
Mar 3 13:35:00.236602 containerd[1629]: time="2026-03-03T13:35:00.236599110Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 3 13:35:00.236660 containerd[1629]: time="2026-03-03T13:35:00.236612880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 3 13:35:00.237016 containerd[1629]: time="2026-03-03T13:35:00.236729960Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 3 13:35:00.237016 containerd[1629]: time="2026-03-03T13:35:00.236740800Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 3 13:35:00.237016 containerd[1629]: time="2026-03-03T13:35:00.236756830Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237016 containerd[1629]: time="2026-03-03T13:35:00.236807500Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237016 containerd[1629]: time="2026-03-03T13:35:00.236815130Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237254571Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237268961Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237281601Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237287431Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237365161Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237545491Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237568791Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237575191Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 3 13:35:00.237739 containerd[1629]: time="2026-03-03T13:35:00.237608441Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 3 13:35:00.252036 containerd[1629]: time="2026-03-03T13:35:00.237801022Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 3 13:35:00.252036 containerd[1629]: time="2026-03-03T13:35:00.237886702Z" level=info msg="metadata content store policy set" policy=shared
Mar 3 13:35:00.248227 systemd[1]: Starting sshkeys.service...
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266359187Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266410027Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266420697Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266431207Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266440677Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266448127Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266458677Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266468107Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266486717Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266496817Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266503467Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266513808Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266628488Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 3 13:35:00.266911 containerd[1629]: time="2026-03-03T13:35:00.266641768Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266652698Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266660978Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266668248Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266675618Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266683658Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266691858Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266699718Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266707208Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266714028Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266752978Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266762578Z" level=info msg="Start snapshots syncer"
Mar 3 13:35:00.267143 containerd[1629]: time="2026-03-03T13:35:00.266786228Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 3 13:35:00.269299 containerd[1629]: time="2026-03-03T13:35:00.268432870Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 3 13:35:00.269299 containerd[1629]: time="2026-03-03T13:35:00.268480920Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 3 13:35:00.269478 containerd[1629]: time="2026-03-03T13:35:00.269464591Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270543803Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270581793Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270590993Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270598313Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270606543Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270615453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.270623283Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.271064683Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.271080313Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.271089763Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.271144033Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.271154883Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 3 13:35:00.272030 containerd[1629]: time="2026-03-03T13:35:00.271161453Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271170143Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271175893Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271183573Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271241773Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271256153Z" level=info msg="runtime interface created"
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271260703Z" level=info msg="created NRI interface"
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271267113Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271289333Z" level=info msg="Connect containerd service"
Mar 3 13:35:00.272218 containerd[1629]: time="2026-03-03T13:35:00.271302963Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 3 13:35:00.272768 containerd[1629]: time="2026-03-03T13:35:00.272486125Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 3 13:35:00.298383 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 3 13:35:00.302052 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 3 13:35:00.336864 locksmithd[1643]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 3 13:35:00.350982 coreos-metadata[1687]: Mar 03 13:35:00.350 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 3 13:35:00.355280 coreos-metadata[1687]: Mar 03 13:35:00.355 INFO Fetch successful
Mar 3 13:35:00.366091 unknown[1687]: wrote ssh authorized keys file for user: core
Mar 3 13:35:00.370862 containerd[1629]: time="2026-03-03T13:35:00.370184017Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 3 13:35:00.370862 containerd[1629]: time="2026-03-03T13:35:00.370498697Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 3 13:35:00.370862 containerd[1629]: time="2026-03-03T13:35:00.370519708Z" level=info msg="Start subscribing containerd event"
Mar 3 13:35:00.370938 containerd[1629]: time="2026-03-03T13:35:00.370538398Z" level=info msg="Start recovering state"
Mar 3 13:35:00.371671 containerd[1629]: time="2026-03-03T13:35:00.371093358Z" level=info msg="Start event monitor"
Mar 3 13:35:00.371671 containerd[1629]: time="2026-03-03T13:35:00.371106638Z" level=info msg="Start cni network conf syncer for default"
Mar 3 13:35:00.371671 containerd[1629]: time="2026-03-03T13:35:00.371113118Z" level=info msg="Start streaming server"
Mar 3 13:35:00.371671 containerd[1629]: time="2026-03-03T13:35:00.371121978Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 3 13:35:00.371671 containerd[1629]: time="2026-03-03T13:35:00.371130418Z" level=info msg="runtime interface starting up..."
Mar 3 13:35:00.372275 containerd[1629]: time="2026-03-03T13:35:00.371136838Z" level=info msg="starting plugins..."
Mar 3 13:35:00.372426 containerd[1629]: time="2026-03-03T13:35:00.372409390Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 3 13:35:00.372641 systemd[1]: Started containerd.service - containerd container runtime.
Mar 3 13:35:00.375472 containerd[1629]: time="2026-03-03T13:35:00.375448544Z" level=info msg="containerd successfully booted in 0.146047s"
Mar 3 13:35:00.383977 sshd_keygen[1635]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 3 13:35:00.409397 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 3 13:35:00.409852 kernel: EXT4-fs (sda9): resized filesystem to 19393531
Mar 3 13:35:00.413060 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 3 13:35:00.427815 systemd[1]: issuegen.service: Deactivated successfully.
Mar 3 13:35:00.428038 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 3 13:35:00.430864 update-ssh-keys[1696]: Updated "/home/core/.ssh/authorized_keys"
Mar 3 13:35:00.434061 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 3 13:35:00.434771 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 3 13:35:00.436762 extend-filesystems[1644]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 3 13:35:00.436762 extend-filesystems[1644]: old_desc_blocks = 1, new_desc_blocks = 10
Mar 3 13:35:00.436762 extend-filesystems[1644]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long.
Mar 3 13:35:00.438295 extend-filesystems[1592]: Resized filesystem in /dev/sda9
Mar 3 13:35:00.437544 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 3 13:35:00.438545 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 3 13:35:00.441680 systemd[1]: Finished sshkeys.service.
Mar 3 13:35:00.455929 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 3 13:35:00.462178 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 3 13:35:00.465057 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 3 13:35:00.467032 systemd[1]: Reached target getty.target - Login Prompts.
Mar 3 13:35:00.555766 tar[1609]: linux-amd64/README.md
Mar 3 13:35:00.571737 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 3 13:35:00.892191 systemd-networkd[1492]: eth0: Gained IPv6LL
Mar 3 13:35:00.893110 systemd-timesyncd[1495]: Network configuration changed, trying to establish connection.
Mar 3 13:35:00.897648 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 3 13:35:00.900965 systemd[1]: Reached target network-online.target - Network is Online.
Mar 3 13:35:00.906230 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:00.913175 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 3 13:35:00.960165 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 3 13:35:01.340050 systemd-networkd[1492]: eth1: Gained IPv6LL
Mar 3 13:35:01.341110 systemd-timesyncd[1495]: Network configuration changed, trying to establish connection.
Mar 3 13:35:01.793732 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:01.796104 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 3 13:35:01.797927 systemd[1]: Startup finished in 2.932s (kernel) + 8.559s (initrd) + 4.651s (userspace) = 16.144s.
Mar 3 13:35:01.804318 (kubelet)[1741]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 3 13:35:02.346797 kubelet[1741]: E0303 13:35:02.346737 1741 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 3 13:35:02.351585 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 3 13:35:02.351748 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 3 13:35:02.352363 systemd[1]: kubelet.service: Consumed 811ms CPU time, 266.5M memory peak.
Mar 3 13:35:02.755715 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 3 13:35:02.756936 systemd[1]: Started sshd@0-95.217.157.231:22-20.161.92.111:33600.service - OpenSSH per-connection server daemon (20.161.92.111:33600).
Mar 3 13:35:03.421511 sshd[1754]: Accepted publickey for core from 20.161.92.111 port 33600 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:03.424736 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:03.436272 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 3 13:35:03.439185 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 3 13:35:03.451798 systemd-logind[1601]: New session 1 of user core.
Mar 3 13:35:03.460907 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 3 13:35:03.463645 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 3 13:35:03.488773 (systemd)[1759]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 3 13:35:03.492191 systemd-logind[1601]: New session c1 of user core.
Mar 3 13:35:03.635778 systemd[1759]: Queued start job for default target default.target.
Mar 3 13:35:03.645817 systemd[1759]: Created slice app.slice - User Application Slice.
Mar 3 13:35:03.645861 systemd[1759]: Reached target paths.target - Paths.
Mar 3 13:35:03.645908 systemd[1759]: Reached target timers.target - Timers.
Mar 3 13:35:03.647223 systemd[1759]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 3 13:35:03.657612 systemd[1759]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 3 13:35:03.657666 systemd[1759]: Reached target sockets.target - Sockets.
Mar 3 13:35:03.657783 systemd[1759]: Reached target basic.target - Basic System.
Mar 3 13:35:03.657875 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 3 13:35:03.658994 systemd[1759]: Reached target default.target - Main User Target.
Mar 3 13:35:03.659092 systemd[1759]: Startup finished in 158ms.
Mar 3 13:35:03.664975 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 3 13:35:04.045227 systemd[1]: Started sshd@1-95.217.157.231:22-20.161.92.111:33616.service - OpenSSH per-connection server daemon (20.161.92.111:33616).
Mar 3 13:35:04.714820 sshd[1770]: Accepted publickey for core from 20.161.92.111 port 33616 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:04.717064 sshd-session[1770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:04.724939 systemd-logind[1601]: New session 2 of user core.
Mar 3 13:35:04.732095 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 3 13:35:05.085086 sshd[1773]: Connection closed by 20.161.92.111 port 33616
Mar 3 13:35:05.085653 sshd-session[1770]: pam_unix(sshd:session): session closed for user core
Mar 3 13:35:05.090451 systemd-logind[1601]: Session 2 logged out. Waiting for processes to exit.
Mar 3 13:35:05.091339 systemd[1]: sshd@1-95.217.157.231:22-20.161.92.111:33616.service: Deactivated successfully.
Mar 3 13:35:05.093336 systemd[1]: session-2.scope: Deactivated successfully.
Mar 3 13:35:05.094802 systemd-logind[1601]: Removed session 2.
Mar 3 13:35:05.222674 systemd[1]: Started sshd@2-95.217.157.231:22-20.161.92.111:33632.service - OpenSSH per-connection server daemon (20.161.92.111:33632).
Mar 3 13:35:05.897389 sshd[1779]: Accepted publickey for core from 20.161.92.111 port 33632 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:05.899667 sshd-session[1779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:05.906288 systemd-logind[1601]: New session 3 of user core.
Mar 3 13:35:05.913248 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 3 13:35:06.259327 sshd[1782]: Connection closed by 20.161.92.111 port 33632
Mar 3 13:35:06.261072 sshd-session[1779]: pam_unix(sshd:session): session closed for user core
Mar 3 13:35:06.264599 systemd[1]: sshd@2-95.217.157.231:22-20.161.92.111:33632.service: Deactivated successfully.
Mar 3 13:35:06.266609 systemd[1]: session-3.scope: Deactivated successfully.
Mar 3 13:35:06.268651 systemd-logind[1601]: Session 3 logged out. Waiting for processes to exit.
Mar 3 13:35:06.269626 systemd-logind[1601]: Removed session 3.
Mar 3 13:35:06.393889 systemd[1]: Started sshd@3-95.217.157.231:22-20.161.92.111:33636.service - OpenSSH per-connection server daemon (20.161.92.111:33636).
Mar 3 13:35:07.056661 sshd[1788]: Accepted publickey for core from 20.161.92.111 port 33636 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:07.058606 sshd-session[1788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:07.064712 systemd-logind[1601]: New session 4 of user core.
Mar 3 13:35:07.070045 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 3 13:35:07.419960 sshd[1791]: Connection closed by 20.161.92.111 port 33636
Mar 3 13:35:07.422049 sshd-session[1788]: pam_unix(sshd:session): session closed for user core
Mar 3 13:35:07.425752 systemd-logind[1601]: Session 4 logged out. Waiting for processes to exit.
Mar 3 13:35:07.426474 systemd[1]: sshd@3-95.217.157.231:22-20.161.92.111:33636.service: Deactivated successfully.
Mar 3 13:35:07.428335 systemd[1]: session-4.scope: Deactivated successfully.
Mar 3 13:35:07.429487 systemd-logind[1601]: Removed session 4.
Mar 3 13:35:07.550581 systemd[1]: Started sshd@4-95.217.157.231:22-20.161.92.111:33644.service - OpenSSH per-connection server daemon (20.161.92.111:33644).
Mar 3 13:35:08.202232 sshd[1797]: Accepted publickey for core from 20.161.92.111 port 33644 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:08.205209 sshd-session[1797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:08.212402 systemd-logind[1601]: New session 5 of user core.
Mar 3 13:35:08.227074 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 3 13:35:08.465724 sudo[1801]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 3 13:35:08.466379 sudo[1801]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 3 13:35:08.484961 sudo[1801]: pam_unix(sudo:session): session closed for user root
Mar 3 13:35:08.605648 sshd[1800]: Connection closed by 20.161.92.111 port 33644
Mar 3 13:35:08.608226 sshd-session[1797]: pam_unix(sshd:session): session closed for user core
Mar 3 13:35:08.614932 systemd-logind[1601]: Session 5 logged out. Waiting for processes to exit.
Mar 3 13:35:08.616223 systemd[1]: sshd@4-95.217.157.231:22-20.161.92.111:33644.service: Deactivated successfully.
Mar 3 13:35:08.619474 systemd[1]: session-5.scope: Deactivated successfully.
Mar 3 13:35:08.622509 systemd-logind[1601]: Removed session 5.
Mar 3 13:35:08.740676 systemd[1]: Started sshd@5-95.217.157.231:22-20.161.92.111:35402.service - OpenSSH per-connection server daemon (20.161.92.111:35402).
Mar 3 13:35:09.415676 sshd[1807]: Accepted publickey for core from 20.161.92.111 port 35402 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:09.418641 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:09.428930 systemd-logind[1601]: New session 6 of user core.
Mar 3 13:35:09.436097 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 3 13:35:09.667682 sudo[1812]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 3 13:35:09.668324 sudo[1812]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 3 13:35:09.676748 sudo[1812]: pam_unix(sudo:session): session closed for user root
Mar 3 13:35:09.688026 sudo[1811]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 3 13:35:09.688626 sudo[1811]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 3 13:35:09.707269 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 3 13:35:09.777710 augenrules[1834]: No rules
Mar 3 13:35:09.780297 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 3 13:35:09.780805 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 3 13:35:09.782825 sudo[1811]: pam_unix(sudo:session): session closed for user root
Mar 3 13:35:09.904000 sshd[1810]: Connection closed by 20.161.92.111 port 35402
Mar 3 13:35:09.905750 sshd-session[1807]: pam_unix(sshd:session): session closed for user core
Mar 3 13:35:09.911256 systemd-logind[1601]: Session 6 logged out. Waiting for processes to exit.
Mar 3 13:35:09.912600 systemd[1]: sshd@5-95.217.157.231:22-20.161.92.111:35402.service: Deactivated successfully.
Mar 3 13:35:09.915531 systemd[1]: session-6.scope: Deactivated successfully.
Mar 3 13:35:09.918069 systemd-logind[1601]: Removed session 6.
Mar 3 13:35:10.038059 systemd[1]: Started sshd@6-95.217.157.231:22-20.161.92.111:35416.service - OpenSSH per-connection server daemon (20.161.92.111:35416).
Mar 3 13:35:10.710609 sshd[1843]: Accepted publickey for core from 20.161.92.111 port 35416 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:35:10.711957 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:35:10.716228 systemd-logind[1601]: New session 7 of user core.
Mar 3 13:35:10.721951 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 3 13:35:10.964638 sudo[1847]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 3 13:35:10.965566 sudo[1847]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 3 13:35:11.327281 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 3 13:35:11.358641 (dockerd)[1865]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 3 13:35:11.605977 dockerd[1865]: time="2026-03-03T13:35:11.605545000Z" level=info msg="Starting up"
Mar 3 13:35:11.606462 dockerd[1865]: time="2026-03-03T13:35:11.606425852Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 3 13:35:11.621387 dockerd[1865]: time="2026-03-03T13:35:11.621271150Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Mar 3 13:35:11.669193 dockerd[1865]: time="2026-03-03T13:35:11.669137060Z" level=info msg="Loading containers: start."
Mar 3 13:35:11.679947 kernel: Initializing XFRM netlink socket
Mar 3 13:35:11.876475 systemd-timesyncd[1495]: Network configuration changed, trying to establish connection.
Mar 3 13:35:13.506883 systemd-resolved[1494]: Clock change detected. Flushing caches.
Mar 3 13:35:13.507102 systemd-timesyncd[1495]: Contacted time server 116.203.244.102:123 (2.flatcar.pool.ntp.org).
Mar 3 13:35:13.507144 systemd-timesyncd[1495]: Initial clock synchronization to Tue 2026-03-03 13:35:13.506835 UTC.
Mar 3 13:35:13.527920 systemd-networkd[1492]: docker0: Link UP
Mar 3 13:35:13.533214 dockerd[1865]: time="2026-03-03T13:35:13.533169907Z" level=info msg="Loading containers: done."
Mar 3 13:35:13.546229 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4100657232-merged.mount: Deactivated successfully.
Mar 3 13:35:13.550384 dockerd[1865]: time="2026-03-03T13:35:13.550342389Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 3 13:35:13.550530 dockerd[1865]: time="2026-03-03T13:35:13.550408649Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Mar 3 13:35:13.550530 dockerd[1865]: time="2026-03-03T13:35:13.550474749Z" level=info msg="Initializing buildkit"
Mar 3 13:35:13.573708 dockerd[1865]: time="2026-03-03T13:35:13.573664258Z" level=info msg="Completed buildkit initialization"
Mar 3 13:35:13.580927 dockerd[1865]: time="2026-03-03T13:35:13.580880417Z" level=info msg="Daemon has completed initialization"
Mar 3 13:35:13.581005 dockerd[1865]: time="2026-03-03T13:35:13.580955147Z" level=info msg="API listen on /run/docker.sock"
Mar 3 13:35:13.581169 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 3 13:35:14.004870 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 3 13:35:14.008138 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:14.130480 containerd[1629]: time="2026-03-03T13:35:14.130430094Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\""
Mar 3 13:35:14.197721 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:14.210379 (kubelet)[2087]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 3 13:35:14.253550 kubelet[2087]: E0303 13:35:14.253468 2087 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 3 13:35:14.258536 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 3 13:35:14.258752 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 3 13:35:14.259459 systemd[1]: kubelet.service: Consumed 198ms CPU time, 110.9M memory peak.
Mar 3 13:35:14.758016 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3355548720.mount: Deactivated successfully.
Mar 3 13:35:15.766651 containerd[1629]: time="2026-03-03T13:35:15.766593939Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:15.767682 containerd[1629]: time="2026-03-03T13:35:15.767476050Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=30116286"
Mar 3 13:35:15.768241 containerd[1629]: time="2026-03-03T13:35:15.768215001Z" level=info msg="ImageCreate event name:\"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:15.770613 containerd[1629]: time="2026-03-03T13:35:15.770582624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:15.771362 containerd[1629]: time="2026-03-03T13:35:15.771335725Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"30112785\" in 1.640857341s"
Mar 3 13:35:15.771399 containerd[1629]: time="2026-03-03T13:35:15.771367695Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\""
Mar 3 13:35:15.772216 containerd[1629]: time="2026-03-03T13:35:15.772169296Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\""
Mar 3 13:35:17.030866 containerd[1629]: time="2026-03-03T13:35:17.030787929Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:17.032209 containerd[1629]: time="2026-03-03T13:35:17.032012510Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=26021832"
Mar 3 13:35:17.033057 containerd[1629]: time="2026-03-03T13:35:17.033034852Z" level=info msg="ImageCreate event name:\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:17.036231 containerd[1629]: time="2026-03-03T13:35:17.036204966Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:17.036772 containerd[1629]: time="2026-03-03T13:35:17.036745956Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"27678758\" in 1.26454371s"
Mar 3 13:35:17.036806 containerd[1629]: time="2026-03-03T13:35:17.036775316Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\""
Mar 3 13:35:17.037550 containerd[1629]: time="2026-03-03T13:35:17.037496997Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 3 13:35:18.144073 containerd[1629]: time="2026-03-03T13:35:18.143993160Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:18.145457 containerd[1629]: time="2026-03-03T13:35:18.145313992Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=20162768"
Mar 3 13:35:18.146619 containerd[1629]: time="2026-03-03T13:35:18.146571404Z" level=info msg="ImageCreate event name:\"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:18.149809 containerd[1629]: time="2026-03-03T13:35:18.149072877Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:18.149809 containerd[1629]: time="2026-03-03T13:35:18.149698757Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"21819712\" in 1.11217812s"
Mar 3 13:35:18.149809 containerd[1629]: time="2026-03-03T13:35:18.149732287Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\""
Mar 3 13:35:18.150270 containerd[1629]: time="2026-03-03T13:35:18.150242268Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 3 13:35:19.118849 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount291312678.mount: Deactivated successfully.
Mar 3 13:35:19.470455 containerd[1629]: time="2026-03-03T13:35:19.470320918Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:19.471496 containerd[1629]: time="2026-03-03T13:35:19.471389389Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=31828675"
Mar 3 13:35:19.472150 containerd[1629]: time="2026-03-03T13:35:19.472128830Z" level=info msg="ImageCreate event name:\"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:19.473685 containerd[1629]: time="2026-03-03T13:35:19.473657452Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:19.474241 containerd[1629]: time="2026-03-03T13:35:19.474220603Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"31827666\" in 1.323948065s"
Mar 3 13:35:19.474310 containerd[1629]: time="2026-03-03T13:35:19.474299423Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\""
Mar 3 13:35:19.474987 containerd[1629]: time="2026-03-03T13:35:19.474963714Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 3 13:35:20.003556 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3540573464.mount: Deactivated successfully.
Mar 3 13:35:20.766552 containerd[1629]: time="2026-03-03T13:35:20.766495368Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:20.767562 containerd[1629]: time="2026-03-03T13:35:20.767536340Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332"
Mar 3 13:35:20.768568 containerd[1629]: time="2026-03-03T13:35:20.768538271Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:20.770598 containerd[1629]: time="2026-03-03T13:35:20.770559943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:20.771255 containerd[1629]: time="2026-03-03T13:35:20.771165664Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.29617653s"
Mar 3 13:35:20.771255 containerd[1629]: time="2026-03-03T13:35:20.771188234Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
Mar 3 13:35:20.771682 containerd[1629]: time="2026-03-03T13:35:20.771657655Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 3 13:35:21.256889 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1879036401.mount: Deactivated successfully.
Mar 3 13:35:21.267897 containerd[1629]: time="2026-03-03T13:35:21.267821775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 3 13:35:21.269437 containerd[1629]: time="2026-03-03T13:35:21.269065026Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Mar 3 13:35:21.272066 containerd[1629]: time="2026-03-03T13:35:21.272034110Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 3 13:35:21.275999 containerd[1629]: time="2026-03-03T13:35:21.275947595Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 3 13:35:21.276273 containerd[1629]: time="2026-03-03T13:35:21.276237285Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 504.54845ms"
Mar 3 13:35:21.276273 containerd[1629]: time="2026-03-03T13:35:21.276268495Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 3 13:35:21.276708 containerd[1629]: time="2026-03-03T13:35:21.276662436Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 3 13:35:21.846768 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3327854039.mount: Deactivated successfully.
Mar 3 13:35:22.621148 containerd[1629]: time="2026-03-03T13:35:22.621090436Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:22.622153 containerd[1629]: time="2026-03-03T13:35:22.621990797Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23718940"
Mar 3 13:35:22.622827 containerd[1629]: time="2026-03-03T13:35:22.622805949Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:22.625449 containerd[1629]: time="2026-03-03T13:35:22.625426122Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:22.626277 containerd[1629]: time="2026-03-03T13:35:22.626249063Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 1.349564727s"
Mar 3 13:35:22.626334 containerd[1629]: time="2026-03-03T13:35:22.626282093Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\""
Mar 3 13:35:24.505376 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 3 13:35:24.508316 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:24.674639 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 3 13:35:24.674740 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 3 13:35:24.675343 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:24.675700 systemd[1]: kubelet.service: Consumed 129ms CPU time, 98.3M memory peak.
Mar 3 13:35:24.681333 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:24.701422 systemd[1]: Reload requested from client PID 2322 ('systemctl') (unit session-7.scope)...
Mar 3 13:35:24.701437 systemd[1]: Reloading...
Mar 3 13:35:24.828106 zram_generator::config[2364]: No configuration found.
Mar 3 13:35:25.021428 systemd[1]: Reloading finished in 319 ms.
Mar 3 13:35:25.085626 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 3 13:35:25.085732 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 3 13:35:25.086000 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:25.086077 systemd[1]: kubelet.service: Consumed 111ms CPU time, 98.3M memory peak.
Mar 3 13:35:25.087506 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:25.239283 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:25.247502 (kubelet)[2419]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 3 13:35:25.274083 kubelet[2419]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 3 13:35:25.274083 kubelet[2419]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 3 13:35:25.274083 kubelet[2419]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 3 13:35:25.274083 kubelet[2419]: I0303 13:35:25.274003 2419 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 3 13:35:25.692448 kubelet[2419]: I0303 13:35:25.692315 2419 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 3 13:35:25.692448 kubelet[2419]: I0303 13:35:25.692337 2419 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 3 13:35:25.692642 kubelet[2419]: I0303 13:35:25.692501 2419 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 3 13:35:25.730227 kubelet[2419]: E0303 13:35:25.729634 2419 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://95.217.157.231:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 95.217.157.231:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 3 13:35:25.730398 kubelet[2419]: I0303 13:35:25.730354 2419 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 3 13:35:25.739211 kubelet[2419]: I0303 13:35:25.739180 2419 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 3 13:35:25.746543 kubelet[2419]: I0303 13:35:25.746511 2419 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 3 13:35:25.747835 kubelet[2419]: I0303 13:35:25.747790 2419 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 3 13:35:25.748068 kubelet[2419]: I0303 13:35:25.747838 2419 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-7-599052a073","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 3 13:35:25.748157 kubelet[2419]: I0303 13:35:25.748073 2419 topology_manager.go:138] "Creating topology manager with none policy"
Mar 3 13:35:25.748157 kubelet[2419]: I0303 13:35:25.748085 2419 container_manager_linux.go:303] "Creating device plugin manager"
Mar 3 13:35:25.748325 kubelet[2419]: I0303 13:35:25.748297 2419 state_mem.go:36] "Initialized new in-memory state store"
Mar 3 13:35:25.755572 kubelet[2419]: I0303 13:35:25.755545 2419 kubelet.go:480] "Attempting to sync node with API server"
Mar 3 13:35:25.755633 kubelet[2419]: I0303 13:35:25.755579 2419 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 3 13:35:25.755633 kubelet[2419]: I0303 13:35:25.755628 2419 kubelet.go:386] "Adding apiserver pod source"
Mar 3 13:35:25.755768 kubelet[2419]: I0303 13:35:25.755664 2419 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 3 13:35:25.757138 kubelet[2419]: E0303 13:35:25.757118 2419 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://95.217.157.231:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-7-599052a073&limit=500&resourceVersion=0\": dial tcp 95.217.157.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 3 13:35:25.760198 kubelet[2419]: E0303 13:35:25.759986 2419 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://95.217.157.231:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 95.217.157.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 3 13:35:25.760715 kubelet[2419]: I0303 13:35:25.760311 2419 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 3 13:35:25.761432 kubelet[2419]: I0303 13:35:25.761289 2419 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 3 13:35:25.763535 kubelet[2419]: W0303 13:35:25.763502 2419 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 3 13:35:25.770248 kubelet[2419]: I0303 13:35:25.770224 2419 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 3 13:35:25.770309 kubelet[2419]: I0303 13:35:25.770267 2419 server.go:1289] "Started kubelet"
Mar 3 13:35:25.770392 kubelet[2419]: I0303 13:35:25.770346 2419 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 3 13:35:25.771025 kubelet[2419]: I0303 13:35:25.771014 2419 server.go:317] "Adding debug handlers to kubelet server"
Mar 3 13:35:25.772484 kubelet[2419]: I0303 13:35:25.772440 2419 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 3 13:35:25.772808 kubelet[2419]: I0303 13:35:25.772787 2419 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 3 13:35:25.775488 kubelet[2419]: E0303 13:35:25.772899 2419 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://95.217.157.231:6443/api/v1/namespaces/default/events\": dial tcp 95.217.157.231:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-7-599052a073.18995837f8f4d413 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-7-599052a073,UID:ci-4459-2-4-7-599052a073,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-7-599052a073,},FirstTimestamp:2026-03-03 13:35:25.770241043 +0000 UTC m=+0.519401741,LastTimestamp:2026-03-03 13:35:25.770241043 +0000 UTC m=+0.519401741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-7-599052a073,}"
Mar 3 13:35:25.775658 kubelet[2419]: I0303 13:35:25.775637 2419 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 3 13:35:25.778081 kubelet[2419]: I0303 13:35:25.776463 2419 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 3 13:35:25.779423 kubelet[2419]: E0303 13:35:25.779412 2419 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 3 13:35:25.779753 kubelet[2419]: E0303 13:35:25.779733 2419 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-2-4-7-599052a073\" not found"
Mar 3 13:35:25.779841 kubelet[2419]: I0303 13:35:25.779834 2419 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 3 13:35:25.780022 kubelet[2419]: I0303 13:35:25.780013 2419 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 3 13:35:25.780112 kubelet[2419]: I0303 13:35:25.780105 2419 reconciler.go:26] "Reconciler: start to sync state"
Mar 3 13:35:25.780929 kubelet[2419]: E0303 13:35:25.780894 2419 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://95.217.157.231:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 95.217.157.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 3 13:35:25.781633 kubelet[2419]: I0303 13:35:25.781622 2419 factory.go:223] Registration of the containerd container factory successfully
Mar 3 13:35:25.781689 kubelet[2419]: I0303 13:35:25.781683 2419 factory.go:223] Registration of the systemd container factory successfully
Mar 3 13:35:25.781793 kubelet[2419]: I0303 13:35:25.781777 2419 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 3 13:35:25.796337 kubelet[2419]: I0303 13:35:25.796291 2419 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 3 13:35:25.797496 kubelet[2419]: I0303 13:35:25.797469 2419 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 3 13:35:25.797496 kubelet[2419]: I0303 13:35:25.797486 2419 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 3 13:35:25.797559 kubelet[2419]: I0303 13:35:25.797504 2419 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 3 13:35:25.797559 kubelet[2419]: I0303 13:35:25.797510 2419 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 3 13:35:25.797559 kubelet[2419]: E0303 13:35:25.797544 2419 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 3 13:35:25.804801 kubelet[2419]: E0303 13:35:25.804765 2419 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.157.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-7-599052a073?timeout=10s\": dial tcp 95.217.157.231:6443: connect: connection refused" interval="200ms"
Mar 3 13:35:25.805323 kubelet[2419]: E0303 13:35:25.805302 2419 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://95.217.157.231:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 95.217.157.231:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 3 13:35:25.811254 kubelet[2419]: I0303 13:35:25.811224 2419 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 3 13:35:25.811343 kubelet[2419]: I0303 13:35:25.811335 2419 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 3 13:35:25.811395 kubelet[2419]: I0303 13:35:25.811390 2419 state_mem.go:36] "Initialized new in-memory state store"
Mar 3 13:35:25.813778 kubelet[2419]: I0303 13:35:25.813766 2419 policy_none.go:49] "None policy: Start"
Mar 3 13:35:25.813827 kubelet[2419]: I0303 13:35:25.813821 2419 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 3 13:35:25.813867 kubelet[2419]: I0303 13:35:25.813861 2419 state_mem.go:35] "Initializing new in-memory state store"
Mar 3 13:35:25.818805 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 3 13:35:25.827339 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 3 13:35:25.830182 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 3 13:35:25.842268 kubelet[2419]: E0303 13:35:25.842241 2419 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 3 13:35:25.843516 kubelet[2419]: I0303 13:35:25.843506 2419 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 3 13:35:25.843644 kubelet[2419]: I0303 13:35:25.843621 2419 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 3 13:35:25.844388 kubelet[2419]: I0303 13:35:25.844063 2419 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 3 13:35:25.844984 kubelet[2419]: E0303 13:35:25.844972 2419 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 3 13:35:25.845147 kubelet[2419]: E0303 13:35:25.845139 2419 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-7-599052a073\" not found"
Mar 3 13:35:25.913048 systemd[1]: Created slice kubepods-burstable-pod5fb806e12fed151e952edd9e40abf303.slice - libcontainer container kubepods-burstable-pod5fb806e12fed151e952edd9e40abf303.slice.
Mar 3 13:35:25.920941 kubelet[2419]: E0303 13:35:25.920867 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:25.924706 systemd[1]: Created slice kubepods-burstable-pod0580c5d8bfde85509b022740c9e9a5b4.slice - libcontainer container kubepods-burstable-pod0580c5d8bfde85509b022740c9e9a5b4.slice.
Mar 3 13:35:25.927139 kubelet[2419]: E0303 13:35:25.927079 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:25.943121 systemd[1]: Created slice kubepods-burstable-pod0325d08d48090116b08f5fe4838d84e9.slice - libcontainer container kubepods-burstable-pod0325d08d48090116b08f5fe4838d84e9.slice.
Mar 3 13:35:25.946636 kubelet[2419]: I0303 13:35:25.946348 2419 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:25.946636 kubelet[2419]: E0303 13:35:25.946614 2419 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.217.157.231:6443/api/v1/nodes\": dial tcp 95.217.157.231:6443: connect: connection refused" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:25.946731 kubelet[2419]: E0303 13:35:25.946623 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.005900 kubelet[2419]: E0303 13:35:26.005831 2419 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.157.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-7-599052a073?timeout=10s\": dial tcp 95.217.157.231:6443: connect: connection refused" interval="400ms"
Mar 3 13:35:26.081301 kubelet[2419]: I0303 13:35:26.081239 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5fb806e12fed151e952edd9e40abf303-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-7-599052a073\" (UID: \"5fb806e12fed151e952edd9e40abf303\") " pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081301 kubelet[2419]: I0303 13:35:26.081279 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5fb806e12fed151e952edd9e40abf303-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-7-599052a073\" (UID: \"5fb806e12fed151e952edd9e40abf303\") " pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081301 kubelet[2419]: I0303 13:35:26.081294 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5fb806e12fed151e952edd9e40abf303-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-7-599052a073\" (UID: \"5fb806e12fed151e952edd9e40abf303\") " pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081301 kubelet[2419]: I0303 13:35:26.081308 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081301 kubelet[2419]: I0303 13:35:26.081320 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0325d08d48090116b08f5fe4838d84e9-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-7-599052a073\" (UID: \"0325d08d48090116b08f5fe4838d84e9\") " pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081528 kubelet[2419]: I0303 13:35:26.081331 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081528 kubelet[2419]: I0303 13:35:26.081344 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081528 kubelet[2419]: I0303 13:35:26.081357 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.081528 kubelet[2419]: I0303 13:35:26.081368 2419 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.149475 kubelet[2419]: I0303 13:35:26.149420 2419 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.149935 kubelet[2419]: E0303 13:35:26.149863 2419 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.217.157.231:6443/api/v1/nodes\": dial tcp 95.217.157.231:6443: connect: connection refused" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.222951 containerd[1629]: time="2026-03-03T13:35:26.222692228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-7-599052a073,Uid:5fb806e12fed151e952edd9e40abf303,Namespace:kube-system,Attempt:0,}"
Mar 3 13:35:26.228581 containerd[1629]: time="2026-03-03T13:35:26.228107675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-7-599052a073,Uid:0580c5d8bfde85509b022740c9e9a5b4,Namespace:kube-system,Attempt:0,}"
Mar 3 13:35:26.259863 containerd[1629]: time="2026-03-03T13:35:26.259803104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-7-599052a073,Uid:0325d08d48090116b08f5fe4838d84e9,Namespace:kube-system,Attempt:0,}"
Mar 3 13:35:26.262811 containerd[1629]: time="2026-03-03T13:35:26.262782228Z" level=info msg="connecting to shim 1d6ff1fa6ab1e3af68201cf10a962ee1ea9efa67c838a5351389c24ccad3ce5c" address="unix:///run/containerd/s/09e3f86a83356e1ccb8aae326ebe1d64cb019b7f12dc81afa5cf5206812d5664" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:26.263220 containerd[1629]: time="2026-03-03T13:35:26.263182199Z" level=info msg="connecting to shim be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243" address="unix:///run/containerd/s/dd938fddc3d70fb126e1c39df700f62eb4490e05822b84d5290c03c066adc88c" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:26.296801 containerd[1629]: time="2026-03-03T13:35:26.296738691Z" level=info msg="connecting to shim 738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453" address="unix:///run/containerd/s/1a467ddfdd99ae933ead0d2a405f197da539f00084e93458368dddb3af09fad4" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:26.300093 systemd[1]: Started cri-containerd-1d6ff1fa6ab1e3af68201cf10a962ee1ea9efa67c838a5351389c24ccad3ce5c.scope - libcontainer container 1d6ff1fa6ab1e3af68201cf10a962ee1ea9efa67c838a5351389c24ccad3ce5c.
Mar 3 13:35:26.304056 systemd[1]: Started cri-containerd-be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243.scope - libcontainer container be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243.
Mar 3 13:35:26.323116 systemd[1]: Started cri-containerd-738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453.scope - libcontainer container 738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453.
Mar 3 13:35:26.371644 containerd[1629]: time="2026-03-03T13:35:26.371608914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-7-599052a073,Uid:0325d08d48090116b08f5fe4838d84e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453\""
Mar 3 13:35:26.376938 containerd[1629]: time="2026-03-03T13:35:26.376766621Z" level=info msg="CreateContainer within sandbox \"738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 3 13:35:26.380345 containerd[1629]: time="2026-03-03T13:35:26.380292475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-7-599052a073,Uid:5fb806e12fed151e952edd9e40abf303,Namespace:kube-system,Attempt:0,} returns sandbox id \"1d6ff1fa6ab1e3af68201cf10a962ee1ea9efa67c838a5351389c24ccad3ce5c\""
Mar 3 13:35:26.384413 containerd[1629]: time="2026-03-03T13:35:26.384373350Z" level=info msg="CreateContainer within sandbox \"1d6ff1fa6ab1e3af68201cf10a962ee1ea9efa67c838a5351389c24ccad3ce5c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 3 13:35:26.388938 containerd[1629]: time="2026-03-03T13:35:26.388367135Z" level=info msg="Container 3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:26.391762 containerd[1629]: time="2026-03-03T13:35:26.391730189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-7-599052a073,Uid:0580c5d8bfde85509b022740c9e9a5b4,Namespace:kube-system,Attempt:0,} returns sandbox id \"be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243\""
Mar 3 13:35:26.395510 containerd[1629]: time="2026-03-03T13:35:26.395494244Z" level=info msg="CreateContainer within sandbox \"be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 3 13:35:26.396149 containerd[1629]: time="2026-03-03T13:35:26.396133465Z" level=info msg="CreateContainer within sandbox \"738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d\""
Mar 3 13:35:26.396431 containerd[1629]: time="2026-03-03T13:35:26.396399785Z" level=info msg="Container eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:26.397107 containerd[1629]: time="2026-03-03T13:35:26.397007746Z" level=info msg="StartContainer for \"3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d\""
Mar 3 13:35:26.397966 containerd[1629]: time="2026-03-03T13:35:26.397950767Z" level=info msg="connecting to shim 3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d" address="unix:///run/containerd/s/1a467ddfdd99ae933ead0d2a405f197da539f00084e93458368dddb3af09fad4" protocol=ttrpc version=3
Mar 3 13:35:26.403724 containerd[1629]: time="2026-03-03T13:35:26.403698344Z" level=info msg="CreateContainer within sandbox \"1d6ff1fa6ab1e3af68201cf10a962ee1ea9efa67c838a5351389c24ccad3ce5c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a\""
Mar 3 13:35:26.404117 containerd[1629]: time="2026-03-03T13:35:26.404096285Z" level=info msg="StartContainer for \"eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a\""
Mar 3 13:35:26.404263 containerd[1629]: time="2026-03-03T13:35:26.404247395Z" level=info msg="Container 64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:26.405248 containerd[1629]: time="2026-03-03T13:35:26.405224926Z" level=info msg="connecting to shim eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a" address="unix:///run/containerd/s/09e3f86a83356e1ccb8aae326ebe1d64cb019b7f12dc81afa5cf5206812d5664" protocol=ttrpc version=3
Mar 3 13:35:26.407686 kubelet[2419]: E0303 13:35:26.407638 2419 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.157.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-7-599052a073?timeout=10s\": dial tcp 95.217.157.231:6443: connect: connection refused" interval="800ms"
Mar 3 13:35:26.416574 containerd[1629]: time="2026-03-03T13:35:26.416497610Z" level=info msg="CreateContainer within sandbox \"be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386\""
Mar 3 13:35:26.417619 containerd[1629]: time="2026-03-03T13:35:26.417107051Z" level=info msg="StartContainer for \"64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386\""
Mar 3 13:35:26.418104 systemd[1]: Started cri-containerd-3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d.scope - libcontainer container 3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d.
Mar 3 13:35:26.418994 containerd[1629]: time="2026-03-03T13:35:26.418936353Z" level=info msg="connecting to shim 64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386" address="unix:///run/containerd/s/dd938fddc3d70fb126e1c39df700f62eb4490e05822b84d5290c03c066adc88c" protocol=ttrpc version=3
Mar 3 13:35:26.430074 systemd[1]: Started cri-containerd-eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a.scope - libcontainer container eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a.
Mar 3 13:35:26.444014 systemd[1]: Started cri-containerd-64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386.scope - libcontainer container 64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386.
Mar 3 13:35:26.504762 containerd[1629]: time="2026-03-03T13:35:26.504724071Z" level=info msg="StartContainer for \"3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d\" returns successfully"
Mar 3 13:35:26.514235 containerd[1629]: time="2026-03-03T13:35:26.514151282Z" level=info msg="StartContainer for \"eb975c2b907da10d771b7d8de0b89fc34e99a0690e400ce1ac08a12f2706cc7a\" returns successfully"
Mar 3 13:35:26.530334 containerd[1629]: time="2026-03-03T13:35:26.530150912Z" level=info msg="StartContainer for \"64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386\" returns successfully"
Mar 3 13:35:26.552245 kubelet[2419]: I0303 13:35:26.552214 2419 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.552514 kubelet[2419]: E0303 13:35:26.552493 2419 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.217.157.231:6443/api/v1/nodes\": dial tcp 95.217.157.231:6443: connect: connection refused" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.814609 kubelet[2419]: E0303 13:35:26.814518 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.817302 kubelet[2419]: E0303 13:35:26.817281 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:26.818661 kubelet[2419]: E0303 13:35:26.818639 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.355400 kubelet[2419]: I0303 13:35:27.355367 2419 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.737491 kubelet[2419]: E0303 13:35:27.737374 2419 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.758996 kubelet[2419]: I0303 13:35:27.758962 2419 apiserver.go:52] "Watching apiserver"
Mar 3 13:35:27.780158 kubelet[2419]: I0303 13:35:27.780124 2419 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 3 13:35:27.820856 kubelet[2419]: E0303 13:35:27.820739 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.820856 kubelet[2419]: E0303 13:35:27.820845 2419 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-7-599052a073\" not found" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.842120 kubelet[2419]: I0303 13:35:27.842084 2419 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.885346 kubelet[2419]: I0303 13:35:27.885311 2419 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.894937 kubelet[2419]: E0303 13:35:27.892816 2419 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-7-599052a073\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.895248 kubelet[2419]: I0303 13:35:27.895077 2419 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.898155 kubelet[2419]: E0303 13:35:27.898121 2419 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-7-599052a073\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.898352 kubelet[2419]: I0303 13:35:27.898232 2419 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.903238 kubelet[2419]: E0303 13:35:27.903219 2419 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-7-599052a073\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.973796 kubelet[2419]: I0303 13:35:27.973768 2419 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:27.975430 kubelet[2419]: E0303 13:35:27.975387 2419 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-7-599052a073\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:29.709930 systemd[1]: Reload requested from client PID 2699 ('systemctl') (unit session-7.scope)...
Mar 3 13:35:29.709947 systemd[1]: Reloading...
Mar 3 13:35:29.792931 zram_generator::config[2740]: No configuration found.
Mar 3 13:35:29.984626 systemd[1]: Reloading finished in 274 ms.
Mar 3 13:35:30.018209 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:30.042101 systemd[1]: kubelet.service: Deactivated successfully.
Mar 3 13:35:30.042364 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:30.042423 systemd[1]: kubelet.service: Consumed 858ms CPU time, 131.2M memory peak.
Mar 3 13:35:30.045133 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:30.222114 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:30.231322 (kubelet)[2794]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 3 13:35:30.269760 kubelet[2794]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 3 13:35:30.269760 kubelet[2794]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 3 13:35:30.269760 kubelet[2794]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 3 13:35:30.270371 kubelet[2794]: I0303 13:35:30.269811 2794 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 3 13:35:30.277930 kubelet[2794]: I0303 13:35:30.277868 2794 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 3 13:35:30.277930 kubelet[2794]: I0303 13:35:30.277886 2794 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 3 13:35:30.278406 kubelet[2794]: I0303 13:35:30.278394 2794 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 3 13:35:30.280646 kubelet[2794]: I0303 13:35:30.280605 2794 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 3 13:35:30.283060 kubelet[2794]: I0303 13:35:30.283024 2794 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 3 13:35:30.287501 kubelet[2794]: I0303 13:35:30.287458 2794 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 3 13:35:30.291794 kubelet[2794]: I0303 13:35:30.291731 2794 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 3 13:35:30.292059 kubelet[2794]: I0303 13:35:30.292029 2794 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 3 13:35:30.292235 kubelet[2794]: I0303 13:35:30.292111 2794 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-7-599052a073","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 3 13:35:30.292336 kubelet[2794]: I0303 13:35:30.292328 2794 topology_manager.go:138] "Creating topology manager with none policy"
Mar 3 13:35:30.292370 kubelet[2794]: I0303 13:35:30.292365 2794 container_manager_linux.go:303] "Creating device plugin manager"
Mar 3 13:35:30.292440 kubelet[2794]: I0303 13:35:30.292434 2794 state_mem.go:36] "Initialized new in-memory state store"
Mar 3 13:35:30.292626 kubelet[2794]: I0303 13:35:30.292618 2794 kubelet.go:480] "Attempting to sync node with API server"
Mar 3 13:35:30.292685 kubelet[2794]: I0303 13:35:30.292679 2794 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 3 13:35:30.292731 kubelet[2794]: I0303 13:35:30.292726 2794 kubelet.go:386] "Adding apiserver pod source"
Mar 3 13:35:30.292773 kubelet[2794]: I0303 13:35:30.292767 2794 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 3 13:35:30.296013 kubelet[2794]: I0303 13:35:30.295927 2794 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 3 13:35:30.296433 kubelet[2794]: I0303 13:35:30.296419 2794 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 3 13:35:30.301299 kubelet[2794]: I0303 13:35:30.301279 2794 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 3 13:35:30.302496 kubelet[2794]: I0303 13:35:30.302170 2794 server.go:1289] "Started kubelet"
Mar 3 13:35:30.304788 kubelet[2794]: I0303 13:35:30.304567 2794 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 3 13:35:30.310233 kubelet[2794]: I0303 13:35:30.310192 2794 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 3 13:35:30.310484 kubelet[2794]: I0303 13:35:30.310466 2794 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 3 13:35:30.313726 kubelet[2794]: I0303 13:35:30.312297 2794 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 3 13:35:30.315444 kubelet[2794]: I0303 13:35:30.315405 2794 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 3 13:35:30.317234 kubelet[2794]: I0303 13:35:30.317216 2794 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 3 13:35:30.317780 kubelet[2794]: I0303 13:35:30.317767 2794 server.go:317] "Adding debug handlers to kubelet server"
Mar 3 13:35:30.319312 kubelet[2794]: I0303 13:35:30.319300 2794 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 3 13:35:30.319584 kubelet[2794]: I0303 13:35:30.319572 2794 reconciler.go:26] "Reconciler: start to sync state"
Mar 3 13:35:30.322011 kubelet[2794]: I0303 13:35:30.321802 2794 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 3 13:35:30.325341 kubelet[2794]: I0303 13:35:30.325279 2794 factory.go:223] Registration of the containerd container factory successfully
Mar 3 13:35:30.325521 kubelet[2794]: I0303 13:35:30.325509 2794 factory.go:223] Registration of the systemd container factory successfully
Mar 3 13:35:30.332432 kubelet[2794]: I0303 13:35:30.332105 2794 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 3 13:35:30.333204 kubelet[2794]: I0303 13:35:30.333184 2794 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 3 13:35:30.333204 kubelet[2794]: I0303 13:35:30.333202 2794 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 3 13:35:30.333280 kubelet[2794]: I0303 13:35:30.333218 2794 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 3 13:35:30.333280 kubelet[2794]: I0303 13:35:30.333224 2794 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 3 13:35:30.333280 kubelet[2794]: E0303 13:35:30.333273 2794 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 3 13:35:30.387804 kubelet[2794]: I0303 13:35:30.387774 2794 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 3 13:35:30.388103 kubelet[2794]: I0303 13:35:30.387975 2794 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 3 13:35:30.388103 kubelet[2794]: I0303 13:35:30.387994 2794 state_mem.go:36] "Initialized new in-memory state store"
Mar 3 13:35:30.388221 kubelet[2794]: I0303 13:35:30.388211 2794 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 3 13:35:30.388273 kubelet[2794]: I0303 13:35:30.388258 2794 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 3 13:35:30.388304 kubelet[2794]: I0303 13:35:30.388299 2794 policy_none.go:49] "None policy: Start"
Mar 3 13:35:30.388335 kubelet[2794]: I0303 13:35:30.388330 2794 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 3 13:35:30.388367 kubelet[2794]: I0303 13:35:30.388362 2794 state_mem.go:35] "Initializing new in-memory state store"
Mar 3 13:35:30.388934 kubelet[2794]: I0303 13:35:30.388485 2794 state_mem.go:75] "Updated machine memory state"
Mar 3 13:35:30.392315 kubelet[2794]: E0303 13:35:30.392245 2794 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 3 13:35:30.392451 kubelet[2794]: I0303 13:35:30.392430 2794 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 3 13:35:30.392483 kubelet[2794]: I0303 13:35:30.392452 2794 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 3 13:35:30.393006 kubelet[2794]: I0303 13:35:30.392935 2794 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 3 13:35:30.394984 kubelet[2794]: E0303 13:35:30.394964 2794 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 3 13:35:30.435624 kubelet[2794]: I0303 13:35:30.434937 2794 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.435931 kubelet[2794]: I0303 13:35:30.435919 2794 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.436545 kubelet[2794]: I0303 13:35:30.436279 2794 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.502432 kubelet[2794]: I0303 13:35:30.502385 2794 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.509592 kubelet[2794]: I0303 13:35:30.509514 2794 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.509592 kubelet[2794]: I0303 13:35:30.509582 2794 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.521854 kubelet[2794]: I0303 13:35:30.520847 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.521854 kubelet[2794]: I0303 13:35:30.520874 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.521854 kubelet[2794]: I0303 13:35:30.520895 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.521854 kubelet[2794]: I0303 13:35:30.520933 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.521854 kubelet[2794]: I0303 13:35:30.520986 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0580c5d8bfde85509b022740c9e9a5b4-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-7-599052a073\" (UID: \"0580c5d8bfde85509b022740c9e9a5b4\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.522084 kubelet[2794]: I0303 13:35:30.521713 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0325d08d48090116b08f5fe4838d84e9-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-7-599052a073\" (UID: \"0325d08d48090116b08f5fe4838d84e9\") " pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.522084 kubelet[2794]: I0303 13:35:30.521738 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5fb806e12fed151e952edd9e40abf303-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-7-599052a073\" (UID: \"5fb806e12fed151e952edd9e40abf303\") " pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.522084 kubelet[2794]: I0303 13:35:30.521749 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5fb806e12fed151e952edd9e40abf303-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-7-599052a073\" (UID: \"5fb806e12fed151e952edd9e40abf303\") " pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:30.522084 kubelet[2794]: I0303 13:35:30.521760 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5fb806e12fed151e952edd9e40abf303-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-7-599052a073\" (UID: \"5fb806e12fed151e952edd9e40abf303\") " pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:31.294193 kubelet[2794]: I0303 13:35:31.294154 2794 apiserver.go:52] "Watching apiserver"
Mar 3 13:35:31.317850 kubelet[2794]: I0303 13:35:31.317811 2794 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 3 13:35:31.366524 kubelet[2794]: I0303 13:35:31.366485 2794 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:31.367208 kubelet[2794]: I0303 13:35:31.367101 2794 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:31.374473 kubelet[2794]: E0303 13:35:31.374434 2794 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-7-599052a073\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073"
Mar 3 13:35:31.375081 kubelet[2794]: E0303 13:35:31.375034 2794 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-7-599052a073\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073"
Mar 3 13:35:31.389520 kubelet[2794]: I0303 13:35:31.389023 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-7-599052a073" podStartSLOduration=1.388993366 podStartE2EDuration="1.388993366s" podCreationTimestamp="2026-03-03 13:35:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:31.387800434 +0000 UTC m=+1.150918689" watchObservedRunningTime="2026-03-03 13:35:31.388993366 +0000 UTC m=+1.152111631"
Mar 3 13:35:31.405996 kubelet[2794]: I0303 13:35:31.405736 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-7-599052a073" podStartSLOduration=1.4057189860000001 podStartE2EDuration="1.405718986s" podCreationTimestamp="2026-03-03 13:35:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:31.396282065 +0000 UTC m=+1.159400330" watchObservedRunningTime="2026-03-03 13:35:31.405718986 +0000 UTC m=+1.168837241"
Mar 3 13:35:36.008450 kubelet[2794]: I0303 13:35:36.008414 2794 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 3 13:35:36.009035 containerd[1629]: time="2026-03-03T13:35:36.008668020Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 3 13:35:36.009613 kubelet[2794]: I0303 13:35:36.009405 2794 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 3 13:35:36.570804 kubelet[2794]: I0303 13:35:36.569319 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073" podStartSLOduration=6.569290861 podStartE2EDuration="6.569290861s" podCreationTimestamp="2026-03-03 13:35:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:31.406278287 +0000 UTC m=+1.169396542" watchObservedRunningTime="2026-03-03 13:35:36.569290861 +0000 UTC m=+6.332409156"
Mar 3 13:35:36.593810 systemd[1]: Created slice kubepods-besteffort-podd72bd89b_13a3_4355_a166_71e5f129d471.slice - libcontainer container kubepods-besteffort-podd72bd89b_13a3_4355_a166_71e5f129d471.slice.
Mar 3 13:35:36.664564 kubelet[2794]: I0303 13:35:36.664510 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d72bd89b-13a3-4355-a166-71e5f129d471-kube-proxy\") pod \"kube-proxy-l7brq\" (UID: \"d72bd89b-13a3-4355-a166-71e5f129d471\") " pod="kube-system/kube-proxy-l7brq"
Mar 3 13:35:36.664564 kubelet[2794]: I0303 13:35:36.664554 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d72bd89b-13a3-4355-a166-71e5f129d471-xtables-lock\") pod \"kube-proxy-l7brq\" (UID: \"d72bd89b-13a3-4355-a166-71e5f129d471\") " pod="kube-system/kube-proxy-l7brq"
Mar 3 13:35:36.664564 kubelet[2794]: I0303 13:35:36.664569 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d72bd89b-13a3-4355-a166-71e5f129d471-lib-modules\") pod \"kube-proxy-l7brq\" (UID: \"d72bd89b-13a3-4355-a166-71e5f129d471\") " pod="kube-system/kube-proxy-l7brq"
Mar 3 13:35:36.664763 kubelet[2794]: I0303 13:35:36.664587 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnpz\" (UniqueName: \"kubernetes.io/projected/d72bd89b-13a3-4355-a166-71e5f129d471-kube-api-access-4wnpz\") pod \"kube-proxy-l7brq\" (UID: \"d72bd89b-13a3-4355-a166-71e5f129d471\") " pod="kube-system/kube-proxy-l7brq"
Mar 3 13:35:36.771083 kubelet[2794]: E0303 13:35:36.771024 2794 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Mar 3 13:35:36.771083 kubelet[2794]: E0303 13:35:36.771057 2794 projected.go:194] Error preparing data for projected volume kube-api-access-4wnpz for pod kube-system/kube-proxy-l7brq: configmap "kube-root-ca.crt" not found
Mar 3 13:35:36.771253 kubelet[2794]: E0303 13:35:36.771130 2794 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d72bd89b-13a3-4355-a166-71e5f129d471-kube-api-access-4wnpz podName:d72bd89b-13a3-4355-a166-71e5f129d471 nodeName:}" failed. No retries permitted until 2026-03-03 13:35:37.271110013 +0000 UTC m=+7.034228278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4wnpz" (UniqueName: "kubernetes.io/projected/d72bd89b-13a3-4355-a166-71e5f129d471-kube-api-access-4wnpz") pod "kube-proxy-l7brq" (UID: "d72bd89b-13a3-4355-a166-71e5f129d471") : configmap "kube-root-ca.crt" not found
Mar 3 13:35:37.196496 systemd[1]: Created slice kubepods-besteffort-podd3509e65_77d3_4f0e_9d9a_8b47fdce9585.slice - libcontainer container kubepods-besteffort-podd3509e65_77d3_4f0e_9d9a_8b47fdce9585.slice.
Mar 3 13:35:37.269940 kubelet[2794]: I0303 13:35:37.269834 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66hw\" (UniqueName: \"kubernetes.io/projected/d3509e65-77d3-4f0e-9d9a-8b47fdce9585-kube-api-access-d66hw\") pod \"tigera-operator-6bf85f8dd-vgkz5\" (UID: \"d3509e65-77d3-4f0e-9d9a-8b47fdce9585\") " pod="tigera-operator/tigera-operator-6bf85f8dd-vgkz5"
Mar 3 13:35:37.269940 kubelet[2794]: I0303 13:35:37.269867 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d3509e65-77d3-4f0e-9d9a-8b47fdce9585-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-vgkz5\" (UID: \"d3509e65-77d3-4f0e-9d9a-8b47fdce9585\") " pod="tigera-operator/tigera-operator-6bf85f8dd-vgkz5"
Mar 3 13:35:37.503020 containerd[1629]: time="2026-03-03T13:35:37.502804847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-l7brq,Uid:d72bd89b-13a3-4355-a166-71e5f129d471,Namespace:kube-system,Attempt:0,}"
Mar 3 13:35:37.503020 containerd[1629]: time="2026-03-03T13:35:37.502831717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-vgkz5,Uid:d3509e65-77d3-4f0e-9d9a-8b47fdce9585,Namespace:tigera-operator,Attempt:0,}"
Mar 3 13:35:37.534358 containerd[1629]: time="2026-03-03T13:35:37.534234697Z" level=info msg="connecting to shim 5d6f16039b779fa10b1625bc096c565d1366850895ae2f068763f07136017fcd" address="unix:///run/containerd/s/c744968a2f65a2792a44ab2c19df3c4fae08abe3be9bdd8771dbbc6c7f856804" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:37.536634 containerd[1629]: time="2026-03-03T13:35:37.536606980Z" level=info msg="connecting to shim bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686" address="unix:///run/containerd/s/621d83b8f69a5b0bdc3c06d9def17c77797ffdb5b9804d0aae16f089ad672871" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:37.562505 systemd[1]: Started cri-containerd-5d6f16039b779fa10b1625bc096c565d1366850895ae2f068763f07136017fcd.scope - libcontainer container 5d6f16039b779fa10b1625bc096c565d1366850895ae2f068763f07136017fcd.
Mar 3 13:35:37.567971 systemd[1]: Started cri-containerd-bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686.scope - libcontainer container bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686.
Mar 3 13:35:37.592659 containerd[1629]: time="2026-03-03T13:35:37.592594350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-l7brq,Uid:d72bd89b-13a3-4355-a166-71e5f129d471,Namespace:kube-system,Attempt:0,} returns sandbox id \"5d6f16039b779fa10b1625bc096c565d1366850895ae2f068763f07136017fcd\""
Mar 3 13:35:37.600900 containerd[1629]: time="2026-03-03T13:35:37.600767370Z" level=info msg="CreateContainer within sandbox \"5d6f16039b779fa10b1625bc096c565d1366850895ae2f068763f07136017fcd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 3 13:35:37.617316 containerd[1629]: time="2026-03-03T13:35:37.616980440Z" level=info msg="Container 87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:37.623622 containerd[1629]: time="2026-03-03T13:35:37.623585268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-vgkz5,Uid:d3509e65-77d3-4f0e-9d9a-8b47fdce9585,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686\""
Mar 3 13:35:37.624372 containerd[1629]: time="2026-03-03T13:35:37.624350109Z" level=info msg="CreateContainer within sandbox \"5d6f16039b779fa10b1625bc096c565d1366850895ae2f068763f07136017fcd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e\""
Mar 3 13:35:37.625195 containerd[1629]: time="2026-03-03T13:35:37.625151000Z" level=info msg="StartContainer for \"87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e\""
Mar 3 13:35:37.626263 containerd[1629]: time="2026-03-03T13:35:37.626130132Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 3 13:35:37.628119 containerd[1629]: time="2026-03-03T13:35:37.628095164Z" level=info msg="connecting to shim 87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e" address="unix:///run/containerd/s/c744968a2f65a2792a44ab2c19df3c4fae08abe3be9bdd8771dbbc6c7f856804" protocol=ttrpc version=3
Mar 3 13:35:37.652110 systemd[1]: Started cri-containerd-87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e.scope - libcontainer container 87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e.
Mar 3 13:35:37.735462 containerd[1629]: time="2026-03-03T13:35:37.735410998Z" level=info msg="StartContainer for \"87b55214c25fa6ebfaa82307addf03b9107be4a090488ea076a667b853d01d2e\" returns successfully"
Mar 3 13:35:38.732093 kubelet[2794]: I0303 13:35:38.732017 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-l7brq" podStartSLOduration=2.732000894 podStartE2EDuration="2.732000894s" podCreationTimestamp="2026-03-03 13:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:38.407303018 +0000 UTC m=+8.170421313" watchObservedRunningTime="2026-03-03 13:35:38.732000894 +0000 UTC m=+8.495119149"
Mar 3 13:35:39.368706 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1282853674.mount: Deactivated successfully.
Mar 3 13:35:39.966435 containerd[1629]: time="2026-03-03T13:35:39.966392007Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:39.967479 containerd[1629]: time="2026-03-03T13:35:39.967452938Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156"
Mar 3 13:35:39.968532 containerd[1629]: time="2026-03-03T13:35:39.968498279Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:39.970614 containerd[1629]: time="2026-03-03T13:35:39.970534062Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:39.971306 containerd[1629]: time="2026-03-03T13:35:39.970798182Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.34464874s"
Mar 3 13:35:39.971306 containerd[1629]: time="2026-03-03T13:35:39.970819842Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\""
Mar 3 13:35:39.975221 containerd[1629]: time="2026-03-03T13:35:39.975195158Z" level=info msg="CreateContainer within sandbox \"bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 3 13:35:39.985744 containerd[1629]: time="2026-03-03T13:35:39.985315530Z" level=info msg="Container 0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:39.997618 containerd[1629]: time="2026-03-03T13:35:39.997571016Z" level=info msg="CreateContainer within sandbox \"bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\""
Mar 3 13:35:39.998193 containerd[1629]: time="2026-03-03T13:35:39.998152576Z" level=info msg="StartContainer for \"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\""
Mar 3 13:35:39.999172 containerd[1629]: time="2026-03-03T13:35:39.999121428Z" level=info msg="connecting to shim 0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484" address="unix:///run/containerd/s/621d83b8f69a5b0bdc3c06d9def17c77797ffdb5b9804d0aae16f089ad672871" protocol=ttrpc version=3
Mar 3 13:35:40.018035 systemd[1]: Started cri-containerd-0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484.scope - libcontainer container 0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484.
Mar 3 13:35:40.042101 containerd[1629]: time="2026-03-03T13:35:40.041976831Z" level=info msg="StartContainer for \"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\" returns successfully"
Mar 3 13:35:41.204314 kubelet[2794]: I0303 13:35:41.204233 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-vgkz5" podStartSLOduration=1.858166662 podStartE2EDuration="4.204074894s" podCreationTimestamp="2026-03-03 13:35:37 +0000 UTC" firstStartedPulling="2026-03-03 13:35:37.625570731 +0000 UTC m=+7.388688996" lastFinishedPulling="2026-03-03 13:35:39.971478973 +0000 UTC m=+9.734597228" observedRunningTime="2026-03-03 13:35:40.425974431 +0000 UTC m=+10.189092726" watchObservedRunningTime="2026-03-03 13:35:41.204074894 +0000 UTC m=+10.967193179"
Mar 3 13:35:45.106400 sudo[1847]: pam_unix(sudo:session): session closed for user root
Mar 3 13:35:45.226285 sshd[1846]: Connection closed by 20.161.92.111 port 35416
Mar 3 13:35:45.226812 sshd-session[1843]: pam_unix(sshd:session): session closed for user core
Mar 3 13:35:45.231653 systemd[1]: sshd@6-95.217.157.231:22-20.161.92.111:35416.service: Deactivated successfully.
Mar 3 13:35:45.235210 systemd[1]: session-7.scope: Deactivated successfully.
Mar 3 13:35:45.235600 systemd[1]: session-7.scope: Consumed 3.915s CPU time, 230.5M memory peak.
Mar 3 13:35:45.238111 systemd-logind[1601]: Session 7 logged out. Waiting for processes to exit.
Mar 3 13:35:45.241179 systemd-logind[1601]: Removed session 7.
Mar 3 13:35:46.943686 systemd[1]: Created slice kubepods-besteffort-pode9b5a8fa_ea6e_499d_a811_5f78d88e426b.slice - libcontainer container kubepods-besteffort-pode9b5a8fa_ea6e_499d_a811_5f78d88e426b.slice.
Mar 3 13:35:47.012787 systemd[1]: Created slice kubepods-besteffort-pod58cf811b_3c9a_4e4f_a809_bfef7ba669a9.slice - libcontainer container kubepods-besteffort-pod58cf811b_3c9a_4e4f_a809_bfef7ba669a9.slice.
Mar 3 13:35:47.037057 kubelet[2794]: I0303 13:35:47.037022 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-var-run-calico\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037622 kubelet[2794]: I0303 13:35:47.037488 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-flexvol-driver-host\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037622 kubelet[2794]: I0303 13:35:47.037509 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-sys-fs\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037622 kubelet[2794]: I0303 13:35:47.037526 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjj49\" (UniqueName: \"kubernetes.io/projected/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-kube-api-access-fjj49\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037622 kubelet[2794]: I0303 13:35:47.037573 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-xtables-lock\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037622 kubelet[2794]: I0303 13:35:47.037584 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-cni-net-dir\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037861 kubelet[2794]: I0303 13:35:47.037595 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-node-certs\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037861 kubelet[2794]: I0303 13:35:47.037607 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-cni-log-dir\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037861 kubelet[2794]: I0303 13:35:47.037793 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46c97\" (UniqueName: \"kubernetes.io/projected/e9b5a8fa-ea6e-499d-a811-5f78d88e426b-kube-api-access-46c97\") pod \"calico-typha-569f9cb8c7-4sbwp\" (UID: \"e9b5a8fa-ea6e-499d-a811-5f78d88e426b\") " pod="calico-system/calico-typha-569f9cb8c7-4sbwp"
Mar 3 13:35:47.037861 kubelet[2794]: I0303 13:35:47.037806 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-cni-bin-dir\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.037861 kubelet[2794]: I0303 13:35:47.037818 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-tigera-ca-bundle\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.038055 kubelet[2794]: I0303 13:35:47.038003 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-lib-modules\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.038055 kubelet[2794]: I0303 13:35:47.038020 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9b5a8fa-ea6e-499d-a811-5f78d88e426b-tigera-ca-bundle\") pod \"calico-typha-569f9cb8c7-4sbwp\" (UID: \"e9b5a8fa-ea6e-499d-a811-5f78d88e426b\") " pod="calico-system/calico-typha-569f9cb8c7-4sbwp"
Mar 3 13:35:47.038055 kubelet[2794]: I0303 13:35:47.038034 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e9b5a8fa-ea6e-499d-a811-5f78d88e426b-typha-certs\") pod \"calico-typha-569f9cb8c7-4sbwp\" (UID: \"e9b5a8fa-ea6e-499d-a811-5f78d88e426b\") " pod="calico-system/calico-typha-569f9cb8c7-4sbwp"
Mar 3 13:35:47.038173 kubelet[2794]: I0303 13:35:47.038164 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-bpffs\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.038294 kubelet[2794]: I0303 13:35:47.038238 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-policysync\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.038294 kubelet[2794]: I0303 13:35:47.038252 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-nodeproc\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.038294 kubelet[2794]: I0303 13:35:47.038262 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/58cf811b-3c9a-4e4f-a809-bfef7ba669a9-var-lib-calico\") pod \"calico-node-r47qv\" (UID: \"58cf811b-3c9a-4e4f-a809-bfef7ba669a9\") " pod="calico-system/calico-node-r47qv"
Mar 3 13:35:47.115526 kubelet[2794]: E0303 13:35:47.115481 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f"
Mar 3 13:35:47.144315 kubelet[2794]: E0303 13:35:47.143947 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.144315 kubelet[2794]: W0303 13:35:47.143969 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.144315 kubelet[2794]: E0303 13:35:47.143993 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.144512 kubelet[2794]: E0303 13:35:47.144330 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.144512 kubelet[2794]: W0303 13:35:47.144340 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.144512 kubelet[2794]: E0303 13:35:47.144351 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.144736 kubelet[2794]: E0303 13:35:47.144691 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.144736 kubelet[2794]: W0303 13:35:47.144704 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.144736 kubelet[2794]: E0303 13:35:47.144712 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.147024 kubelet[2794]: E0303 13:35:47.147005 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.147024 kubelet[2794]: W0303 13:35:47.147018 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.147303 kubelet[2794]: E0303 13:35:47.147028 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.152002 kubelet[2794]: E0303 13:35:47.151966 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.152002 kubelet[2794]: W0303 13:35:47.151982 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.152002 kubelet[2794]: E0303 13:35:47.151996 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.158923 kubelet[2794]: E0303 13:35:47.155977 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.158923 kubelet[2794]: W0303 13:35:47.155989 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.158923 kubelet[2794]: E0303 13:35:47.156002 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.165012 kubelet[2794]: E0303 13:35:47.164984 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.165012 kubelet[2794]: W0303 13:35:47.165006 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.165117 kubelet[2794]: E0303 13:35:47.165021 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.181130 kubelet[2794]: E0303 13:35:47.180991 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.181130 kubelet[2794]: W0303 13:35:47.181009 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.181130 kubelet[2794]: E0303 13:35:47.181025 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.212984 kubelet[2794]: E0303 13:35:47.212876 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.212984 kubelet[2794]: W0303 13:35:47.212899 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.212984 kubelet[2794]: E0303 13:35:47.212938 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.213968 kubelet[2794]: E0303 13:35:47.213950 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.213968 kubelet[2794]: W0303 13:35:47.213964 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.214037 kubelet[2794]: E0303 13:35:47.213975 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.214226 kubelet[2794]: E0303 13:35:47.214207 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.214226 kubelet[2794]: W0303 13:35:47.214220 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.214226 kubelet[2794]: E0303 13:35:47.214229 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.214459 kubelet[2794]: E0303 13:35:47.214443 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.214459 kubelet[2794]: W0303 13:35:47.214453 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.214524 kubelet[2794]: E0303 13:35:47.214462 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.214643 kubelet[2794]: E0303 13:35:47.214616 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.214643 kubelet[2794]: W0303 13:35:47.214624 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.214643 kubelet[2794]: E0303 13:35:47.214631 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.214799 kubelet[2794]: E0303 13:35:47.214789 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.214799 kubelet[2794]: W0303 13:35:47.214797 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.214881 kubelet[2794]: E0303 13:35:47.214813 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.215022 kubelet[2794]: E0303 13:35:47.215009 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.215022 kubelet[2794]: W0303 13:35:47.215019 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.215066 kubelet[2794]: E0303 13:35:47.215035 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.215219 kubelet[2794]: E0303 13:35:47.215204 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.215219 kubelet[2794]: W0303 13:35:47.215213 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.215267 kubelet[2794]: E0303 13:35:47.215219 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.215384 kubelet[2794]: E0303 13:35:47.215370 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.215384 kubelet[2794]: W0303 13:35:47.215379 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.215418 kubelet[2794]: E0303 13:35:47.215386 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.215538 kubelet[2794]: E0303 13:35:47.215525 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.215538 kubelet[2794]: W0303 13:35:47.215534 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.215579 kubelet[2794]: E0303 13:35:47.215539 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.215696 kubelet[2794]: E0303 13:35:47.215683 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.215696 kubelet[2794]: W0303 13:35:47.215692 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.215733 kubelet[2794]: E0303 13:35:47.215698 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.215855 kubelet[2794]: E0303 13:35:47.215842 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.215855 kubelet[2794]: W0303 13:35:47.215850 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.215894 kubelet[2794]: E0303 13:35:47.215855 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.216034 kubelet[2794]: E0303 13:35:47.216023 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.216034 kubelet[2794]: W0303 13:35:47.216032 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.216076 kubelet[2794]: E0303 13:35:47.216038 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.216207 kubelet[2794]: E0303 13:35:47.216197 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.216207 kubelet[2794]: W0303 13:35:47.216205 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.216252 kubelet[2794]: E0303 13:35:47.216211 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.216367 kubelet[2794]: E0303 13:35:47.216354 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.216367 kubelet[2794]: W0303 13:35:47.216362 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.216404 kubelet[2794]: E0303 13:35:47.216368 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.216519 kubelet[2794]: E0303 13:35:47.216507 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.216519 kubelet[2794]: W0303 13:35:47.216515 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.216563 kubelet[2794]: E0303 13:35:47.216521 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.216692 kubelet[2794]: E0303 13:35:47.216679 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.216692 kubelet[2794]: W0303 13:35:47.216687 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.216732 kubelet[2794]: E0303 13:35:47.216693 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.216847 kubelet[2794]: E0303 13:35:47.216833 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.216847 kubelet[2794]: W0303 13:35:47.216843 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.216884 kubelet[2794]: E0303 13:35:47.216848 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.217283 kubelet[2794]: E0303 13:35:47.217121 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.217283 kubelet[2794]: W0303 13:35:47.217128 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.217283 kubelet[2794]: E0303 13:35:47.217135 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.217364 kubelet[2794]: E0303 13:35:47.217349 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.217364 kubelet[2794]: W0303 13:35:47.217358 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.217364 kubelet[2794]: E0303 13:35:47.217364 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.240248 kubelet[2794]: E0303 13:35:47.240211 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.240248 kubelet[2794]: W0303 13:35:47.240236 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.240248 kubelet[2794]: E0303 13:35:47.240255 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.240410 kubelet[2794]: I0303 13:35:47.240283 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7910b7ac-0de5-4bb6-95b6-143b053aaa8f-kubelet-dir\") pod \"csi-node-driver-jzlgk\" (UID: \"7910b7ac-0de5-4bb6-95b6-143b053aaa8f\") " pod="calico-system/csi-node-driver-jzlgk"
Mar 3 13:35:47.240483 kubelet[2794]: E0303 13:35:47.240465 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.240483 kubelet[2794]: W0303 13:35:47.240476 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.240568 kubelet[2794]: E0303 13:35:47.240483 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.240568 kubelet[2794]: I0303 13:35:47.240527 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7910b7ac-0de5-4bb6-95b6-143b053aaa8f-socket-dir\") pod \"csi-node-driver-jzlgk\" (UID: \"7910b7ac-0de5-4bb6-95b6-143b053aaa8f\") " pod="calico-system/csi-node-driver-jzlgk"
Mar 3 13:35:47.240734 kubelet[2794]: E0303 13:35:47.240714 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.240734 kubelet[2794]: W0303 13:35:47.240726 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.240734 kubelet[2794]: E0303 13:35:47.240734 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.240844 kubelet[2794]: I0303 13:35:47.240750 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7910b7ac-0de5-4bb6-95b6-143b053aaa8f-registration-dir\") pod \"csi-node-driver-jzlgk\" (UID: \"7910b7ac-0de5-4bb6-95b6-143b053aaa8f\") " pod="calico-system/csi-node-driver-jzlgk"
Mar 3 13:35:47.240977 kubelet[2794]: E0303 13:35:47.240961 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.240977 kubelet[2794]: W0303 13:35:47.240972 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.241049 kubelet[2794]: E0303 13:35:47.240979 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.241049 kubelet[2794]: I0303 13:35:47.241001 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7910b7ac-0de5-4bb6-95b6-143b053aaa8f-varrun\") pod \"csi-node-driver-jzlgk\" (UID: \"7910b7ac-0de5-4bb6-95b6-143b053aaa8f\") " pod="calico-system/csi-node-driver-jzlgk"
Mar 3 13:35:47.241229 kubelet[2794]: E0303 13:35:47.241215 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.241229 kubelet[2794]: W0303 13:35:47.241224 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.241229 kubelet[2794]: E0303 13:35:47.241230 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.241292 kubelet[2794]: I0303 13:35:47.241250 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbj9t\" (UniqueName: \"kubernetes.io/projected/7910b7ac-0de5-4bb6-95b6-143b053aaa8f-kube-api-access-rbj9t\") pod \"csi-node-driver-jzlgk\" (UID: \"7910b7ac-0de5-4bb6-95b6-143b053aaa8f\") " pod="calico-system/csi-node-driver-jzlgk"
Mar 3 13:35:47.241469 kubelet[2794]: E0303 13:35:47.241455 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.241469 kubelet[2794]: W0303 13:35:47.241464 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.241511 kubelet[2794]: E0303 13:35:47.241470 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.241693 kubelet[2794]: E0303 13:35:47.241680 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.241693 kubelet[2794]: W0303 13:35:47.241688 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.241730 kubelet[2794]: E0303 13:35:47.241695 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.241920 kubelet[2794]: E0303 13:35:47.241888 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.241948 kubelet[2794]: W0303 13:35:47.241920 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.241948 kubelet[2794]: E0303 13:35:47.241926 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.242119 kubelet[2794]: E0303 13:35:47.242096 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.242141 kubelet[2794]: W0303 13:35:47.242123 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.242141 kubelet[2794]: E0303 13:35:47.242132 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.242357 kubelet[2794]: E0303 13:35:47.242343 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.242357 kubelet[2794]: W0303 13:35:47.242352 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.242401 kubelet[2794]: E0303 13:35:47.242359 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:47.242560 kubelet[2794]: E0303 13:35:47.242544 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:47.242560 kubelet[2794]: W0303 13:35:47.242554 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:47.242725 kubelet[2794]: E0303 13:35:47.242562 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.242746 kubelet[2794]: E0303 13:35:47.242726 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.242746 kubelet[2794]: W0303 13:35:47.242733 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.242746 kubelet[2794]: E0303 13:35:47.242740 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.242962 kubelet[2794]: E0303 13:35:47.242944 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.242962 kubelet[2794]: W0303 13:35:47.242954 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.242962 kubelet[2794]: E0303 13:35:47.242961 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.243181 kubelet[2794]: E0303 13:35:47.243165 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.243216 kubelet[2794]: W0303 13:35:47.243195 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.243216 kubelet[2794]: E0303 13:35:47.243203 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.243665 kubelet[2794]: E0303 13:35:47.243652 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.243698 kubelet[2794]: W0303 13:35:47.243663 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.243698 kubelet[2794]: E0303 13:35:47.243688 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.250052 containerd[1629]: time="2026-03-03T13:35:47.250012042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-569f9cb8c7-4sbwp,Uid:e9b5a8fa-ea6e-499d-a811-5f78d88e426b,Namespace:calico-system,Attempt:0,}" Mar 3 13:35:47.258571 update_engine[1605]: I20260303 13:35:47.258526 1605 update_attempter.cc:509] Updating boot flags... 
Mar 3 13:35:47.267922 containerd[1629]: time="2026-03-03T13:35:47.267770254Z" level=info msg="connecting to shim f7158f7bc3f7969a0fac1cd3433b081a4e4de2cd5c20afdea8e0e5150c652cf3" address="unix:///run/containerd/s/c559cf78244981f9ee29e377d5bc4045dc8c6a68586b8649943ff67befe5c2f5" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:47.298840 systemd[1]: Started cri-containerd-f7158f7bc3f7969a0fac1cd3433b081a4e4de2cd5c20afdea8e0e5150c652cf3.scope - libcontainer container f7158f7bc3f7969a0fac1cd3433b081a4e4de2cd5c20afdea8e0e5150c652cf3. Mar 3 13:35:47.317565 containerd[1629]: time="2026-03-03T13:35:47.317536587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r47qv,Uid:58cf811b-3c9a-4e4f-a809-bfef7ba669a9,Namespace:calico-system,Attempt:0,}" Mar 3 13:35:47.344420 kubelet[2794]: E0303 13:35:47.343373 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.344420 kubelet[2794]: W0303 13:35:47.343510 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.344420 kubelet[2794]: E0303 13:35:47.343535 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.346406 kubelet[2794]: E0303 13:35:47.346344 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.346406 kubelet[2794]: W0303 13:35:47.346361 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.346406 kubelet[2794]: E0303 13:35:47.346393 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.347496 kubelet[2794]: E0303 13:35:47.347199 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.347496 kubelet[2794]: W0303 13:35:47.347213 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.347496 kubelet[2794]: E0303 13:35:47.347223 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.348935 kubelet[2794]: E0303 13:35:47.348372 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.348935 kubelet[2794]: W0303 13:35:47.348387 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.348935 kubelet[2794]: E0303 13:35:47.348398 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.349816 kubelet[2794]: E0303 13:35:47.349595 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.349816 kubelet[2794]: W0303 13:35:47.349609 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.349816 kubelet[2794]: E0303 13:35:47.349624 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.350081 kubelet[2794]: E0303 13:35:47.349980 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.350081 kubelet[2794]: W0303 13:35:47.349991 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.350081 kubelet[2794]: E0303 13:35:47.350025 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.350260 kubelet[2794]: E0303 13:35:47.350241 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.350282 kubelet[2794]: W0303 13:35:47.350275 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.350308 kubelet[2794]: E0303 13:35:47.350282 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.350631 kubelet[2794]: E0303 13:35:47.350610 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.350631 kubelet[2794]: W0303 13:35:47.350623 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.350631 kubelet[2794]: E0303 13:35:47.350629 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.351061 kubelet[2794]: E0303 13:35:47.351031 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.351061 kubelet[2794]: W0303 13:35:47.351058 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.351121 kubelet[2794]: E0303 13:35:47.351066 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.352941 kubelet[2794]: E0303 13:35:47.351374 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.352941 kubelet[2794]: W0303 13:35:47.351384 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.352941 kubelet[2794]: E0303 13:35:47.351391 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.353013 kubelet[2794]: E0303 13:35:47.352998 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.353013 kubelet[2794]: W0303 13:35:47.353005 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.353013 kubelet[2794]: E0303 13:35:47.353012 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.353228 kubelet[2794]: E0303 13:35:47.353211 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.353228 kubelet[2794]: W0303 13:35:47.353221 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.353228 kubelet[2794]: E0303 13:35:47.353228 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.353423 kubelet[2794]: E0303 13:35:47.353406 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.353423 kubelet[2794]: W0303 13:35:47.353419 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.353460 kubelet[2794]: E0303 13:35:47.353425 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.353612 kubelet[2794]: E0303 13:35:47.353594 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.353612 kubelet[2794]: W0303 13:35:47.353606 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.353612 kubelet[2794]: E0303 13:35:47.353611 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.353798 kubelet[2794]: E0303 13:35:47.353780 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.353798 kubelet[2794]: W0303 13:35:47.353791 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.353798 kubelet[2794]: E0303 13:35:47.353797 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.354007 kubelet[2794]: E0303 13:35:47.353983 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.354007 kubelet[2794]: W0303 13:35:47.353995 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.354007 kubelet[2794]: E0303 13:35:47.354003 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.354233 kubelet[2794]: E0303 13:35:47.354214 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.354233 kubelet[2794]: W0303 13:35:47.354229 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.354272 kubelet[2794]: E0303 13:35:47.354236 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.354409 kubelet[2794]: E0303 13:35:47.354392 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.354409 kubelet[2794]: W0303 13:35:47.354403 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.354409 kubelet[2794]: E0303 13:35:47.354409 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.354687 kubelet[2794]: E0303 13:35:47.354672 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.354687 kubelet[2794]: W0303 13:35:47.354681 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.354687 kubelet[2794]: E0303 13:35:47.354688 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.354880 kubelet[2794]: E0303 13:35:47.354863 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.354880 kubelet[2794]: W0303 13:35:47.354874 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.354933 kubelet[2794]: E0303 13:35:47.354900 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.355130 kubelet[2794]: E0303 13:35:47.355093 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.355130 kubelet[2794]: W0303 13:35:47.355116 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.355130 kubelet[2794]: E0303 13:35:47.355122 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.355314 kubelet[2794]: E0303 13:35:47.355297 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.355314 kubelet[2794]: W0303 13:35:47.355310 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.355314 kubelet[2794]: E0303 13:35:47.355315 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.355515 kubelet[2794]: E0303 13:35:47.355489 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.355515 kubelet[2794]: W0303 13:35:47.355498 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.355515 kubelet[2794]: E0303 13:35:47.355504 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.355683 kubelet[2794]: E0303 13:35:47.355667 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.355683 kubelet[2794]: W0303 13:35:47.355680 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.355725 kubelet[2794]: E0303 13:35:47.355686 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.355854 kubelet[2794]: E0303 13:35:47.355837 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.355854 kubelet[2794]: W0303 13:35:47.355848 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.355854 kubelet[2794]: E0303 13:35:47.355853 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:47.374719 kubelet[2794]: E0303 13:35:47.373854 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:47.374719 kubelet[2794]: W0303 13:35:47.373873 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:47.375616 kubelet[2794]: E0303 13:35:47.375588 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:47.379116 containerd[1629]: time="2026-03-03T13:35:47.378156725Z" level=info msg="connecting to shim 418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1" address="unix:///run/containerd/s/70bc889b601e6dbea33f7f649c3f4aa9dfd95e4be7f7e4fb323eee11aa334d9b" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:47.438442 systemd[1]: Started cri-containerd-418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1.scope - libcontainer container 418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1. 
Mar 3 13:35:47.476194 containerd[1629]: time="2026-03-03T13:35:47.475081911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-569f9cb8c7-4sbwp,Uid:e9b5a8fa-ea6e-499d-a811-5f78d88e426b,Namespace:calico-system,Attempt:0,} returns sandbox id \"f7158f7bc3f7969a0fac1cd3433b081a4e4de2cd5c20afdea8e0e5150c652cf3\"" Mar 3 13:35:47.483215 containerd[1629]: time="2026-03-03T13:35:47.482474361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 3 13:35:47.524014 containerd[1629]: time="2026-03-03T13:35:47.523958211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r47qv,Uid:58cf811b-3c9a-4e4f-a809-bfef7ba669a9,Namespace:calico-system,Attempt:0,} returns sandbox id \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\"" Mar 3 13:35:48.334845 kubelet[2794]: E0303 13:35:48.334025 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:35:49.576370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount960406564.mount: Deactivated successfully. 
Mar 3 13:35:50.334271 kubelet[2794]: E0303 13:35:50.334226 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:35:50.353717 containerd[1629]: time="2026-03-03T13:35:50.353670385Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:50.354645 containerd[1629]: time="2026-03-03T13:35:50.354617269Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 3 13:35:50.355935 containerd[1629]: time="2026-03-03T13:35:50.355740753Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:50.357690 containerd[1629]: time="2026-03-03T13:35:50.357660930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:50.358169 containerd[1629]: time="2026-03-03T13:35:50.358100332Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 2.875603961s" Mar 3 13:35:50.358169 containerd[1629]: time="2026-03-03T13:35:50.358136852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference 
\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 3 13:35:50.360199 containerd[1629]: time="2026-03-03T13:35:50.360184809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 3 13:35:50.370989 containerd[1629]: time="2026-03-03T13:35:50.370945438Z" level=info msg="CreateContainer within sandbox \"f7158f7bc3f7969a0fac1cd3433b081a4e4de2cd5c20afdea8e0e5150c652cf3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 3 13:35:50.377593 containerd[1629]: time="2026-03-03T13:35:50.377022459Z" level=info msg="Container ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:35:50.385252 containerd[1629]: time="2026-03-03T13:35:50.385212778Z" level=info msg="CreateContainer within sandbox \"f7158f7bc3f7969a0fac1cd3433b081a4e4de2cd5c20afdea8e0e5150c652cf3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038\"" Mar 3 13:35:50.388047 containerd[1629]: time="2026-03-03T13:35:50.388023829Z" level=info msg="StartContainer for \"ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038\"" Mar 3 13:35:50.390030 containerd[1629]: time="2026-03-03T13:35:50.389661605Z" level=info msg="connecting to shim ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038" address="unix:///run/containerd/s/c559cf78244981f9ee29e377d5bc4045dc8c6a68586b8649943ff67befe5c2f5" protocol=ttrpc version=3 Mar 3 13:35:50.408031 systemd[1]: Started cri-containerd-ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038.scope - libcontainer container ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038. 
Mar 3 13:35:50.458861 containerd[1629]: time="2026-03-03T13:35:50.458826572Z" level=info msg="StartContainer for \"ee85356d6c181275d73e24fb1306723a923d8edb652006da5daac477b08ea038\" returns successfully" Mar 3 13:35:51.445947 kubelet[2794]: E0303 13:35:51.445666 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.445947 kubelet[2794]: W0303 13:35:51.445698 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.445947 kubelet[2794]: E0303 13:35:51.445767 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.446703 kubelet[2794]: E0303 13:35:51.446556 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.446703 kubelet[2794]: W0303 13:35:51.446632 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.446703 kubelet[2794]: E0303 13:35:51.446647 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.447979 kubelet[2794]: E0303 13:35:51.447478 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.447979 kubelet[2794]: W0303 13:35:51.447550 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.447979 kubelet[2794]: E0303 13:35:51.447565 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.448220 kubelet[2794]: E0303 13:35:51.448203 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.448269 kubelet[2794]: W0303 13:35:51.448251 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.448326 kubelet[2794]: E0303 13:35:51.448266 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.449661 kubelet[2794]: E0303 13:35:51.449073 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.449661 kubelet[2794]: W0303 13:35:51.449178 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.449661 kubelet[2794]: E0303 13:35:51.449243 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.449888 kubelet[2794]: E0303 13:35:51.449850 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.449888 kubelet[2794]: W0303 13:35:51.449868 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.450037 kubelet[2794]: E0303 13:35:51.449890 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.450547 kubelet[2794]: E0303 13:35:51.450484 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.450547 kubelet[2794]: W0303 13:35:51.450515 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.450547 kubelet[2794]: E0303 13:35:51.450535 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.451612 kubelet[2794]: E0303 13:35:51.451346 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.451612 kubelet[2794]: W0303 13:35:51.451572 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.451612 kubelet[2794]: E0303 13:35:51.451595 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.452519 kubelet[2794]: E0303 13:35:51.452463 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.452519 kubelet[2794]: W0303 13:35:51.452483 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.452519 kubelet[2794]: E0303 13:35:51.452504 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.453065 kubelet[2794]: E0303 13:35:51.453033 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.453065 kubelet[2794]: W0303 13:35:51.453052 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.453224 kubelet[2794]: E0303 13:35:51.453071 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.453795 kubelet[2794]: E0303 13:35:51.453562 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.453795 kubelet[2794]: W0303 13:35:51.453586 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.453795 kubelet[2794]: E0303 13:35:51.453607 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.454152 kubelet[2794]: E0303 13:35:51.454111 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.454285 kubelet[2794]: W0303 13:35:51.454236 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.454285 kubelet[2794]: E0303 13:35:51.454264 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.455822 kubelet[2794]: E0303 13:35:51.455457 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.455822 kubelet[2794]: W0303 13:35:51.455485 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.455822 kubelet[2794]: E0303 13:35:51.455505 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.457149 kubelet[2794]: E0303 13:35:51.455868 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.457149 kubelet[2794]: W0303 13:35:51.455879 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.457149 kubelet[2794]: E0303 13:35:51.455893 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.457149 kubelet[2794]: E0303 13:35:51.456500 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.457149 kubelet[2794]: W0303 13:35:51.456634 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.457149 kubelet[2794]: E0303 13:35:51.456651 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.460239 kubelet[2794]: I0303 13:35:51.459571 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-569f9cb8c7-4sbwp" podStartSLOduration=2.578676304 podStartE2EDuration="5.459547995s" podCreationTimestamp="2026-03-03 13:35:46 +0000 UTC" firstStartedPulling="2026-03-03 13:35:47.477863512 +0000 UTC m=+17.240981777" lastFinishedPulling="2026-03-03 13:35:50.358735213 +0000 UTC m=+20.121853468" observedRunningTime="2026-03-03 13:35:51.458271111 +0000 UTC m=+21.221389406" watchObservedRunningTime="2026-03-03 13:35:51.459547995 +0000 UTC m=+21.222666300" Mar 3 13:35:51.481091 kubelet[2794]: E0303 13:35:51.480671 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.481091 kubelet[2794]: W0303 13:35:51.480702 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.481091 kubelet[2794]: E0303 13:35:51.480727 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.481429 kubelet[2794]: E0303 13:35:51.481204 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.481429 kubelet[2794]: W0303 13:35:51.481219 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.481429 kubelet[2794]: E0303 13:35:51.481235 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.481622 kubelet[2794]: E0303 13:35:51.481558 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.481622 kubelet[2794]: W0303 13:35:51.481571 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.481622 kubelet[2794]: E0303 13:35:51.481584 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.482578 kubelet[2794]: E0303 13:35:51.481984 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.482578 kubelet[2794]: W0303 13:35:51.482000 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.482578 kubelet[2794]: E0303 13:35:51.482015 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.482578 kubelet[2794]: E0303 13:35:51.482337 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.482578 kubelet[2794]: W0303 13:35:51.482347 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.482578 kubelet[2794]: E0303 13:35:51.482360 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.484146 kubelet[2794]: E0303 13:35:51.482695 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.484146 kubelet[2794]: W0303 13:35:51.482706 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.484146 kubelet[2794]: E0303 13:35:51.482719 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.484146 kubelet[2794]: E0303 13:35:51.483101 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.484146 kubelet[2794]: W0303 13:35:51.483114 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.484146 kubelet[2794]: E0303 13:35:51.483126 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.484146 kubelet[2794]: E0303 13:35:51.484024 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.484146 kubelet[2794]: W0303 13:35:51.484040 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.484146 kubelet[2794]: E0303 13:35:51.484055 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.484749 kubelet[2794]: E0303 13:35:51.484407 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.484749 kubelet[2794]: W0303 13:35:51.484419 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.484749 kubelet[2794]: E0303 13:35:51.484433 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.484749 kubelet[2794]: E0303 13:35:51.484713 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.484749 kubelet[2794]: W0303 13:35:51.484724 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.484749 kubelet[2794]: E0303 13:35:51.484734 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.485737 kubelet[2794]: E0303 13:35:51.485077 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.485737 kubelet[2794]: W0303 13:35:51.485088 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.485737 kubelet[2794]: E0303 13:35:51.485324 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.485737 kubelet[2794]: E0303 13:35:51.485736 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.486318 kubelet[2794]: W0303 13:35:51.485751 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.486318 kubelet[2794]: E0303 13:35:51.485767 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.486732 kubelet[2794]: E0303 13:35:51.486688 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.486732 kubelet[2794]: W0303 13:35:51.486716 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.486868 kubelet[2794]: E0303 13:35:51.486739 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.487322 kubelet[2794]: E0303 13:35:51.487290 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.487322 kubelet[2794]: W0303 13:35:51.487318 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.487418 kubelet[2794]: E0303 13:35:51.487336 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.487799 kubelet[2794]: E0303 13:35:51.487768 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.487799 kubelet[2794]: W0303 13:35:51.487790 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.487996 kubelet[2794]: E0303 13:35:51.487807 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.488356 kubelet[2794]: E0303 13:35:51.488306 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.488356 kubelet[2794]: W0303 13:35:51.488333 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.488356 kubelet[2794]: E0303 13:35:51.488350 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:51.488777 kubelet[2794]: E0303 13:35:51.488716 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.488777 kubelet[2794]: W0303 13:35:51.488730 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.488777 kubelet[2794]: E0303 13:35:51.488743 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:51.489791 kubelet[2794]: E0303 13:35:51.489754 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:51.489791 kubelet[2794]: W0303 13:35:51.489777 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:51.489976 kubelet[2794]: E0303 13:35:51.489794 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.334634 kubelet[2794]: E0303 13:35:52.333801 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:35:52.418867 containerd[1629]: time="2026-03-03T13:35:52.418812678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:52.419975 containerd[1629]: time="2026-03-03T13:35:52.419942232Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 3 13:35:52.420744 containerd[1629]: time="2026-03-03T13:35:52.420625894Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:52.422347 containerd[1629]: time="2026-03-03T13:35:52.422300070Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:52.422806 containerd[1629]: time="2026-03-03T13:35:52.422662261Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 2.062425022s" Mar 3 13:35:52.422806 containerd[1629]: time="2026-03-03T13:35:52.422694271Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 3 13:35:52.426743 containerd[1629]: time="2026-03-03T13:35:52.426687254Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 3 13:35:52.437480 containerd[1629]: time="2026-03-03T13:35:52.437250969Z" level=info msg="Container 4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:35:52.449712 kubelet[2794]: I0303 13:35:52.449672 2794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:35:52.453489 containerd[1629]: time="2026-03-03T13:35:52.453430023Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217\"" Mar 3 13:35:52.454208 containerd[1629]: time="2026-03-03T13:35:52.454180255Z" level=info msg="StartContainer for 
\"4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217\"" Mar 3 13:35:52.455656 containerd[1629]: time="2026-03-03T13:35:52.455626519Z" level=info msg="connecting to shim 4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217" address="unix:///run/containerd/s/70bc889b601e6dbea33f7f649c3f4aa9dfd95e4be7f7e4fb323eee11aa334d9b" protocol=ttrpc version=3 Mar 3 13:35:52.463425 kubelet[2794]: E0303 13:35:52.463390 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.463425 kubelet[2794]: W0303 13:35:52.463410 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.463425 kubelet[2794]: E0303 13:35:52.463426 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.463596 kubelet[2794]: E0303 13:35:52.463578 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.463596 kubelet[2794]: W0303 13:35:52.463592 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.463640 kubelet[2794]: E0303 13:35:52.463598 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.463790 kubelet[2794]: E0303 13:35:52.463775 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.463790 kubelet[2794]: W0303 13:35:52.463786 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.463851 kubelet[2794]: E0303 13:35:52.463809 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.464018 kubelet[2794]: E0303 13:35:52.464003 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.464018 kubelet[2794]: W0303 13:35:52.464014 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.464064 kubelet[2794]: E0303 13:35:52.464020 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.464270 kubelet[2794]: E0303 13:35:52.464215 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.464270 kubelet[2794]: W0303 13:35:52.464251 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.464270 kubelet[2794]: E0303 13:35:52.464257 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.464451 kubelet[2794]: E0303 13:35:52.464437 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.464451 kubelet[2794]: W0303 13:35:52.464443 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.464451 kubelet[2794]: E0303 13:35:52.464449 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.464688 kubelet[2794]: E0303 13:35:52.464660 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.464688 kubelet[2794]: W0303 13:35:52.464673 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.464688 kubelet[2794]: E0303 13:35:52.464680 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.464973 kubelet[2794]: E0303 13:35:52.464937 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.464973 kubelet[2794]: W0303 13:35:52.464949 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.464973 kubelet[2794]: E0303 13:35:52.464958 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.465226 kubelet[2794]: E0303 13:35:52.465211 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.465226 kubelet[2794]: W0303 13:35:52.465224 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.465270 kubelet[2794]: E0303 13:35:52.465234 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.465490 kubelet[2794]: E0303 13:35:52.465471 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.465490 kubelet[2794]: W0303 13:35:52.465483 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.465490 kubelet[2794]: E0303 13:35:52.465489 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.465761 kubelet[2794]: E0303 13:35:52.465741 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.465761 kubelet[2794]: W0303 13:35:52.465754 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.465761 kubelet[2794]: E0303 13:35:52.465761 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.466005 kubelet[2794]: E0303 13:35:52.465976 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.466005 kubelet[2794]: W0303 13:35:52.465988 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.466056 kubelet[2794]: E0303 13:35:52.466010 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.466260 kubelet[2794]: E0303 13:35:52.466223 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.466298 kubelet[2794]: W0303 13:35:52.466263 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.466298 kubelet[2794]: E0303 13:35:52.466271 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.466478 kubelet[2794]: E0303 13:35:52.466462 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.466478 kubelet[2794]: W0303 13:35:52.466474 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.466522 kubelet[2794]: E0303 13:35:52.466482 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.467883 kubelet[2794]: E0303 13:35:52.467520 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.467883 kubelet[2794]: W0303 13:35:52.467532 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.467883 kubelet[2794]: E0303 13:35:52.467539 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.479133 systemd[1]: Started cri-containerd-4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217.scope - libcontainer container 4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217. Mar 3 13:35:52.488019 kubelet[2794]: E0303 13:35:52.487969 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.488019 kubelet[2794]: W0303 13:35:52.487993 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.488237 kubelet[2794]: E0303 13:35:52.488153 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.488508 kubelet[2794]: E0303 13:35:52.488499 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.488634 kubelet[2794]: W0303 13:35:52.488542 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.488634 kubelet[2794]: E0303 13:35:52.488550 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.488868 kubelet[2794]: E0303 13:35:52.488860 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.488928 kubelet[2794]: W0303 13:35:52.488920 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.488999 kubelet[2794]: E0303 13:35:52.488961 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.490032 kubelet[2794]: E0303 13:35:52.490006 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.490032 kubelet[2794]: W0303 13:35:52.490027 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.490118 kubelet[2794]: E0303 13:35:52.490046 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.490351 kubelet[2794]: E0303 13:35:52.490331 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.490351 kubelet[2794]: W0303 13:35:52.490345 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.490351 kubelet[2794]: E0303 13:35:52.490355 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.491162 kubelet[2794]: E0303 13:35:52.491127 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.491162 kubelet[2794]: W0303 13:35:52.491154 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.491212 kubelet[2794]: E0303 13:35:52.491165 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.491441 kubelet[2794]: E0303 13:35:52.491414 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.491441 kubelet[2794]: W0303 13:35:52.491427 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.491441 kubelet[2794]: E0303 13:35:52.491437 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.492161 kubelet[2794]: E0303 13:35:52.492131 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.492161 kubelet[2794]: W0303 13:35:52.492156 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.492211 kubelet[2794]: E0303 13:35:52.492168 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.494199 kubelet[2794]: E0303 13:35:52.494167 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.494199 kubelet[2794]: W0303 13:35:52.494178 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.494199 kubelet[2794]: E0303 13:35:52.494187 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.494550 kubelet[2794]: E0303 13:35:52.494494 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.494550 kubelet[2794]: W0303 13:35:52.494502 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.494550 kubelet[2794]: E0303 13:35:52.494509 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.494874 kubelet[2794]: E0303 13:35:52.494798 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.494874 kubelet[2794]: W0303 13:35:52.494806 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.494874 kubelet[2794]: E0303 13:35:52.494812 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.495153 kubelet[2794]: E0303 13:35:52.495133 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.495244 kubelet[2794]: W0303 13:35:52.495190 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.495244 kubelet[2794]: E0303 13:35:52.495199 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.495822 kubelet[2794]: E0303 13:35:52.495793 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.495822 kubelet[2794]: W0303 13:35:52.495818 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.495874 kubelet[2794]: E0303 13:35:52.495831 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.496771 kubelet[2794]: E0303 13:35:52.496421 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.496771 kubelet[2794]: W0303 13:35:52.496435 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.496771 kubelet[2794]: E0303 13:35:52.496451 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.498633 kubelet[2794]: E0303 13:35:52.498611 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.498667 kubelet[2794]: W0303 13:35:52.498632 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.498667 kubelet[2794]: E0303 13:35:52.498651 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.498861 kubelet[2794]: E0303 13:35:52.498841 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.498879 kubelet[2794]: W0303 13:35:52.498858 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.498879 kubelet[2794]: E0303 13:35:52.498873 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.500243 kubelet[2794]: E0303 13:35:52.500214 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.500243 kubelet[2794]: W0303 13:35:52.500239 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.500298 kubelet[2794]: E0303 13:35:52.500251 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:52.502191 kubelet[2794]: E0303 13:35:52.502169 2794 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:52.502225 kubelet[2794]: W0303 13:35:52.502188 2794 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:52.502225 kubelet[2794]: E0303 13:35:52.502209 2794 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:52.545643 containerd[1629]: time="2026-03-03T13:35:52.545615257Z" level=info msg="StartContainer for \"4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217\" returns successfully" Mar 3 13:35:52.559683 systemd[1]: cri-containerd-4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217.scope: Deactivated successfully. Mar 3 13:35:52.566172 containerd[1629]: time="2026-03-03T13:35:52.566133354Z" level=info msg="received container exit event container_id:\"4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217\" id:\"4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217\" pid:3503 exited_at:{seconds:1772544952 nanos:565695503}" Mar 3 13:35:52.589677 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4e2291b3c8a5803582aa9f45bd00691f444b89cb2491e9aafbe3b188e64d3217-rootfs.mount: Deactivated successfully. 
Mar 3 13:35:53.455557 containerd[1629]: time="2026-03-03T13:35:53.455490874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 3 13:35:54.334440 kubelet[2794]: E0303 13:35:54.334335 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:35:56.335950 kubelet[2794]: E0303 13:35:56.334273 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:35:58.335399 kubelet[2794]: E0303 13:35:58.335340 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:35:59.855992 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2018422710.mount: Deactivated successfully. 
Mar 3 13:35:59.880382 containerd[1629]: time="2026-03-03T13:35:59.880341399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:59.882110 containerd[1629]: time="2026-03-03T13:35:59.882030213Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:59.882110 containerd[1629]: time="2026-03-03T13:35:59.882066953Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 3 13:35:59.886773 containerd[1629]: time="2026-03-03T13:35:59.886589854Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:59.886951 containerd[1629]: time="2026-03-03T13:35:59.886933465Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 6.431405181s" Mar 3 13:35:59.886983 containerd[1629]: time="2026-03-03T13:35:59.886954285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 3 13:35:59.890578 containerd[1629]: time="2026-03-03T13:35:59.890552714Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 3 13:35:59.898925 containerd[1629]: time="2026-03-03T13:35:59.898507504Z" level=info msg="Container 
226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:35:59.906314 containerd[1629]: time="2026-03-03T13:35:59.906253885Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461\"" Mar 3 13:35:59.906745 containerd[1629]: time="2026-03-03T13:35:59.906721816Z" level=info msg="StartContainer for \"226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461\"" Mar 3 13:35:59.907745 containerd[1629]: time="2026-03-03T13:35:59.907721678Z" level=info msg="connecting to shim 226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461" address="unix:///run/containerd/s/70bc889b601e6dbea33f7f649c3f4aa9dfd95e4be7f7e4fb323eee11aa334d9b" protocol=ttrpc version=3 Mar 3 13:35:59.930006 systemd[1]: Started cri-containerd-226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461.scope - libcontainer container 226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461. Mar 3 13:36:00.009938 containerd[1629]: time="2026-03-03T13:36:00.009818268Z" level=info msg="StartContainer for \"226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461\" returns successfully" Mar 3 13:36:00.047452 systemd[1]: cri-containerd-226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461.scope: Deactivated successfully. 
Mar 3 13:36:00.050693 containerd[1629]: time="2026-03-03T13:36:00.050671100Z" level=info msg="received container exit event container_id:\"226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461\" id:\"226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461\" pid:3574 exited_at:{seconds:1772544960 nanos:50478069}" Mar 3 13:36:00.335030 kubelet[2794]: E0303 13:36:00.334687 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:36:00.474073 containerd[1629]: time="2026-03-03T13:36:00.474016628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 3 13:36:00.856425 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-226f0ee314a40fe9e3415907b480ffb5a11f80bc924efbf4003faaec4f54e461-rootfs.mount: Deactivated successfully. 
Mar 3 13:36:02.335485 kubelet[2794]: E0303 13:36:02.334537 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:36:04.335328 kubelet[2794]: E0303 13:36:04.335173 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jzlgk" podUID="7910b7ac-0de5-4bb6-95b6-143b053aaa8f" Mar 3 13:36:04.503782 containerd[1629]: time="2026-03-03T13:36:04.503731112Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:04.504931 containerd[1629]: time="2026-03-03T13:36:04.504831594Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 3 13:36:04.505660 containerd[1629]: time="2026-03-03T13:36:04.505630616Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:04.507050 containerd[1629]: time="2026-03-03T13:36:04.507035950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:04.507690 containerd[1629]: time="2026-03-03T13:36:04.507412290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.03244278s" Mar 3 13:36:04.507690 containerd[1629]: time="2026-03-03T13:36:04.507435520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 3 13:36:04.510686 containerd[1629]: time="2026-03-03T13:36:04.510662137Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 3 13:36:04.518803 containerd[1629]: time="2026-03-03T13:36:04.518100123Z" level=info msg="Container a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:04.539411 containerd[1629]: time="2026-03-03T13:36:04.539358760Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81\"" Mar 3 13:36:04.540169 containerd[1629]: time="2026-03-03T13:36:04.539991552Z" level=info msg="StartContainer for \"a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81\"" Mar 3 13:36:04.541434 containerd[1629]: time="2026-03-03T13:36:04.541405955Z" level=info msg="connecting to shim a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81" address="unix:///run/containerd/s/70bc889b601e6dbea33f7f649c3f4aa9dfd95e4be7f7e4fb323eee11aa334d9b" protocol=ttrpc version=3 Mar 3 13:36:04.586021 systemd[1]: Started cri-containerd-a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81.scope - libcontainer container a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81. 
Mar 3 13:36:04.677138 containerd[1629]: time="2026-03-03T13:36:04.677104813Z" level=info msg="StartContainer for \"a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81\" returns successfully" Mar 3 13:36:05.105494 containerd[1629]: time="2026-03-03T13:36:05.105367838Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 3 13:36:05.107529 systemd[1]: cri-containerd-a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81.scope: Deactivated successfully. Mar 3 13:36:05.108011 systemd[1]: cri-containerd-a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81.scope: Consumed 393ms CPU time, 188M memory peak, 2.4M read from disk, 177M written to disk. Mar 3 13:36:05.109071 containerd[1629]: time="2026-03-03T13:36:05.109047035Z" level=info msg="received container exit event container_id:\"a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81\" id:\"a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81\" pid:3629 exited_at:{seconds:1772544965 nanos:108869785}" Mar 3 13:36:05.131405 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a47a5890be1f83e1ef406599ba59b3dbf1b7f77178110095e5cea6b36234da81-rootfs.mount: Deactivated successfully. Mar 3 13:36:05.199923 kubelet[2794]: I0303 13:36:05.199875 2794 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 3 13:36:05.236420 systemd[1]: Created slice kubepods-besteffort-pod14132d38_7264_4753_918c_53ff70118c6f.slice - libcontainer container kubepods-besteffort-pod14132d38_7264_4753_918c_53ff70118c6f.slice. Mar 3 13:36:05.249823 systemd[1]: Created slice kubepods-burstable-pod547c2410_b867_4848_8712_44c1c77daad1.slice - libcontainer container kubepods-burstable-pod547c2410_b867_4848_8712_44c1c77daad1.slice. 
Mar 3 13:36:05.261174 systemd[1]: Created slice kubepods-besteffort-pod2e0d1620_86b4_4e75_81d4_5493cf82e93e.slice - libcontainer container kubepods-besteffort-pod2e0d1620_86b4_4e75_81d4_5493cf82e93e.slice. Mar 3 13:36:05.266807 systemd[1]: Created slice kubepods-besteffort-pod15882429_b1cf_4254_b3ef_f6e453c09e66.slice - libcontainer container kubepods-besteffort-pod15882429_b1cf_4254_b3ef_f6e453c09e66.slice. Mar 3 13:36:05.273700 systemd[1]: Created slice kubepods-besteffort-pod15401374_cd08_4d87_aaa5_00861a15882b.slice - libcontainer container kubepods-besteffort-pod15401374_cd08_4d87_aaa5_00861a15882b.slice. Mar 3 13:36:05.278764 systemd[1]: Created slice kubepods-burstable-pod44c56449_ed93_445e_90b3_c1b25198a9bc.slice - libcontainer container kubepods-burstable-pod44c56449_ed93_445e_90b3_c1b25198a9bc.slice. Mar 3 13:36:05.280069 kubelet[2794]: I0303 13:36:05.280046 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/14132d38-7264-4753-918c-53ff70118c6f-calico-apiserver-certs\") pod \"calico-apiserver-c5859fd4b-4t5vt\" (UID: \"14132d38-7264-4753-918c-53ff70118c6f\") " pod="calico-system/calico-apiserver-c5859fd4b-4t5vt" Mar 3 13:36:05.280069 kubelet[2794]: I0303 13:36:05.280068 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb625\" (UniqueName: \"kubernetes.io/projected/14132d38-7264-4753-918c-53ff70118c6f-kube-api-access-bb625\") pod \"calico-apiserver-c5859fd4b-4t5vt\" (UID: \"14132d38-7264-4753-918c-53ff70118c6f\") " pod="calico-system/calico-apiserver-c5859fd4b-4t5vt" Mar 3 13:36:05.280182 kubelet[2794]: I0303 13:36:05.280083 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qjrr\" (UniqueName: \"kubernetes.io/projected/15401374-cd08-4d87-aaa5-00861a15882b-kube-api-access-7qjrr\") pod 
\"calico-apiserver-c5859fd4b-l4vrj\" (UID: \"15401374-cd08-4d87-aaa5-00861a15882b\") " pod="calico-system/calico-apiserver-c5859fd4b-l4vrj" Mar 3 13:36:05.280182 kubelet[2794]: I0303 13:36:05.280094 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-nginx-config\") pod \"whisker-6b9ddbc87d-dn6cr\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " pod="calico-system/whisker-6b9ddbc87d-dn6cr" Mar 3 13:36:05.280182 kubelet[2794]: I0303 13:36:05.280106 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/547c2410-b867-4848-8712-44c1c77daad1-config-volume\") pod \"coredns-674b8bbfcf-hjkbg\" (UID: \"547c2410-b867-4848-8712-44c1c77daad1\") " pod="kube-system/coredns-674b8bbfcf-hjkbg" Mar 3 13:36:05.280182 kubelet[2794]: I0303 13:36:05.280116 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-ca-bundle\") pod \"whisker-6b9ddbc87d-dn6cr\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " pod="calico-system/whisker-6b9ddbc87d-dn6cr" Mar 3 13:36:05.280182 kubelet[2794]: I0303 13:36:05.280126 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8j4\" (UniqueName: \"kubernetes.io/projected/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-kube-api-access-zv8j4\") pod \"whisker-6b9ddbc87d-dn6cr\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " pod="calico-system/whisker-6b9ddbc87d-dn6cr" Mar 3 13:36:05.280279 kubelet[2794]: I0303 13:36:05.280139 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbtb\" (UniqueName: 
\"kubernetes.io/projected/547c2410-b867-4848-8712-44c1c77daad1-kube-api-access-wsbtb\") pod \"coredns-674b8bbfcf-hjkbg\" (UID: \"547c2410-b867-4848-8712-44c1c77daad1\") " pod="kube-system/coredns-674b8bbfcf-hjkbg" Mar 3 13:36:05.280279 kubelet[2794]: I0303 13:36:05.280151 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/15401374-cd08-4d87-aaa5-00861a15882b-calico-apiserver-certs\") pod \"calico-apiserver-c5859fd4b-l4vrj\" (UID: \"15401374-cd08-4d87-aaa5-00861a15882b\") " pod="calico-system/calico-apiserver-c5859fd4b-l4vrj" Mar 3 13:36:05.280465 kubelet[2794]: I0303 13:36:05.280162 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44c56449-ed93-445e-90b3-c1b25198a9bc-config-volume\") pod \"coredns-674b8bbfcf-hk74p\" (UID: \"44c56449-ed93-445e-90b3-c1b25198a9bc\") " pod="kube-system/coredns-674b8bbfcf-hk74p" Mar 3 13:36:05.280487 kubelet[2794]: I0303 13:36:05.280471 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e0d1620-86b4-4e75-81d4-5493cf82e93e-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-mpt25\" (UID: \"2e0d1620-86b4-4e75-81d4-5493cf82e93e\") " pod="calico-system/goldmane-5b85766d88-mpt25" Mar 3 13:36:05.280487 kubelet[2794]: I0303 13:36:05.280483 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2e0d1620-86b4-4e75-81d4-5493cf82e93e-goldmane-key-pair\") pod \"goldmane-5b85766d88-mpt25\" (UID: \"2e0d1620-86b4-4e75-81d4-5493cf82e93e\") " pod="calico-system/goldmane-5b85766d88-mpt25" Mar 3 13:36:05.280522 kubelet[2794]: I0303 13:36:05.280494 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-backend-key-pair\") pod \"whisker-6b9ddbc87d-dn6cr\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " pod="calico-system/whisker-6b9ddbc87d-dn6cr" Mar 3 13:36:05.280522 kubelet[2794]: I0303 13:36:05.280505 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15882429-b1cf-4254-b3ef-f6e453c09e66-tigera-ca-bundle\") pod \"calico-kube-controllers-8ccd5dff9-mk9lp\" (UID: \"15882429-b1cf-4254-b3ef-f6e453c09e66\") " pod="calico-system/calico-kube-controllers-8ccd5dff9-mk9lp" Mar 3 13:36:05.280560 kubelet[2794]: I0303 13:36:05.280531 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6695\" (UniqueName: \"kubernetes.io/projected/15882429-b1cf-4254-b3ef-f6e453c09e66-kube-api-access-l6695\") pod \"calico-kube-controllers-8ccd5dff9-mk9lp\" (UID: \"15882429-b1cf-4254-b3ef-f6e453c09e66\") " pod="calico-system/calico-kube-controllers-8ccd5dff9-mk9lp" Mar 3 13:36:05.280560 kubelet[2794]: I0303 13:36:05.280552 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srpqc\" (UniqueName: \"kubernetes.io/projected/44c56449-ed93-445e-90b3-c1b25198a9bc-kube-api-access-srpqc\") pod \"coredns-674b8bbfcf-hk74p\" (UID: \"44c56449-ed93-445e-90b3-c1b25198a9bc\") " pod="kube-system/coredns-674b8bbfcf-hk74p" Mar 3 13:36:05.280596 kubelet[2794]: I0303 13:36:05.280564 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e0d1620-86b4-4e75-81d4-5493cf82e93e-config\") pod \"goldmane-5b85766d88-mpt25\" (UID: \"2e0d1620-86b4-4e75-81d4-5493cf82e93e\") " pod="calico-system/goldmane-5b85766d88-mpt25" Mar 3 13:36:05.280596 kubelet[2794]: I0303 
13:36:05.280574 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjv9\" (UniqueName: \"kubernetes.io/projected/2e0d1620-86b4-4e75-81d4-5493cf82e93e-kube-api-access-5fjv9\") pod \"goldmane-5b85766d88-mpt25\" (UID: \"2e0d1620-86b4-4e75-81d4-5493cf82e93e\") " pod="calico-system/goldmane-5b85766d88-mpt25" Mar 3 13:36:05.285303 systemd[1]: Created slice kubepods-besteffort-podfdd9d6fd_311f_4e0b_9c4d_ead1ab269e91.slice - libcontainer container kubepods-besteffort-podfdd9d6fd_311f_4e0b_9c4d_ead1ab269e91.slice. Mar 3 13:36:05.507668 containerd[1629]: time="2026-03-03T13:36:05.507565356Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 3 13:36:05.528777 containerd[1629]: time="2026-03-03T13:36:05.528735622Z" level=info msg="Container 19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:05.541868 containerd[1629]: time="2026-03-03T13:36:05.541842380Z" level=info msg="CreateContainer within sandbox \"418cd470c860554882d27e560f2ec0b484ca00bac82f6caba3c5ec4fe796f8f1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66\"" Mar 3 13:36:05.542730 containerd[1629]: time="2026-03-03T13:36:05.542699712Z" level=info msg="StartContainer for \"19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66\"" Mar 3 13:36:05.544919 containerd[1629]: time="2026-03-03T13:36:05.544885966Z" level=info msg="connecting to shim 19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66" address="unix:///run/containerd/s/70bc889b601e6dbea33f7f649c3f4aa9dfd95e4be7f7e4fb323eee11aa334d9b" protocol=ttrpc version=3 Mar 3 13:36:05.545666 containerd[1629]: time="2026-03-03T13:36:05.545002417Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-4t5vt,Uid:14132d38-7264-4753-918c-53ff70118c6f,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:05.563955 containerd[1629]: time="2026-03-03T13:36:05.563927687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjkbg,Uid:547c2410-b867-4848-8712-44c1c77daad1,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:05.566716 containerd[1629]: time="2026-03-03T13:36:05.566302763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-mpt25,Uid:2e0d1620-86b4-4e75-81d4-5493cf82e93e,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:05.569162 systemd[1]: Started cri-containerd-19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66.scope - libcontainer container 19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66. Mar 3 13:36:05.571467 containerd[1629]: time="2026-03-03T13:36:05.571449133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8ccd5dff9-mk9lp,Uid:15882429-b1cf-4254-b3ef-f6e453c09e66,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:05.579507 containerd[1629]: time="2026-03-03T13:36:05.579487430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-l4vrj,Uid:15401374-cd08-4d87-aaa5-00861a15882b,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:05.585796 containerd[1629]: time="2026-03-03T13:36:05.585313572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hk74p,Uid:44c56449-ed93-445e-90b3-c1b25198a9bc,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:05.591557 containerd[1629]: time="2026-03-03T13:36:05.591364846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b9ddbc87d-dn6cr,Uid:fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:05.687473 containerd[1629]: time="2026-03-03T13:36:05.687431651Z" level=info msg="StartContainer for \"19740e236de24a819ad39612867c92495ba82e6b8c6b83648fc623b39e071b66\" 
returns successfully" Mar 3 13:36:05.766176 containerd[1629]: time="2026-03-03T13:36:05.765933619Z" level=error msg="Failed to destroy network for sandbox \"f150b433cc0eafd499ba88f1f714760e04cee422d6fc3787a0e31eee2902b6b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.771246 containerd[1629]: time="2026-03-03T13:36:05.771210860Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8ccd5dff9-mk9lp,Uid:15882429-b1cf-4254-b3ef-f6e453c09e66,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f150b433cc0eafd499ba88f1f714760e04cee422d6fc3787a0e31eee2902b6b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.772004 kubelet[2794]: E0303 13:36:05.771418 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f150b433cc0eafd499ba88f1f714760e04cee422d6fc3787a0e31eee2902b6b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.772004 kubelet[2794]: E0303 13:36:05.771475 2794 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f150b433cc0eafd499ba88f1f714760e04cee422d6fc3787a0e31eee2902b6b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8ccd5dff9-mk9lp" Mar 3 13:36:05.772004 kubelet[2794]: E0303 
13:36:05.771493 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f150b433cc0eafd499ba88f1f714760e04cee422d6fc3787a0e31eee2902b6b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8ccd5dff9-mk9lp" Mar 3 13:36:05.772943 kubelet[2794]: E0303 13:36:05.771544 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8ccd5dff9-mk9lp_calico-system(15882429-b1cf-4254-b3ef-f6e453c09e66)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8ccd5dff9-mk9lp_calico-system(15882429-b1cf-4254-b3ef-f6e453c09e66)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f150b433cc0eafd499ba88f1f714760e04cee422d6fc3787a0e31eee2902b6b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8ccd5dff9-mk9lp" podUID="15882429-b1cf-4254-b3ef-f6e453c09e66" Mar 3 13:36:05.798388 containerd[1629]: time="2026-03-03T13:36:05.798269638Z" level=error msg="Failed to destroy network for sandbox \"f8fe56c50cb747fa1585f2408815b76c0dd0ecffef7b4d1846118b90515450c0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.800532 containerd[1629]: time="2026-03-03T13:36:05.800338842Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hk74p,Uid:44c56449-ed93-445e-90b3-c1b25198a9bc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"f8fe56c50cb747fa1585f2408815b76c0dd0ecffef7b4d1846118b90515450c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.802918 kubelet[2794]: E0303 13:36:05.800982 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8fe56c50cb747fa1585f2408815b76c0dd0ecffef7b4d1846118b90515450c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.802918 kubelet[2794]: E0303 13:36:05.802580 2794 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8fe56c50cb747fa1585f2408815b76c0dd0ecffef7b4d1846118b90515450c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hk74p" Mar 3 13:36:05.802918 kubelet[2794]: E0303 13:36:05.802609 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8fe56c50cb747fa1585f2408815b76c0dd0ecffef7b4d1846118b90515450c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hk74p" Mar 3 13:36:05.803042 kubelet[2794]: E0303 13:36:05.802669 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-hk74p_kube-system(44c56449-ed93-445e-90b3-c1b25198a9bc)\" with CreatePodSandboxError: \"Failed to create sandbox for 
pod \\\"coredns-674b8bbfcf-hk74p_kube-system(44c56449-ed93-445e-90b3-c1b25198a9bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f8fe56c50cb747fa1585f2408815b76c0dd0ecffef7b4d1846118b90515450c0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-hk74p" podUID="44c56449-ed93-445e-90b3-c1b25198a9bc" Mar 3 13:36:05.815630 containerd[1629]: time="2026-03-03T13:36:05.815600075Z" level=error msg="Failed to destroy network for sandbox \"ae52ace5a4afd4e9149c6d1e463f55560a01df5189655f70d50ca335c2fff949\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.817999 containerd[1629]: time="2026-03-03T13:36:05.817975970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjkbg,Uid:547c2410-b867-4848-8712-44c1c77daad1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae52ace5a4afd4e9149c6d1e463f55560a01df5189655f70d50ca335c2fff949\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.818457 kubelet[2794]: E0303 13:36:05.818434 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae52ace5a4afd4e9149c6d1e463f55560a01df5189655f70d50ca335c2fff949\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.819015 kubelet[2794]: E0303 13:36:05.818883 2794 kuberuntime_sandbox.go:70] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae52ace5a4afd4e9149c6d1e463f55560a01df5189655f70d50ca335c2fff949\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hjkbg" Mar 3 13:36:05.819015 kubelet[2794]: E0303 13:36:05.818941 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae52ace5a4afd4e9149c6d1e463f55560a01df5189655f70d50ca335c2fff949\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hjkbg" Mar 3 13:36:05.819015 kubelet[2794]: E0303 13:36:05.818986 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-hjkbg_kube-system(547c2410-b867-4848-8712-44c1c77daad1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-hjkbg_kube-system(547c2410-b867-4848-8712-44c1c77daad1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae52ace5a4afd4e9149c6d1e463f55560a01df5189655f70d50ca335c2fff949\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-hjkbg" podUID="547c2410-b867-4848-8712-44c1c77daad1" Mar 3 13:36:05.825338 containerd[1629]: time="2026-03-03T13:36:05.825317055Z" level=error msg="Failed to destroy network for sandbox \"253af431c393ab5d859fc8cbb713152b7e3884822e499e4803ad63f744eb343f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.826824 containerd[1629]: time="2026-03-03T13:36:05.826796619Z" level=error msg="Failed to destroy network for sandbox \"97b254f8ae7824c1f8571fe13f5627283882cf47d50512bd8c749d9f4eca9f7c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.827021 containerd[1629]: time="2026-03-03T13:36:05.826998130Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-l4vrj,Uid:15401374-cd08-4d87-aaa5-00861a15882b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"253af431c393ab5d859fc8cbb713152b7e3884822e499e4803ad63f744eb343f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.827303 kubelet[2794]: E0303 13:36:05.827281 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253af431c393ab5d859fc8cbb713152b7e3884822e499e4803ad63f744eb343f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.827303 kubelet[2794]: E0303 13:36:05.827318 2794 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253af431c393ab5d859fc8cbb713152b7e3884822e499e4803ad63f744eb343f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c5859fd4b-l4vrj" Mar 3 
13:36:05.827400 kubelet[2794]: E0303 13:36:05.827353 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253af431c393ab5d859fc8cbb713152b7e3884822e499e4803ad63f744eb343f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c5859fd4b-l4vrj" Mar 3 13:36:05.827400 kubelet[2794]: E0303 13:36:05.827386 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c5859fd4b-l4vrj_calico-system(15401374-cd08-4d87-aaa5-00861a15882b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c5859fd4b-l4vrj_calico-system(15401374-cd08-4d87-aaa5-00861a15882b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"253af431c393ab5d859fc8cbb713152b7e3884822e499e4803ad63f744eb343f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-c5859fd4b-l4vrj" podUID="15401374-cd08-4d87-aaa5-00861a15882b" Mar 3 13:36:05.828687 containerd[1629]: time="2026-03-03T13:36:05.828668433Z" level=error msg="Failed to destroy network for sandbox \"bc0900046b9594e9ff1925b9ebb19cee5c78de9b3df5ac333d3e2cd9c8688709\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.829321 containerd[1629]: time="2026-03-03T13:36:05.828939484Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-4t5vt,Uid:14132d38-7264-4753-918c-53ff70118c6f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"97b254f8ae7824c1f8571fe13f5627283882cf47d50512bd8c749d9f4eca9f7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.829384 kubelet[2794]: E0303 13:36:05.829171 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97b254f8ae7824c1f8571fe13f5627283882cf47d50512bd8c749d9f4eca9f7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.829384 kubelet[2794]: E0303 13:36:05.829191 2794 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97b254f8ae7824c1f8571fe13f5627283882cf47d50512bd8c749d9f4eca9f7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c5859fd4b-4t5vt" Mar 3 13:36:05.829384 kubelet[2794]: E0303 13:36:05.829252 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97b254f8ae7824c1f8571fe13f5627283882cf47d50512bd8c749d9f4eca9f7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c5859fd4b-4t5vt" Mar 3 13:36:05.829462 kubelet[2794]: E0303 13:36:05.829283 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c5859fd4b-4t5vt_calico-system(14132d38-7264-4753-918c-53ff70118c6f)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c5859fd4b-4t5vt_calico-system(14132d38-7264-4753-918c-53ff70118c6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"97b254f8ae7824c1f8571fe13f5627283882cf47d50512bd8c749d9f4eca9f7c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-c5859fd4b-4t5vt" podUID="14132d38-7264-4753-918c-53ff70118c6f" Mar 3 13:36:05.830882 containerd[1629]: time="2026-03-03T13:36:05.830706247Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b9ddbc87d-dn6cr,Uid:fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc0900046b9594e9ff1925b9ebb19cee5c78de9b3df5ac333d3e2cd9c8688709\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.830974 kubelet[2794]: E0303 13:36:05.830800 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc0900046b9594e9ff1925b9ebb19cee5c78de9b3df5ac333d3e2cd9c8688709\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.830974 kubelet[2794]: E0303 13:36:05.830819 2794 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc0900046b9594e9ff1925b9ebb19cee5c78de9b3df5ac333d3e2cd9c8688709\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b9ddbc87d-dn6cr" Mar 3 13:36:05.830974 kubelet[2794]: E0303 13:36:05.830831 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc0900046b9594e9ff1925b9ebb19cee5c78de9b3df5ac333d3e2cd9c8688709\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b9ddbc87d-dn6cr" Mar 3 13:36:05.831350 kubelet[2794]: E0303 13:36:05.830857 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6b9ddbc87d-dn6cr_calico-system(fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6b9ddbc87d-dn6cr_calico-system(fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc0900046b9594e9ff1925b9ebb19cee5c78de9b3df5ac333d3e2cd9c8688709\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6b9ddbc87d-dn6cr" podUID="fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.865 [INFO][3842] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.866 [INFO][3842] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" iface="eth0" netns="/var/run/netns/cni-3e8827a6-9fe8-34b7-d39e-ff25538067ca" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.866 [INFO][3842] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" iface="eth0" netns="/var/run/netns/cni-3e8827a6-9fe8-34b7-d39e-ff25538067ca" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.867 [INFO][3842] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" iface="eth0" netns="/var/run/netns/cni-3e8827a6-9fe8-34b7-d39e-ff25538067ca" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.867 [INFO][3842] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.867 [INFO][3842] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.886 [INFO][3892] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" HandleID="k8s-pod-network.2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Workload="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.886 [INFO][3892] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:05.900675 containerd[1629]: 2026-03-03 13:36:05.886 [INFO][3892] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 13:36:05.901751 containerd[1629]: 2026-03-03 13:36:05.894 [WARNING][3892] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" HandleID="k8s-pod-network.2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Workload="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:05.901751 containerd[1629]: 2026-03-03 13:36:05.894 [INFO][3892] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" HandleID="k8s-pod-network.2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Workload="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:05.901751 containerd[1629]: 2026-03-03 13:36:05.895 [INFO][3892] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:05.901751 containerd[1629]: 2026-03-03 13:36:05.898 [INFO][3842] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027" Mar 3 13:36:05.902845 containerd[1629]: time="2026-03-03T13:36:05.902814521Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-mpt25,Uid:2e0d1620-86b4-4e75-81d4-5493cf82e93e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:05.903682 kubelet[2794]: E0303 13:36:05.903082 2794 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 3 13:36:05.903682 kubelet[2794]: E0303 13:36:05.903153 2794 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-mpt25" Mar 3 13:36:05.903682 kubelet[2794]: E0303 13:36:05.903182 2794 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-mpt25" Mar 3 13:36:05.904208 kubelet[2794]: E0303 13:36:05.903255 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-mpt25_calico-system(2e0d1620-86b4-4e75-81d4-5493cf82e93e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-mpt25_calico-system(2e0d1620-86b4-4e75-81d4-5493cf82e93e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2cc60ccffcf5635b03cee9c7e422e10533eed91def09332500a9e59523e7d027\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-mpt25" podUID="2e0d1620-86b4-4e75-81d4-5493cf82e93e" Mar 3 13:36:06.341383 systemd[1]: Created slice kubepods-besteffort-pod7910b7ac_0de5_4bb6_95b6_143b053aaa8f.slice - libcontainer container 
kubepods-besteffort-pod7910b7ac_0de5_4bb6_95b6_143b053aaa8f.slice. Mar 3 13:36:06.343587 containerd[1629]: time="2026-03-03T13:36:06.343314764Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jzlgk,Uid:7910b7ac-0de5-4bb6-95b6-143b053aaa8f,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:06.428727 systemd-networkd[1492]: cali33512d546f3: Link UP Mar 3 13:36:06.429870 systemd-networkd[1492]: cali33512d546f3: Gained carrier Mar 3 13:36:06.444235 containerd[1629]: 2026-03-03 13:36:06.361 [ERROR][3913] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 3 13:36:06.444235 containerd[1629]: 2026-03-03 13:36:06.371 [INFO][3913] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0 csi-node-driver- calico-system 7910b7ac-0de5-4bb6-95b6-143b053aaa8f 684 0 2026-03-03 13:35:47 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 csi-node-driver-jzlgk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali33512d546f3 [] [] }} ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-" Mar 3 13:36:06.444235 containerd[1629]: 2026-03-03 13:36:06.371 [INFO][3913] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" 
WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.444235 containerd[1629]: 2026-03-03 13:36:06.391 [INFO][3925] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" HandleID="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Workload="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.396 [INFO][3925] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" HandleID="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Workload="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ee0b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-7-599052a073", "pod":"csi-node-driver-jzlgk", "timestamp":"2026-03-03 13:36:06.391323144 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001142c0)} Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.396 [INFO][3925] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.396 [INFO][3925] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.396 [INFO][3925] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.398 [INFO][3925] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.402 [INFO][3925] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.406 [INFO][3925] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.408 [INFO][3925] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.444524 containerd[1629]: 2026-03-03 13:36:06.409 [INFO][3925] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.409 [INFO][3925] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.411 [INFO][3925] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985 Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.414 [INFO][3925] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.418 [INFO][3925] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.97.193/26] block=192.168.97.192/26 handle="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.418 [INFO][3925] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.193/26] handle="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.418 [INFO][3925] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:06.445025 containerd[1629]: 2026-03-03 13:36:06.418 [INFO][3925] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.193/26] IPv6=[] ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" HandleID="k8s-pod-network.1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Workload="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.445152 containerd[1629]: 2026-03-03 13:36:06.420 [INFO][3913] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7910b7ac-0de5-4bb6-95b6-143b053aaa8f", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"csi-node-driver-jzlgk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.97.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali33512d546f3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:06.445199 containerd[1629]: 2026-03-03 13:36:06.420 [INFO][3913] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.193/32] ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.445199 containerd[1629]: 2026-03-03 13:36:06.420 [INFO][3913] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33512d546f3 ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.445199 containerd[1629]: 2026-03-03 13:36:06.429 [INFO][3913] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.445256 containerd[1629]: 2026-03-03 
13:36:06.429 [INFO][3913] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7910b7ac-0de5-4bb6-95b6-143b053aaa8f", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985", Pod:"csi-node-driver-jzlgk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.97.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali33512d546f3", MAC:"02:06:89:d1:96:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:06.445295 containerd[1629]: 2026-03-03 13:36:06.439 
[INFO][3913] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" Namespace="calico-system" Pod="csi-node-driver-jzlgk" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-csi--node--driver--jzlgk-eth0" Mar 3 13:36:06.480544 containerd[1629]: time="2026-03-03T13:36:06.480460689Z" level=info msg="connecting to shim 1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985" address="unix:///run/containerd/s/ccd2de3bd356fe3b129f7757ec800aba22e3cc1c94626b5ad1a94e520885717c" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:06.499540 containerd[1629]: time="2026-03-03T13:36:06.499509789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-mpt25,Uid:2e0d1620-86b4-4e75-81d4-5493cf82e93e,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:06.516321 systemd[1]: Started cri-containerd-1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985.scope - libcontainer container 1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985. Mar 3 13:36:06.526821 systemd[1]: run-netns-cni\x2d65cded64\x2d4d5d\x2d4b02\x2d6420\x2df4618b48e25e.mount: Deactivated successfully. Mar 3 13:36:06.528582 systemd[1]: run-netns-cni\x2db0b8505b\x2d12a6\x2df352\x2d3588\x2de30c7710d99c.mount: Deactivated successfully. 
Mar 3 13:36:06.567801 containerd[1629]: time="2026-03-03T13:36:06.567734121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jzlgk,Uid:7910b7ac-0de5-4bb6-95b6-143b053aaa8f,Namespace:calico-system,Attempt:0,} returns sandbox id \"1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985\"" Mar 3 13:36:06.571197 containerd[1629]: time="2026-03-03T13:36:06.570372967Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 3 13:36:06.590170 kubelet[2794]: I0303 13:36:06.590116 2794 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-nginx-config\") pod \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " Mar 3 13:36:06.590328 kubelet[2794]: I0303 13:36:06.590286 2794 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv8j4\" (UniqueName: \"kubernetes.io/projected/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-kube-api-access-zv8j4\") pod \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " Mar 3 13:36:06.590843 kubelet[2794]: I0303 13:36:06.590426 2794 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-backend-key-pair\") pod \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " Mar 3 13:36:06.590843 kubelet[2794]: I0303 13:36:06.590467 2794 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-ca-bundle\") pod \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\" (UID: \"fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91\") " Mar 3 13:36:06.593786 kubelet[2794]: I0303 13:36:06.593709 2794 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91" (UID: "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 3 13:36:06.595270 kubelet[2794]: I0303 13:36:06.595242 2794 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91" (UID: "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 3 13:36:06.601050 systemd[1]: var-lib-kubelet-pods-fdd9d6fd\x2d311f\x2d4e0b\x2d9c4d\x2dead1ab269e91-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzv8j4.mount: Deactivated successfully. Mar 3 13:36:06.604603 kubelet[2794]: I0303 13:36:06.603880 2794 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-kube-api-access-zv8j4" (OuterVolumeSpecName: "kube-api-access-zv8j4") pod "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91" (UID: "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91"). InnerVolumeSpecName "kube-api-access-zv8j4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 3 13:36:06.605299 systemd[1]: var-lib-kubelet-pods-fdd9d6fd\x2d311f\x2d4e0b\x2d9c4d\x2dead1ab269e91-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 3 13:36:06.606197 kubelet[2794]: I0303 13:36:06.606044 2794 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91" (UID: "fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 3 13:36:06.659257 systemd-networkd[1492]: cali1ba1b41eb56: Link UP Mar 3 13:36:06.660333 systemd-networkd[1492]: cali1ba1b41eb56: Gained carrier Mar 3 13:36:06.673034 kubelet[2794]: I0303 13:36:06.672965 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-r47qv" podStartSLOduration=3.690573086 podStartE2EDuration="20.67290019s" podCreationTimestamp="2026-03-03 13:35:46 +0000 UTC" firstStartedPulling="2026-03-03 13:35:47.525671337 +0000 UTC m=+17.288789592" lastFinishedPulling="2026-03-03 13:36:04.507998441 +0000 UTC m=+34.271116696" observedRunningTime="2026-03-03 13:36:06.536399836 +0000 UTC m=+36.299518131" watchObservedRunningTime="2026-03-03 13:36:06.67290019 +0000 UTC m=+36.436018455" Mar 3 13:36:06.674068 containerd[1629]: 2026-03-03 13:36:06.566 [ERROR][3972] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 3 13:36:06.674068 containerd[1629]: 2026-03-03 13:36:06.581 [INFO][3972] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0 goldmane-5b85766d88- calico-system 2e0d1620-86b4-4e75-81d4-5493cf82e93e 850 0 2026-03-03 13:35:46 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 goldmane-5b85766d88-mpt25 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali1ba1b41eb56 [] [] }} ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-" Mar 3 13:36:06.674068 containerd[1629]: 2026-03-03 13:36:06.581 [INFO][3972] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.674068 containerd[1629]: 2026-03-03 13:36:06.619 [INFO][3998] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" HandleID="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Workload="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.627 [INFO][3998] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" HandleID="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Workload="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e7e80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-7-599052a073", "pod":"goldmane-5b85766d88-mpt25", "timestamp":"2026-03-03 13:36:06.61998749 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00016a6e0)} Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.627 [INFO][3998] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.627 [INFO][3998] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.628 [INFO][3998] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.630 [INFO][3998] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.635 [INFO][3998] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.639 [INFO][3998] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.641 [INFO][3998] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674345 containerd[1629]: 2026-03-03 13:36:06.642 [INFO][3998] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.642 [INFO][3998] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.643 [INFO][3998] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd 
Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.647 [INFO][3998] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.652 [INFO][3998] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.97.194/26] block=192.168.97.192/26 handle="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.652 [INFO][3998] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.194/26] handle="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.652 [INFO][3998] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:06.674753 containerd[1629]: 2026-03-03 13:36:06.652 [INFO][3998] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.194/26] IPv6=[] ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" HandleID="k8s-pod-network.8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Workload="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.674951 containerd[1629]: 2026-03-03 13:36:06.655 [INFO][3972] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0", GenerateName:"goldmane-5b85766d88-", 
Namespace:"calico-system", SelfLink:"", UID:"2e0d1620-86b4-4e75-81d4-5493cf82e93e", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"goldmane-5b85766d88-mpt25", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.97.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1ba1b41eb56", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:06.675053 containerd[1629]: 2026-03-03 13:36:06.655 [INFO][3972] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.194/32] ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.675053 containerd[1629]: 2026-03-03 13:36:06.655 [INFO][3972] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1ba1b41eb56 ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.675053 containerd[1629]: 2026-03-03 13:36:06.659 
[INFO][3972] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.675129 containerd[1629]: 2026-03-03 13:36:06.659 [INFO][3972] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"2e0d1620-86b4-4e75-81d4-5493cf82e93e", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd", Pod:"goldmane-5b85766d88-mpt25", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.97.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"cali1ba1b41eb56", MAC:"06:98:5a:8e:62:b7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:06.675195 containerd[1629]: 2026-03-03 13:36:06.669 [INFO][3972] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" Namespace="calico-system" Pod="goldmane-5b85766d88-mpt25" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-goldmane--5b85766d88--mpt25-eth0" Mar 3 13:36:06.692197 kubelet[2794]: I0303 13:36:06.691991 2794 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-ca-bundle\") on node \"ci-4459-2-4-7-599052a073\" DevicePath \"\"" Mar 3 13:36:06.692197 kubelet[2794]: I0303 13:36:06.692043 2794 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-nginx-config\") on node \"ci-4459-2-4-7-599052a073\" DevicePath \"\"" Mar 3 13:36:06.692197 kubelet[2794]: I0303 13:36:06.692060 2794 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zv8j4\" (UniqueName: \"kubernetes.io/projected/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-kube-api-access-zv8j4\") on node \"ci-4459-2-4-7-599052a073\" DevicePath \"\"" Mar 3 13:36:06.692197 kubelet[2794]: I0303 13:36:06.692073 2794 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91-whisker-backend-key-pair\") on node \"ci-4459-2-4-7-599052a073\" DevicePath \"\"" Mar 3 13:36:06.696647 containerd[1629]: time="2026-03-03T13:36:06.696572749Z" level=info msg="connecting to shim 8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd" 
address="unix:///run/containerd/s/f53e8f19ab7e545cb93e91324339add6298e9a26ef0379502290c0e505c09f88" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:06.730193 systemd[1]: Started cri-containerd-8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd.scope - libcontainer container 8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd. Mar 3 13:36:06.795234 containerd[1629]: time="2026-03-03T13:36:06.795165035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-mpt25,Uid:2e0d1620-86b4-4e75-81d4-5493cf82e93e,Namespace:calico-system,Attempt:0,} returns sandbox id \"8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd\"" Mar 3 13:36:07.502044 kubelet[2794]: I0303 13:36:07.501770 2794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:07.507164 systemd[1]: Removed slice kubepods-besteffort-podfdd9d6fd_311f_4e0b_9c4d_ead1ab269e91.slice - libcontainer container kubepods-besteffort-podfdd9d6fd_311f_4e0b_9c4d_ead1ab269e91.slice. Mar 3 13:36:07.578581 systemd[1]: Created slice kubepods-besteffort-pod947b0e80_edd7_413e_9fd6_aebf6178044c.slice - libcontainer container kubepods-besteffort-pod947b0e80_edd7_413e_9fd6_aebf6178044c.slice. 
Mar 3 13:36:07.599016 kubelet[2794]: I0303 13:36:07.598969 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/947b0e80-edd7-413e-9fd6-aebf6178044c-nginx-config\") pod \"whisker-5f4897f75-tx9xp\" (UID: \"947b0e80-edd7-413e-9fd6-aebf6178044c\") " pod="calico-system/whisker-5f4897f75-tx9xp" Mar 3 13:36:07.599016 kubelet[2794]: I0303 13:36:07.599007 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/947b0e80-edd7-413e-9fd6-aebf6178044c-whisker-backend-key-pair\") pod \"whisker-5f4897f75-tx9xp\" (UID: \"947b0e80-edd7-413e-9fd6-aebf6178044c\") " pod="calico-system/whisker-5f4897f75-tx9xp" Mar 3 13:36:07.599174 kubelet[2794]: I0303 13:36:07.599031 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/947b0e80-edd7-413e-9fd6-aebf6178044c-whisker-ca-bundle\") pod \"whisker-5f4897f75-tx9xp\" (UID: \"947b0e80-edd7-413e-9fd6-aebf6178044c\") " pod="calico-system/whisker-5f4897f75-tx9xp" Mar 3 13:36:07.599174 kubelet[2794]: I0303 13:36:07.599055 2794 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4h5\" (UniqueName: \"kubernetes.io/projected/947b0e80-edd7-413e-9fd6-aebf6178044c-kube-api-access-9p4h5\") pod \"whisker-5f4897f75-tx9xp\" (UID: \"947b0e80-edd7-413e-9fd6-aebf6178044c\") " pod="calico-system/whisker-5f4897f75-tx9xp" Mar 3 13:36:07.839287 systemd-networkd[1492]: cali33512d546f3: Gained IPv6LL Mar 3 13:36:07.885638 containerd[1629]: time="2026-03-03T13:36:07.885173508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f4897f75-tx9xp,Uid:947b0e80-edd7-413e-9fd6-aebf6178044c,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:08.017199 systemd-networkd[1492]: 
cali94f19c7e093: Link UP Mar 3 13:36:08.018157 systemd-networkd[1492]: cali94f19c7e093: Gained carrier Mar 3 13:36:08.031039 systemd-networkd[1492]: cali1ba1b41eb56: Gained IPv6LL Mar 3 13:36:08.040152 containerd[1629]: 2026-03-03 13:36:07.917 [ERROR][4161] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 3 13:36:08.040152 containerd[1629]: 2026-03-03 13:36:07.931 [INFO][4161] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0 whisker-5f4897f75- calico-system 947b0e80-edd7-413e-9fd6-aebf6178044c 885 0 2026-03-03 13:36:07 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5f4897f75 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 whisker-5f4897f75-tx9xp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali94f19c7e093 [] [] }} ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-" Mar 3 13:36:08.040152 containerd[1629]: 2026-03-03 13:36:07.931 [INFO][4161] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.040152 containerd[1629]: 2026-03-03 13:36:07.972 [INFO][4173] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" 
HandleID="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Workload="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.980 [INFO][4173] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" HandleID="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Workload="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fc110), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-7-599052a073", "pod":"whisker-5f4897f75-tx9xp", "timestamp":"2026-03-03 13:36:07.972977447 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00054c000)} Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.980 [INFO][4173] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.980 [INFO][4173] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.980 [INFO][4173] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.983 [INFO][4173] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.987 [INFO][4173] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.992 [INFO][4173] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.994 [INFO][4173] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.041245 containerd[1629]: 2026-03-03 13:36:07.996 [INFO][4173] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:07.996 [INFO][4173] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:07.998 [INFO][4173] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571 Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:08.003 [INFO][4173] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:08.007 [INFO][4173] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.97.195/26] block=192.168.97.192/26 handle="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:08.008 [INFO][4173] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.195/26] handle="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:08.008 [INFO][4173] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:08.042152 containerd[1629]: 2026-03-03 13:36:08.008 [INFO][4173] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.195/26] IPv6=[] ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" HandleID="k8s-pod-network.0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Workload="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.043992 containerd[1629]: 2026-03-03 13:36:08.013 [INFO][4161] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0", GenerateName:"whisker-5f4897f75-", Namespace:"calico-system", SelfLink:"", UID:"947b0e80-edd7-413e-9fd6-aebf6178044c", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 36, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5f4897f75", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"whisker-5f4897f75-tx9xp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.97.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali94f19c7e093", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:08.043992 containerd[1629]: 2026-03-03 13:36:08.013 [INFO][4161] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.195/32] ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.044109 containerd[1629]: 2026-03-03 13:36:08.013 [INFO][4161] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali94f19c7e093 ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.044109 containerd[1629]: 2026-03-03 13:36:08.019 [INFO][4161] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.044176 containerd[1629]: 2026-03-03 13:36:08.019 [INFO][4161] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0", GenerateName:"whisker-5f4897f75-", Namespace:"calico-system", SelfLink:"", UID:"947b0e80-edd7-413e-9fd6-aebf6178044c", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 36, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5f4897f75", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571", Pod:"whisker-5f4897f75-tx9xp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.97.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali94f19c7e093", MAC:"32:ea:67:08:47:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:08.044460 containerd[1629]: 2026-03-03 13:36:08.035 [INFO][4161] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" 
Namespace="calico-system" Pod="whisker-5f4897f75-tx9xp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-whisker--5f4897f75--tx9xp-eth0" Mar 3 13:36:08.067581 containerd[1629]: time="2026-03-03T13:36:08.067535395Z" level=info msg="connecting to shim 0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571" address="unix:///run/containerd/s/a0f6076acfdbb03457724678adb7f9fadaea36d9456d0bfa35139f71f1db118b" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:08.099053 systemd[1]: Started cri-containerd-0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571.scope - libcontainer container 0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571. Mar 3 13:36:08.141739 containerd[1629]: time="2026-03-03T13:36:08.141693732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f4897f75-tx9xp,Uid:947b0e80-edd7-413e-9fd6-aebf6178044c,Namespace:calico-system,Attempt:0,} returns sandbox id \"0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571\"" Mar 3 13:36:08.343369 kubelet[2794]: I0303 13:36:08.343305 2794 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91" path="/var/lib/kubelet/pods/fdd9d6fd-311f-4e0b-9c4d-ead1ab269e91/volumes" Mar 3 13:36:09.247091 systemd-networkd[1492]: cali94f19c7e093: Gained IPv6LL Mar 3 13:36:09.412534 containerd[1629]: time="2026-03-03T13:36:09.412490480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:09.413851 containerd[1629]: time="2026-03-03T13:36:09.413821523Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 3 13:36:09.415926 containerd[1629]: time="2026-03-03T13:36:09.414538854Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 
13:36:09.418007 containerd[1629]: time="2026-03-03T13:36:09.417974941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:09.418861 containerd[1629]: time="2026-03-03T13:36:09.418838902Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 2.848408775s" Mar 3 13:36:09.418900 containerd[1629]: time="2026-03-03T13:36:09.418862802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 3 13:36:09.419944 containerd[1629]: time="2026-03-03T13:36:09.419873834Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 3 13:36:09.422838 containerd[1629]: time="2026-03-03T13:36:09.422812050Z" level=info msg="CreateContainer within sandbox \"1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 3 13:36:09.434041 containerd[1629]: time="2026-03-03T13:36:09.433996922Z" level=info msg="Container 963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:09.446144 containerd[1629]: time="2026-03-03T13:36:09.446104795Z" level=info msg="CreateContainer within sandbox \"1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328\"" Mar 3 13:36:09.446864 containerd[1629]: time="2026-03-03T13:36:09.446795137Z" level=info 
msg="StartContainer for \"963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328\"" Mar 3 13:36:09.448143 containerd[1629]: time="2026-03-03T13:36:09.448114889Z" level=info msg="connecting to shim 963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328" address="unix:///run/containerd/s/ccd2de3bd356fe3b129f7757ec800aba22e3cc1c94626b5ad1a94e520885717c" protocol=ttrpc version=3 Mar 3 13:36:09.470038 systemd[1]: Started cri-containerd-963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328.scope - libcontainer container 963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328. Mar 3 13:36:09.528662 containerd[1629]: time="2026-03-03T13:36:09.528559485Z" level=info msg="StartContainer for \"963dfde22822d7c0ad8019381dca3e54092ec15ed1faaae47aeac68a9578c328\" returns successfully" Mar 3 13:36:11.293118 kubelet[2794]: I0303 13:36:11.292656 2794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:12.081104 systemd-networkd[1492]: vxlan.calico: Link UP Mar 3 13:36:12.081115 systemd-networkd[1492]: vxlan.calico: Gained carrier Mar 3 13:36:12.576283 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3094302971.mount: Deactivated successfully. 
Mar 3 13:36:12.883341 containerd[1629]: time="2026-03-03T13:36:12.883207206Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:12.884434 containerd[1629]: time="2026-03-03T13:36:12.884398879Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 3 13:36:12.885648 containerd[1629]: time="2026-03-03T13:36:12.885602380Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:12.887374 containerd[1629]: time="2026-03-03T13:36:12.887340403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:12.887964 containerd[1629]: time="2026-03-03T13:36:12.887928595Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 3.468011571s" Mar 3 13:36:12.887964 containerd[1629]: time="2026-03-03T13:36:12.887964385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 3 13:36:12.890739 containerd[1629]: time="2026-03-03T13:36:12.890554429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 3 13:36:12.896173 containerd[1629]: time="2026-03-03T13:36:12.896138589Z" level=info msg="CreateContainer within sandbox \"8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd\" for 
container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 3 13:36:12.906128 containerd[1629]: time="2026-03-03T13:36:12.906094948Z" level=info msg="Container 2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:12.916374 containerd[1629]: time="2026-03-03T13:36:12.916310867Z" level=info msg="CreateContainer within sandbox \"8123f87a4f906fa2a4a67bef5de24c78b7157b61b27e541693cde0a6e3d2e9bd\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5\"" Mar 3 13:36:12.917275 containerd[1629]: time="2026-03-03T13:36:12.917213288Z" level=info msg="StartContainer for \"2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5\"" Mar 3 13:36:12.918320 containerd[1629]: time="2026-03-03T13:36:12.918280820Z" level=info msg="connecting to shim 2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5" address="unix:///run/containerd/s/f53e8f19ab7e545cb93e91324339add6298e9a26ef0379502290c0e505c09f88" protocol=ttrpc version=3 Mar 3 13:36:12.943115 systemd[1]: Started cri-containerd-2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5.scope - libcontainer container 2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5. 
Mar 3 13:36:12.995097 containerd[1629]: time="2026-03-03T13:36:12.995038019Z" level=info msg="StartContainer for \"2068f3d0304f6c36a4880b21a6057e64d2585f1c35f4bba25250873bec77efc5\" returns successfully" Mar 3 13:36:13.215251 systemd-networkd[1492]: vxlan.calico: Gained IPv6LL Mar 3 13:36:13.556578 kubelet[2794]: I0303 13:36:13.554016 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-mpt25" podStartSLOduration=21.460531823 podStartE2EDuration="27.553783384s" podCreationTimestamp="2026-03-03 13:35:46 +0000 UTC" firstStartedPulling="2026-03-03 13:36:06.796520207 +0000 UTC m=+36.559638472" lastFinishedPulling="2026-03-03 13:36:12.889771778 +0000 UTC m=+42.652890033" observedRunningTime="2026-03-03 13:36:13.553194534 +0000 UTC m=+43.316312889" watchObservedRunningTime="2026-03-03 13:36:13.553783384 +0000 UTC m=+43.316901689" Mar 3 13:36:14.625318 kubelet[2794]: I0303 13:36:14.625280 2794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:14.718276 containerd[1629]: time="2026-03-03T13:36:14.718217683Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:14.721071 containerd[1629]: time="2026-03-03T13:36:14.721023348Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 3 13:36:14.722544 containerd[1629]: time="2026-03-03T13:36:14.722523381Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:14.725319 containerd[1629]: time="2026-03-03T13:36:14.725296596Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 
13:36:14.726854 containerd[1629]: time="2026-03-03T13:36:14.726714859Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.836131509s" Mar 3 13:36:14.727165 containerd[1629]: time="2026-03-03T13:36:14.727002409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 3 13:36:14.728720 containerd[1629]: time="2026-03-03T13:36:14.728500422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 3 13:36:14.732182 containerd[1629]: time="2026-03-03T13:36:14.732145217Z" level=info msg="CreateContainer within sandbox \"0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 3 13:36:14.741562 containerd[1629]: time="2026-03-03T13:36:14.741042313Z" level=info msg="Container f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:14.749749 containerd[1629]: time="2026-03-03T13:36:14.749717758Z" level=info msg="CreateContainer within sandbox \"0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268\"" Mar 3 13:36:14.750291 containerd[1629]: time="2026-03-03T13:36:14.750212359Z" level=info msg="StartContainer for \"f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268\"" Mar 3 13:36:14.751369 containerd[1629]: time="2026-03-03T13:36:14.751352001Z" level=info msg="connecting to shim 
f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268" address="unix:///run/containerd/s/a0f6076acfdbb03457724678adb7f9fadaea36d9456d0bfa35139f71f1db118b" protocol=ttrpc version=3 Mar 3 13:36:14.780030 systemd[1]: Started cri-containerd-f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268.scope - libcontainer container f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268. Mar 3 13:36:14.823283 containerd[1629]: time="2026-03-03T13:36:14.823256437Z" level=info msg="StartContainer for \"f23d8af7cf1063260a5c9072f324da9bf1c8e8cab93d772afb1d0a9234fa8268\" returns successfully" Mar 3 13:36:16.338081 containerd[1629]: time="2026-03-03T13:36:16.337993171Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hk74p,Uid:44c56449-ed93-445e-90b3-c1b25198a9bc,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:16.474853 systemd-networkd[1492]: cali00e8b369213: Link UP Mar 3 13:36:16.475933 systemd-networkd[1492]: cali00e8b369213: Gained carrier Mar 3 13:36:16.497600 containerd[1629]: 2026-03-03 13:36:16.407 [INFO][4651] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0 coredns-674b8bbfcf- kube-system 44c56449-ed93-445e-90b3-c1b25198a9bc 825 0 2026-03-03 13:35:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 coredns-674b8bbfcf-hk74p eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali00e8b369213 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-" Mar 3 13:36:16.497600 containerd[1629]: 
2026-03-03 13:36:16.407 [INFO][4651] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.497600 containerd[1629]: 2026-03-03 13:36:16.431 [INFO][4662] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" HandleID="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Workload="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.438 [INFO][4662] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" HandleID="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Workload="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277e80), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-7-599052a073", "pod":"coredns-674b8bbfcf-hk74p", "timestamp":"2026-03-03 13:36:16.4317483 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003291e0)} Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.438 [INFO][4662] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.438 [INFO][4662] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.438 [INFO][4662] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.441 [INFO][4662] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.445 [INFO][4662] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.450 [INFO][4662] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.452 [INFO][4662] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.497809 containerd[1629]: 2026-03-03 13:36:16.454 [INFO][4662] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.454 [INFO][4662] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.455 [INFO][4662] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340 Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.461 [INFO][4662] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.467 [INFO][4662] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.97.196/26] block=192.168.97.192/26 handle="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.467 [INFO][4662] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.196/26] handle="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.467 [INFO][4662] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:16.498253 containerd[1629]: 2026-03-03 13:36:16.467 [INFO][4662] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.196/26] IPv6=[] ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" HandleID="k8s-pod-network.b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Workload="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.498421 containerd[1629]: 2026-03-03 13:36:16.471 [INFO][4651] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"44c56449-ed93-445e-90b3-c1b25198a9bc", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"coredns-674b8bbfcf-hk74p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali00e8b369213", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:16.498421 containerd[1629]: 2026-03-03 13:36:16.471 [INFO][4651] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.196/32] ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.498421 containerd[1629]: 2026-03-03 13:36:16.471 [INFO][4651] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali00e8b369213 ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.498421 containerd[1629]: 2026-03-03 13:36:16.476 [INFO][4651] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.498421 containerd[1629]: 2026-03-03 13:36:16.477 [INFO][4651] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"44c56449-ed93-445e-90b3-c1b25198a9bc", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340", Pod:"coredns-674b8bbfcf-hk74p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali00e8b369213", 
MAC:"d2:29:d0:c4:00:1b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:16.498421 containerd[1629]: 2026-03-03 13:36:16.494 [INFO][4651] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" Namespace="kube-system" Pod="coredns-674b8bbfcf-hk74p" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hk74p-eth0" Mar 3 13:36:16.526966 containerd[1629]: time="2026-03-03T13:36:16.526582629Z" level=info msg="connecting to shim b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340" address="unix:///run/containerd/s/08c67e4df9c91cf12ff372f6727384a7992d4b9c496841265a84f3da96b72bce" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:16.564937 systemd[1]: Started cri-containerd-b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340.scope - libcontainer container b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340. 
Mar 3 13:36:16.618592 containerd[1629]: time="2026-03-03T13:36:16.618442254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hk74p,Uid:44c56449-ed93-445e-90b3-c1b25198a9bc,Namespace:kube-system,Attempt:0,} returns sandbox id \"b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340\"" Mar 3 13:36:16.624109 containerd[1629]: time="2026-03-03T13:36:16.624089144Z" level=info msg="CreateContainer within sandbox \"b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 3 13:36:16.637110 containerd[1629]: time="2026-03-03T13:36:16.637046815Z" level=info msg="Container fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:16.640984 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3820103824.mount: Deactivated successfully. Mar 3 13:36:16.647618 containerd[1629]: time="2026-03-03T13:36:16.647596273Z" level=info msg="CreateContainer within sandbox \"b184daf3d3502e9f8da30d0abc8246bc315735d2eded852cba8783ddcbb45340\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e\"" Mar 3 13:36:16.649755 containerd[1629]: time="2026-03-03T13:36:16.649718697Z" level=info msg="StartContainer for \"fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e\"" Mar 3 13:36:16.650758 containerd[1629]: time="2026-03-03T13:36:16.650724088Z" level=info msg="connecting to shim fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e" address="unix:///run/containerd/s/08c67e4df9c91cf12ff372f6727384a7992d4b9c496841265a84f3da96b72bce" protocol=ttrpc version=3 Mar 3 13:36:16.669033 systemd[1]: Started cri-containerd-fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e.scope - libcontainer container fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e. 
Mar 3 13:36:16.701711 containerd[1629]: time="2026-03-03T13:36:16.701648884Z" level=info msg="StartContainer for \"fd14508109ba73e7f014dda09e582c6e743b58c3862ce8f4605edeaee995d90e\" returns successfully" Mar 3 13:36:17.335143 containerd[1629]: time="2026-03-03T13:36:17.334671653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8ccd5dff9-mk9lp,Uid:15882429-b1cf-4254-b3ef-f6e453c09e66,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:17.335143 containerd[1629]: time="2026-03-03T13:36:17.334888664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-l4vrj,Uid:15401374-cd08-4d87-aaa5-00861a15882b,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:17.492188 systemd-networkd[1492]: cali9e5a0aa92e7: Link UP Mar 3 13:36:17.494240 systemd-networkd[1492]: cali9e5a0aa92e7: Gained carrier Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.397 [INFO][4779] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0 calico-kube-controllers-8ccd5dff9- calico-system 15882429-b1cf-4254-b3ef-f6e453c09e66 824 0 2026-03-03 13:35:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8ccd5dff9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 calico-kube-controllers-8ccd5dff9-mk9lp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9e5a0aa92e7 [] [] }} ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 
13:36:17.398 [INFO][4779] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.450 [INFO][4801] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" HandleID="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Workload="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.457 [INFO][4801] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" HandleID="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Workload="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277f00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-7-599052a073", "pod":"calico-kube-controllers-8ccd5dff9-mk9lp", "timestamp":"2026-03-03 13:36:17.450070684 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000112f20)} Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.457 [INFO][4801] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.458 [INFO][4801] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.458 [INFO][4801] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.460 [INFO][4801] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.464 [INFO][4801] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.469 [INFO][4801] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.470 [INFO][4801] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.472 [INFO][4801] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.472 [INFO][4801] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.474 [INFO][4801] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278 Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.479 [INFO][4801] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.483 [INFO][4801] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.97.197/26] block=192.168.97.192/26 handle="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.483 [INFO][4801] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.197/26] handle="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.483 [INFO][4801] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:17.516108 containerd[1629]: 2026-03-03 13:36:17.483 [INFO][4801] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.197/26] IPv6=[] ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" HandleID="k8s-pod-network.8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Workload="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.516788 containerd[1629]: 2026-03-03 13:36:17.487 [INFO][4779] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0", GenerateName:"calico-kube-controllers-8ccd5dff9-", Namespace:"calico-system", SelfLink:"", UID:"15882429-b1cf-4254-b3ef-f6e453c09e66", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8ccd5dff9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"calico-kube-controllers-8ccd5dff9-mk9lp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.97.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9e5a0aa92e7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:17.516788 containerd[1629]: 2026-03-03 13:36:17.487 [INFO][4779] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.197/32] ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.516788 containerd[1629]: 2026-03-03 13:36:17.487 [INFO][4779] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e5a0aa92e7 ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.516788 containerd[1629]: 2026-03-03 13:36:17.495 [INFO][4779] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.516788 containerd[1629]: 2026-03-03 13:36:17.496 [INFO][4779] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0", GenerateName:"calico-kube-controllers-8ccd5dff9-", Namespace:"calico-system", SelfLink:"", UID:"15882429-b1cf-4254-b3ef-f6e453c09e66", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8ccd5dff9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278", Pod:"calico-kube-controllers-8ccd5dff9-mk9lp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.97.197/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9e5a0aa92e7", MAC:"02:f5:2f:e1:c4:1a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:17.516788 containerd[1629]: 2026-03-03 13:36:17.513 [INFO][4779] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" Namespace="calico-system" Pod="calico-kube-controllers-8ccd5dff9-mk9lp" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--kube--controllers--8ccd5dff9--mk9lp-eth0" Mar 3 13:36:17.551360 containerd[1629]: time="2026-03-03T13:36:17.549541850Z" level=info msg="connecting to shim 8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278" address="unix:///run/containerd/s/a99d924b260c1a19c144bf575cb18f8dac15e289917a2de1e57ef30bad1c05c0" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:17.564876 kubelet[2794]: I0303 13:36:17.564031 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-hk74p" podStartSLOduration=40.564017674 podStartE2EDuration="40.564017674s" podCreationTimestamp="2026-03-03 13:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:36:17.563651973 +0000 UTC m=+47.326770228" watchObservedRunningTime="2026-03-03 13:36:17.564017674 +0000 UTC m=+47.327135929" Mar 3 13:36:17.587160 systemd[1]: Started cri-containerd-8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278.scope - libcontainer container 8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278. 
Mar 3 13:36:17.642979 systemd-networkd[1492]: cali55c404b9027: Link UP Mar 3 13:36:17.644657 systemd-networkd[1492]: cali55c404b9027: Gained carrier Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.412 [INFO][4780] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0 calico-apiserver-c5859fd4b- calico-system 15401374-cd08-4d87-aaa5-00861a15882b 823 0 2026-03-03 13:35:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c5859fd4b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 calico-apiserver-c5859fd4b-l4vrj eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali55c404b9027 [] [] }} ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.412 [INFO][4780] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.451 [INFO][4806] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" HandleID="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Workload="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.665428 containerd[1629]: 
2026-03-03 13:36:17.459 [INFO][4806] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" HandleID="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Workload="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277dc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-7-599052a073", "pod":"calico-apiserver-c5859fd4b-l4vrj", "timestamp":"2026-03-03 13:36:17.451745448 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001fe9a0)} Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.459 [INFO][4806] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.483 [INFO][4806] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.483 [INFO][4806] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.567 [INFO][4806] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.585 [INFO][4806] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.599 [INFO][4806] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.603 [INFO][4806] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.606 [INFO][4806] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.606 [INFO][4806] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.609 [INFO][4806] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.614 [INFO][4806] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.623 [INFO][4806] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.97.198/26] block=192.168.97.192/26 handle="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.623 [INFO][4806] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.198/26] handle="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.624 [INFO][4806] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:17.665428 containerd[1629]: 2026-03-03 13:36:17.624 [INFO][4806] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.198/26] IPv6=[] ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" HandleID="k8s-pod-network.13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Workload="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.665846 containerd[1629]: 2026-03-03 13:36:17.636 [INFO][4780] cni-plugin/k8s.go 418: Populated endpoint ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0", GenerateName:"calico-apiserver-c5859fd4b-", Namespace:"calico-system", SelfLink:"", UID:"15401374-cd08-4d87-aaa5-00861a15882b", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"c5859fd4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"calico-apiserver-c5859fd4b-l4vrj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali55c404b9027", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:17.665846 containerd[1629]: 2026-03-03 13:36:17.637 [INFO][4780] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.198/32] ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.665846 containerd[1629]: 2026-03-03 13:36:17.637 [INFO][4780] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali55c404b9027 ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.665846 containerd[1629]: 2026-03-03 13:36:17.646 [INFO][4780] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" 
WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.665846 containerd[1629]: 2026-03-03 13:36:17.647 [INFO][4780] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0", GenerateName:"calico-apiserver-c5859fd4b-", Namespace:"calico-system", SelfLink:"", UID:"15401374-cd08-4d87-aaa5-00861a15882b", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5859fd4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb", Pod:"calico-apiserver-c5859fd4b-l4vrj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali55c404b9027", MAC:"42:fc:c1:fd:52:48", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:17.665846 containerd[1629]: 2026-03-03 13:36:17.659 [INFO][4780] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-l4vrj" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--l4vrj-eth0" Mar 3 13:36:17.706157 containerd[1629]: time="2026-03-03T13:36:17.706116880Z" level=info msg="connecting to shim 13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb" address="unix:///run/containerd/s/a6dcf9d13db60adc3815b1a2b4ca832f4f3c68358de6f8fc99cc041fe40eff22" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:17.737422 systemd[1]: Started cri-containerd-13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb.scope - libcontainer container 13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb. 
Mar 3 13:36:17.761001 containerd[1629]: time="2026-03-03T13:36:17.760942530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8ccd5dff9-mk9lp,Uid:15882429-b1cf-4254-b3ef-f6e453c09e66,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278\"" Mar 3 13:36:17.834384 containerd[1629]: time="2026-03-03T13:36:17.834329632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-l4vrj,Uid:15401374-cd08-4d87-aaa5-00861a15882b,Namespace:calico-system,Attempt:0,} returns sandbox id \"13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb\"" Mar 3 13:36:18.528117 systemd-networkd[1492]: cali00e8b369213: Gained IPv6LL Mar 3 13:36:18.678219 containerd[1629]: time="2026-03-03T13:36:18.678164375Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:18.679425 containerd[1629]: time="2026-03-03T13:36:18.679333677Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 3 13:36:18.680620 containerd[1629]: time="2026-03-03T13:36:18.680590969Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:18.683378 containerd[1629]: time="2026-03-03T13:36:18.683329443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:18.684360 containerd[1629]: time="2026-03-03T13:36:18.683922794Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id 
\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 3.955380542s" Mar 3 13:36:18.684360 containerd[1629]: time="2026-03-03T13:36:18.683954384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 3 13:36:18.685713 containerd[1629]: time="2026-03-03T13:36:18.685649407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 3 13:36:18.689583 containerd[1629]: time="2026-03-03T13:36:18.689563524Z" level=info msg="CreateContainer within sandbox \"1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 3 13:36:18.701924 containerd[1629]: time="2026-03-03T13:36:18.698969559Z" level=info msg="Container 78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:18.713053 containerd[1629]: time="2026-03-03T13:36:18.712995612Z" level=info msg="CreateContainer within sandbox \"1559eae128532919252a24f42229d5b34109a8a27d566dd57a947a7fcf84b985\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c\"" Mar 3 13:36:18.715944 containerd[1629]: time="2026-03-03T13:36:18.715703836Z" level=info msg="StartContainer for \"78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c\"" Mar 3 13:36:18.717747 containerd[1629]: time="2026-03-03T13:36:18.716877328Z" level=info msg="connecting to shim 78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c" 
address="unix:///run/containerd/s/ccd2de3bd356fe3b129f7757ec800aba22e3cc1c94626b5ad1a94e520885717c" protocol=ttrpc version=3 Mar 3 13:36:18.747057 systemd[1]: Started cri-containerd-78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c.scope - libcontainer container 78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c. Mar 3 13:36:18.903893 containerd[1629]: time="2026-03-03T13:36:18.903781344Z" level=info msg="StartContainer for \"78b020e71a2408dc5ec4e2a99a8b9bcd01c97e52893f8985bb91a239067cbb8c\" returns successfully" Mar 3 13:36:19.335016 containerd[1629]: time="2026-03-03T13:36:19.334679379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjkbg,Uid:547c2410-b867-4848-8712-44c1c77daad1,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:19.359852 systemd-networkd[1492]: cali9e5a0aa92e7: Gained IPv6LL Mar 3 13:36:19.440929 kubelet[2794]: I0303 13:36:19.440022 2794 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 3 13:36:19.442920 kubelet[2794]: I0303 13:36:19.442388 2794 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 3 13:36:19.504438 systemd-networkd[1492]: cali59ecfdaed67: Link UP Mar 3 13:36:19.507954 systemd-networkd[1492]: cali59ecfdaed67: Gained carrier Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.416 [INFO][4997] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0 coredns-674b8bbfcf- kube-system 547c2410-b867-4848-8712-44c1c77daad1 820 0 2026-03-03 13:35:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s 
ci-4459-2-4-7-599052a073 coredns-674b8bbfcf-hjkbg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali59ecfdaed67 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.416 [INFO][4997] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.445 [INFO][5012] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" HandleID="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Workload="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.452 [INFO][5012] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" HandleID="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Workload="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd2d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-7-599052a073", "pod":"coredns-674b8bbfcf-hjkbg", "timestamp":"2026-03-03 13:36:19.445595998 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003771e0)} Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.452 [INFO][5012] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.452 [INFO][5012] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.452 [INFO][5012] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.460 [INFO][5012] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.469 [INFO][5012] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.478 [INFO][5012] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.481 [INFO][5012] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.484 [INFO][5012] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.484 [INFO][5012] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.485 [INFO][5012] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620 Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 
13:36:19.490 [INFO][5012] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.495 [INFO][5012] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.97.199/26] block=192.168.97.192/26 handle="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.495 [INFO][5012] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.199/26] handle="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.496 [INFO][5012] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:19.527238 containerd[1629]: 2026-03-03 13:36:19.496 [INFO][5012] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.199/26] IPv6=[] ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" HandleID="k8s-pod-network.304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Workload="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.527783 containerd[1629]: 2026-03-03 13:36:19.499 [INFO][4997] cni-plugin/k8s.go 418: Populated endpoint ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", 
UID:"547c2410-b867-4848-8712-44c1c77daad1", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"coredns-674b8bbfcf-hjkbg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali59ecfdaed67", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:19.527783 containerd[1629]: 2026-03-03 13:36:19.499 [INFO][4997] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.199/32] ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.527783 containerd[1629]: 2026-03-03 13:36:19.499 [INFO][4997] cni-plugin/dataplane_linux.go 69: Setting the 
host side veth name to cali59ecfdaed67 ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.527783 containerd[1629]: 2026-03-03 13:36:19.502 [INFO][4997] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.527783 containerd[1629]: 2026-03-03 13:36:19.502 [INFO][4997] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"547c2410-b867-4848-8712-44c1c77daad1", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", 
ContainerID:"304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620", Pod:"coredns-674b8bbfcf-hjkbg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.97.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali59ecfdaed67", MAC:"e6:ce:c9:d0:c9:60", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:19.527783 containerd[1629]: 2026-03-03 13:36:19.518 [INFO][4997] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjkbg" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-coredns--674b8bbfcf--hjkbg-eth0" Mar 3 13:36:19.556841 containerd[1629]: time="2026-03-03T13:36:19.556788657Z" level=info msg="connecting to shim 304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620" address="unix:///run/containerd/s/076314d0bc1d7b0ad4d5857041780dc59840592aab5213b1c242fe2ee34f91ef" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:19.581452 kubelet[2794]: I0303 13:36:19.581398 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-jzlgk" podStartSLOduration=20.466126425 podStartE2EDuration="32.581382526s" podCreationTimestamp="2026-03-03 13:35:47 +0000 UTC" firstStartedPulling="2026-03-03 13:36:06.569605645 +0000 UTC m=+36.332723910" lastFinishedPulling="2026-03-03 
13:36:18.684861756 +0000 UTC m=+48.447980011" observedRunningTime="2026-03-03 13:36:19.576850469 +0000 UTC m=+49.339968734" watchObservedRunningTime="2026-03-03 13:36:19.581382526 +0000 UTC m=+49.344500781" Mar 3 13:36:19.617306 systemd[1]: Started cri-containerd-304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620.scope - libcontainer container 304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620. Mar 3 13:36:19.674844 containerd[1629]: time="2026-03-03T13:36:19.674802866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjkbg,Uid:547c2410-b867-4848-8712-44c1c77daad1,Namespace:kube-system,Attempt:0,} returns sandbox id \"304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620\"" Mar 3 13:36:19.679067 systemd-networkd[1492]: cali55c404b9027: Gained IPv6LL Mar 3 13:36:19.680925 containerd[1629]: time="2026-03-03T13:36:19.680209305Z" level=info msg="CreateContainer within sandbox \"304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 3 13:36:19.688729 containerd[1629]: time="2026-03-03T13:36:19.688334838Z" level=info msg="Container f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:19.696115 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount166153296.mount: Deactivated successfully. 
Mar 3 13:36:19.701575 containerd[1629]: time="2026-03-03T13:36:19.701550110Z" level=info msg="CreateContainer within sandbox \"304961df3a0ed5d156eeaae5480b5a8f46c037212b01865fee3cb336a0225620\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3\"" Mar 3 13:36:19.702400 containerd[1629]: time="2026-03-03T13:36:19.702386931Z" level=info msg="StartContainer for \"f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3\"" Mar 3 13:36:19.703296 containerd[1629]: time="2026-03-03T13:36:19.703280823Z" level=info msg="connecting to shim f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3" address="unix:///run/containerd/s/076314d0bc1d7b0ad4d5857041780dc59840592aab5213b1c242fe2ee34f91ef" protocol=ttrpc version=3 Mar 3 13:36:19.738293 systemd[1]: Started cri-containerd-f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3.scope - libcontainer container f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3. 
Mar 3 13:36:19.771406 containerd[1629]: time="2026-03-03T13:36:19.771353182Z" level=info msg="StartContainer for \"f2b5c4a38381da100e1b770925f763b2d3be88119e3b97e325d2573186429fb3\" returns successfully" Mar 3 13:36:20.337799 containerd[1629]: time="2026-03-03T13:36:20.337697436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-4t5vt,Uid:14132d38-7264-4753-918c-53ff70118c6f,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:20.461139 systemd-networkd[1492]: cali4a1fb60f99b: Link UP Mar 3 13:36:20.461310 systemd-networkd[1492]: cali4a1fb60f99b: Gained carrier Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.382 [INFO][5118] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0 calico-apiserver-c5859fd4b- calico-system 14132d38-7264-4753-918c-53ff70118c6f 815 0 2026-03-03 13:35:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c5859fd4b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-7-599052a073 calico-apiserver-c5859fd4b-4t5vt eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali4a1fb60f99b [] [] }} ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.382 [INFO][5118] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" 
WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.419 [INFO][5130] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" HandleID="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Workload="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.425 [INFO][5130] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" HandleID="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Workload="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fb480), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-7-599052a073", "pod":"calico-apiserver-c5859fd4b-4t5vt", "timestamp":"2026-03-03 13:36:20.419795995 +0000 UTC"}, Hostname:"ci-4459-2-4-7-599052a073", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003bcdc0)} Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.425 [INFO][5130] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.425 [INFO][5130] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.425 [INFO][5130] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-7-599052a073' Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.428 [INFO][5130] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.433 [INFO][5130] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.438 [INFO][5130] ipam/ipam.go 526: Trying affinity for 192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.440 [INFO][5130] ipam/ipam.go 160: Attempting to load block cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.442 [INFO][5130] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.97.192/26 host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.442 [INFO][5130] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.97.192/26 handle="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.444 [INFO][5130] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372 Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.448 [INFO][5130] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.97.192/26 handle="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.453 [INFO][5130] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.97.200/26] block=192.168.97.192/26 handle="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.454 [INFO][5130] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.97.200/26] handle="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" host="ci-4459-2-4-7-599052a073" Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.454 [INFO][5130] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:20.479506 containerd[1629]: 2026-03-03 13:36:20.454 [INFO][5130] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.97.200/26] IPv6=[] ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" HandleID="k8s-pod-network.11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Workload="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.481129 containerd[1629]: 2026-03-03 13:36:20.457 [INFO][5118] cni-plugin/k8s.go 418: Populated endpoint ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0", GenerateName:"calico-apiserver-c5859fd4b-", Namespace:"calico-system", SelfLink:"", UID:"14132d38-7264-4753-918c-53ff70118c6f", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"c5859fd4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"", Pod:"calico-apiserver-c5859fd4b-4t5vt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4a1fb60f99b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:20.481129 containerd[1629]: 2026-03-03 13:36:20.457 [INFO][5118] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.97.200/32] ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.481129 containerd[1629]: 2026-03-03 13:36:20.457 [INFO][5118] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a1fb60f99b ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.481129 containerd[1629]: 2026-03-03 13:36:20.459 [INFO][5118] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" 
WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.481129 containerd[1629]: 2026-03-03 13:36:20.462 [INFO][5118] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0", GenerateName:"calico-apiserver-c5859fd4b-", Namespace:"calico-system", SelfLink:"", UID:"14132d38-7264-4753-918c-53ff70118c6f", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c5859fd4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-7-599052a073", ContainerID:"11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372", Pod:"calico-apiserver-c5859fd4b-4t5vt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.97.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali4a1fb60f99b", MAC:"b6:c1:68:a1:47:d7", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:20.481129 containerd[1629]: 2026-03-03 13:36:20.475 [INFO][5118] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" Namespace="calico-system" Pod="calico-apiserver-c5859fd4b-4t5vt" WorkloadEndpoint="ci--4459--2--4--7--599052a073-k8s-calico--apiserver--c5859fd4b--4t5vt-eth0" Mar 3 13:36:20.505558 containerd[1629]: time="2026-03-03T13:36:20.505470482Z" level=info msg="connecting to shim 11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372" address="unix:///run/containerd/s/14de369ef96fdd8e9d4962b5ee7339e84e81bbe6623bcf3d1245b9af514d4e9a" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:20.532075 systemd[1]: Started cri-containerd-11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372.scope - libcontainer container 11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372. 
Mar 3 13:36:20.576080 systemd-networkd[1492]: cali59ecfdaed67: Gained IPv6LL Mar 3 13:36:20.589467 kubelet[2794]: I0303 13:36:20.589334 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-hjkbg" podStartSLOduration=43.589318655 podStartE2EDuration="43.589318655s" podCreationTimestamp="2026-03-03 13:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:36:20.589060175 +0000 UTC m=+50.352178440" watchObservedRunningTime="2026-03-03 13:36:20.589318655 +0000 UTC m=+50.352436910" Mar 3 13:36:20.664297 containerd[1629]: time="2026-03-03T13:36:20.664136244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c5859fd4b-4t5vt,Uid:14132d38-7264-4753-918c-53ff70118c6f,Namespace:calico-system,Attempt:0,} returns sandbox id \"11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372\"" Mar 3 13:36:21.150363 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount901554587.mount: Deactivated successfully. 
Mar 3 13:36:21.163620 containerd[1629]: time="2026-03-03T13:36:21.163564373Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:21.164560 containerd[1629]: time="2026-03-03T13:36:21.164413614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 3 13:36:21.165714 containerd[1629]: time="2026-03-03T13:36:21.165696426Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:21.167209 containerd[1629]: time="2026-03-03T13:36:21.167186149Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:21.167727 containerd[1629]: time="2026-03-03T13:36:21.167659809Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.481981532s" Mar 3 13:36:21.167827 containerd[1629]: time="2026-03-03T13:36:21.167790039Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 3 13:36:21.169332 containerd[1629]: time="2026-03-03T13:36:21.169306651Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 3 13:36:21.171922 containerd[1629]: time="2026-03-03T13:36:21.171108415Z" level=info msg="CreateContainer within sandbox 
\"0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 3 13:36:21.176942 containerd[1629]: time="2026-03-03T13:36:21.175526221Z" level=info msg="Container a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:21.194663 containerd[1629]: time="2026-03-03T13:36:21.194619121Z" level=info msg="CreateContainer within sandbox \"0fbd0b2173b6d7bdedcba47212e3908bb078ce0d249318e8937d4af6fb7fa571\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e\"" Mar 3 13:36:21.195197 containerd[1629]: time="2026-03-03T13:36:21.195180663Z" level=info msg="StartContainer for \"a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e\"" Mar 3 13:36:21.196337 containerd[1629]: time="2026-03-03T13:36:21.195936644Z" level=info msg="connecting to shim a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e" address="unix:///run/containerd/s/a0f6076acfdbb03457724678adb7f9fadaea36d9456d0bfa35139f71f1db118b" protocol=ttrpc version=3 Mar 3 13:36:21.215011 systemd[1]: Started cri-containerd-a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e.scope - libcontainer container a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e. 
Mar 3 13:36:21.256163 containerd[1629]: time="2026-03-03T13:36:21.256129628Z" level=info msg="StartContainer for \"a146e6644f9310d1d7bb075e08ed34d3fa134b12c55b00b6ff2cfb26a298590e\" returns successfully" Mar 3 13:36:21.791256 systemd-networkd[1492]: cali4a1fb60f99b: Gained IPv6LL Mar 3 13:36:25.192095 containerd[1629]: time="2026-03-03T13:36:25.192044834Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:25.193091 containerd[1629]: time="2026-03-03T13:36:25.192935526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 3 13:36:25.193707 containerd[1629]: time="2026-03-03T13:36:25.193686876Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:25.195198 containerd[1629]: time="2026-03-03T13:36:25.195175618Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:25.195625 containerd[1629]: time="2026-03-03T13:36:25.195607250Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 4.026276749s" Mar 3 13:36:25.195688 containerd[1629]: time="2026-03-03T13:36:25.195677680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 3 
13:36:25.196960 containerd[1629]: time="2026-03-03T13:36:25.196942812Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 3 13:36:25.204441 containerd[1629]: time="2026-03-03T13:36:25.203554431Z" level=info msg="CreateContainer within sandbox \"8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 3 13:36:25.211063 containerd[1629]: time="2026-03-03T13:36:25.211043132Z" level=info msg="Container 1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:25.226160 containerd[1629]: time="2026-03-03T13:36:25.226116595Z" level=info msg="CreateContainer within sandbox \"8c273c9bafed25f0ec4069ad22d5fe5f529092b5bd8f5b78a25ba937c9d2a278\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f\"" Mar 3 13:36:25.226692 containerd[1629]: time="2026-03-03T13:36:25.226616016Z" level=info msg="StartContainer for \"1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f\"" Mar 3 13:36:25.227832 containerd[1629]: time="2026-03-03T13:36:25.227760067Z" level=info msg="connecting to shim 1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f" address="unix:///run/containerd/s/a99d924b260c1a19c144bf575cb18f8dac15e289917a2de1e57ef30bad1c05c0" protocol=ttrpc version=3 Mar 3 13:36:25.247053 systemd[1]: Started cri-containerd-1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f.scope - libcontainer container 1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f. 
Mar 3 13:36:25.302166 containerd[1629]: time="2026-03-03T13:36:25.302106649Z" level=info msg="StartContainer for \"1a97205b144df52182ab4a4bb9195d558e6733c4c8e3be31e10543c647b4ed4f\" returns successfully" Mar 3 13:36:25.612298 kubelet[2794]: I0303 13:36:25.612242 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5f4897f75-tx9xp" podStartSLOduration=5.586647886 podStartE2EDuration="18.612227462s" podCreationTimestamp="2026-03-03 13:36:07 +0000 UTC" firstStartedPulling="2026-03-03 13:36:08.142862774 +0000 UTC m=+37.905981029" lastFinishedPulling="2026-03-03 13:36:21.16844235 +0000 UTC m=+50.931560605" observedRunningTime="2026-03-03 13:36:21.606187805 +0000 UTC m=+51.369306070" watchObservedRunningTime="2026-03-03 13:36:25.612227462 +0000 UTC m=+55.375345717" Mar 3 13:36:25.660502 kubelet[2794]: I0303 13:36:25.660437 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8ccd5dff9-mk9lp" podStartSLOduration=31.227086337 podStartE2EDuration="38.660322614s" podCreationTimestamp="2026-03-03 13:35:47 +0000 UTC" firstStartedPulling="2026-03-03 13:36:17.762997864 +0000 UTC m=+47.526116129" lastFinishedPulling="2026-03-03 13:36:25.196234141 +0000 UTC m=+54.959352406" observedRunningTime="2026-03-03 13:36:25.613220254 +0000 UTC m=+55.376338519" watchObservedRunningTime="2026-03-03 13:36:25.660322614 +0000 UTC m=+55.423440879" Mar 3 13:36:28.593483 containerd[1629]: time="2026-03-03T13:36:28.593440106Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:28.594563 containerd[1629]: time="2026-03-03T13:36:28.594537947Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 3 13:36:28.595615 containerd[1629]: time="2026-03-03T13:36:28.595569889Z" level=info msg="ImageCreate event 
name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:28.597706 containerd[1629]: time="2026-03-03T13:36:28.597670792Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:28.598313 containerd[1629]: time="2026-03-03T13:36:28.598285853Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.401321821s" Mar 3 13:36:28.598376 containerd[1629]: time="2026-03-03T13:36:28.598317153Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 3 13:36:28.600308 containerd[1629]: time="2026-03-03T13:36:28.600275705Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 3 13:36:28.602855 containerd[1629]: time="2026-03-03T13:36:28.602834330Z" level=info msg="CreateContainer within sandbox \"13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 3 13:36:28.611804 containerd[1629]: time="2026-03-03T13:36:28.611349462Z" level=info msg="Container 93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:28.615088 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1864851997.mount: Deactivated successfully. 
Mar 3 13:36:28.619893 containerd[1629]: time="2026-03-03T13:36:28.619864224Z" level=info msg="CreateContainer within sandbox \"13aa733ab3ba34032b3066bd6e99394419033262e49d1df119ccd9578e229afb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e\"" Mar 3 13:36:28.620492 containerd[1629]: time="2026-03-03T13:36:28.620468245Z" level=info msg="StartContainer for \"93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e\"" Mar 3 13:36:28.621614 containerd[1629]: time="2026-03-03T13:36:28.621571937Z" level=info msg="connecting to shim 93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e" address="unix:///run/containerd/s/a6dcf9d13db60adc3815b1a2b4ca832f4f3c68358de6f8fc99cc041fe40eff22" protocol=ttrpc version=3 Mar 3 13:36:28.650250 systemd[1]: Started cri-containerd-93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e.scope - libcontainer container 93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e. 
Mar 3 13:36:28.703809 containerd[1629]: time="2026-03-03T13:36:28.703779996Z" level=info msg="StartContainer for \"93742775fc1cceac0f04034c29fad4b22885b6a3900b2085187128ab2414fe7e\" returns successfully" Mar 3 13:36:29.134867 containerd[1629]: time="2026-03-03T13:36:29.133930108Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:29.137779 containerd[1629]: time="2026-03-03T13:36:29.137739204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 3 13:36:29.144933 containerd[1629]: time="2026-03-03T13:36:29.143868013Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 543.059547ms" Mar 3 13:36:29.145124 containerd[1629]: time="2026-03-03T13:36:29.145099404Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 3 13:36:29.149434 containerd[1629]: time="2026-03-03T13:36:29.149406401Z" level=info msg="CreateContainer within sandbox \"11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 3 13:36:29.162507 containerd[1629]: time="2026-03-03T13:36:29.162475079Z" level=info msg="Container 6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:29.173196 containerd[1629]: time="2026-03-03T13:36:29.173009294Z" level=info msg="CreateContainer within sandbox \"11833cc6b303fe7a85592dbdc78f7d2d7f02c026acdcc65f95264333b324a372\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9\"" Mar 3 13:36:29.173608 containerd[1629]: time="2026-03-03T13:36:29.173591766Z" level=info msg="StartContainer for \"6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9\"" Mar 3 13:36:29.175328 containerd[1629]: time="2026-03-03T13:36:29.175307498Z" level=info msg="connecting to shim 6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9" address="unix:///run/containerd/s/14de369ef96fdd8e9d4962b5ee7339e84e81bbe6623bcf3d1245b9af514d4e9a" protocol=ttrpc version=3 Mar 3 13:36:29.197016 systemd[1]: Started cri-containerd-6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9.scope - libcontainer container 6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9. Mar 3 13:36:29.240526 containerd[1629]: time="2026-03-03T13:36:29.240496332Z" level=info msg="StartContainer for \"6116a24ae48a8de8866561a32a73d605d2a2c6e1f1105897052d3958a5e2b7f9\" returns successfully" Mar 3 13:36:29.624040 kubelet[2794]: I0303 13:36:29.623891 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-c5859fd4b-4t5vt" podStartSLOduration=35.143251103 podStartE2EDuration="43.623877093s" podCreationTimestamp="2026-03-03 13:35:46 +0000 UTC" firstStartedPulling="2026-03-03 13:36:20.666008766 +0000 UTC m=+50.429127021" lastFinishedPulling="2026-03-03 13:36:29.146634746 +0000 UTC m=+58.909753011" observedRunningTime="2026-03-03 13:36:29.623110482 +0000 UTC m=+59.386228737" watchObservedRunningTime="2026-03-03 13:36:29.623877093 +0000 UTC m=+59.386995358" Mar 3 13:36:29.639206 kubelet[2794]: I0303 13:36:29.639162 2794 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-c5859fd4b-l4vrj" podStartSLOduration=32.875426455 podStartE2EDuration="43.639148705s" podCreationTimestamp="2026-03-03 13:35:46 +0000 UTC" 
firstStartedPulling="2026-03-03 13:36:17.835899325 +0000 UTC m=+47.599017580" lastFinishedPulling="2026-03-03 13:36:28.599621565 +0000 UTC m=+58.362739830" observedRunningTime="2026-03-03 13:36:29.638729334 +0000 UTC m=+59.401847589" watchObservedRunningTime="2026-03-03 13:36:29.639148705 +0000 UTC m=+59.402266960" Mar 3 13:36:30.621895 kubelet[2794]: I0303 13:36:30.621792 2794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:51.461749 kubelet[2794]: I0303 13:36:51.461117 2794 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:37:17.069624 systemd[1]: Started sshd@7-95.217.157.231:22-20.161.92.111:43212.service - OpenSSH per-connection server daemon (20.161.92.111:43212). Mar 3 13:37:17.717940 sshd[5633]: Accepted publickey for core from 20.161.92.111 port 43212 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:17.719353 sshd-session[5633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:17.724492 systemd-logind[1601]: New session 8 of user core. Mar 3 13:37:17.728037 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 3 13:37:18.172043 sshd[5636]: Connection closed by 20.161.92.111 port 43212 Mar 3 13:37:18.173159 sshd-session[5633]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:18.181691 systemd-logind[1601]: Session 8 logged out. Waiting for processes to exit. Mar 3 13:37:18.183128 systemd[1]: sshd@7-95.217.157.231:22-20.161.92.111:43212.service: Deactivated successfully. Mar 3 13:37:18.191675 systemd[1]: session-8.scope: Deactivated successfully. Mar 3 13:37:18.197709 systemd-logind[1601]: Removed session 8. Mar 3 13:37:23.311182 systemd[1]: Started sshd@8-95.217.157.231:22-20.161.92.111:52222.service - OpenSSH per-connection server daemon (20.161.92.111:52222). 
Mar 3 13:37:23.961483 sshd[5649]: Accepted publickey for core from 20.161.92.111 port 52222 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:23.962777 sshd-session[5649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:23.967398 systemd-logind[1601]: New session 9 of user core. Mar 3 13:37:23.972020 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 3 13:37:24.413704 sshd[5652]: Connection closed by 20.161.92.111 port 52222 Mar 3 13:37:24.415005 sshd-session[5649]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:24.424062 systemd[1]: sshd@8-95.217.157.231:22-20.161.92.111:52222.service: Deactivated successfully. Mar 3 13:37:24.429487 systemd[1]: session-9.scope: Deactivated successfully. Mar 3 13:37:24.432753 systemd-logind[1601]: Session 9 logged out. Waiting for processes to exit. Mar 3 13:37:24.436212 systemd-logind[1601]: Removed session 9. Mar 3 13:37:29.549966 systemd[1]: Started sshd@9-95.217.157.231:22-20.161.92.111:52236.service - OpenSSH per-connection server daemon (20.161.92.111:52236). Mar 3 13:37:30.215232 sshd[5707]: Accepted publickey for core from 20.161.92.111 port 52236 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:30.217574 sshd-session[5707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:30.225058 systemd-logind[1601]: New session 10 of user core. Mar 3 13:37:30.228066 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 3 13:37:30.677876 sshd[5710]: Connection closed by 20.161.92.111 port 52236 Mar 3 13:37:30.680559 sshd-session[5707]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:30.683686 systemd[1]: sshd@9-95.217.157.231:22-20.161.92.111:52236.service: Deactivated successfully. Mar 3 13:37:30.687655 systemd[1]: session-10.scope: Deactivated successfully. Mar 3 13:37:30.689921 systemd-logind[1601]: Session 10 logged out. 
Waiting for processes to exit. Mar 3 13:37:30.691066 systemd-logind[1601]: Removed session 10. Mar 3 13:37:30.808631 systemd[1]: Started sshd@10-95.217.157.231:22-20.161.92.111:44600.service - OpenSSH per-connection server daemon (20.161.92.111:44600). Mar 3 13:37:31.454825 sshd[5741]: Accepted publickey for core from 20.161.92.111 port 44600 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:31.458688 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:31.467258 systemd-logind[1601]: New session 11 of user core. Mar 3 13:37:31.483227 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 3 13:37:31.939303 sshd[5744]: Connection closed by 20.161.92.111 port 44600 Mar 3 13:37:31.941032 sshd-session[5741]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:31.945393 systemd[1]: sshd@10-95.217.157.231:22-20.161.92.111:44600.service: Deactivated successfully. Mar 3 13:37:31.948725 systemd[1]: session-11.scope: Deactivated successfully. Mar 3 13:37:31.950891 systemd-logind[1601]: Session 11 logged out. Waiting for processes to exit. Mar 3 13:37:31.953403 systemd-logind[1601]: Removed session 11. Mar 3 13:37:32.074291 systemd[1]: Started sshd@11-95.217.157.231:22-20.161.92.111:44610.service - OpenSSH per-connection server daemon (20.161.92.111:44610). Mar 3 13:37:32.736843 sshd[5753]: Accepted publickey for core from 20.161.92.111 port 44610 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:32.740389 sshd-session[5753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:32.749646 systemd-logind[1601]: New session 12 of user core. Mar 3 13:37:32.761026 systemd[1]: Started session-12.scope - Session 12 of User core. 
Mar 3 13:37:33.207104 sshd[5766]: Connection closed by 20.161.92.111 port 44610 Mar 3 13:37:33.208628 sshd-session[5753]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:33.216773 systemd[1]: sshd@11-95.217.157.231:22-20.161.92.111:44610.service: Deactivated successfully. Mar 3 13:37:33.217130 systemd-logind[1601]: Session 12 logged out. Waiting for processes to exit. Mar 3 13:37:33.221031 systemd[1]: session-12.scope: Deactivated successfully. Mar 3 13:37:33.224607 systemd-logind[1601]: Removed session 12. Mar 3 13:37:38.345263 systemd[1]: Started sshd@12-95.217.157.231:22-20.161.92.111:44622.service - OpenSSH per-connection server daemon (20.161.92.111:44622). Mar 3 13:37:39.014133 sshd[5782]: Accepted publickey for core from 20.161.92.111 port 44622 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:39.016024 sshd-session[5782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:39.022832 systemd-logind[1601]: New session 13 of user core. Mar 3 13:37:39.028219 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 3 13:37:39.483742 sshd[5785]: Connection closed by 20.161.92.111 port 44622 Mar 3 13:37:39.485258 sshd-session[5782]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:39.489246 systemd-logind[1601]: Session 13 logged out. Waiting for processes to exit. Mar 3 13:37:39.490283 systemd[1]: sshd@12-95.217.157.231:22-20.161.92.111:44622.service: Deactivated successfully. Mar 3 13:37:39.492834 systemd[1]: session-13.scope: Deactivated successfully. Mar 3 13:37:39.495132 systemd-logind[1601]: Removed session 13. Mar 3 13:37:39.618579 systemd[1]: Started sshd@13-95.217.157.231:22-20.161.92.111:44638.service - OpenSSH per-connection server daemon (20.161.92.111:44638). 
Mar 3 13:37:40.261828 sshd[5797]: Accepted publickey for core from 20.161.92.111 port 44638 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:40.263868 sshd-session[5797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:40.270525 systemd-logind[1601]: New session 14 of user core. Mar 3 13:37:40.282227 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 3 13:37:40.928080 sshd[5800]: Connection closed by 20.161.92.111 port 44638 Mar 3 13:37:40.929957 sshd-session[5797]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:40.934980 systemd-logind[1601]: Session 14 logged out. Waiting for processes to exit. Mar 3 13:37:40.936154 systemd[1]: sshd@13-95.217.157.231:22-20.161.92.111:44638.service: Deactivated successfully. Mar 3 13:37:40.938747 systemd[1]: session-14.scope: Deactivated successfully. Mar 3 13:37:40.940862 systemd-logind[1601]: Removed session 14. Mar 3 13:37:41.070347 systemd[1]: Started sshd@14-95.217.157.231:22-20.161.92.111:57534.service - OpenSSH per-connection server daemon (20.161.92.111:57534). Mar 3 13:37:41.730480 sshd[5816]: Accepted publickey for core from 20.161.92.111 port 57534 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:41.733170 sshd-session[5816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:41.743009 systemd-logind[1601]: New session 15 of user core. Mar 3 13:37:41.749248 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 3 13:37:42.792204 sshd[5819]: Connection closed by 20.161.92.111 port 57534 Mar 3 13:37:42.794207 sshd-session[5816]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:42.808545 systemd-logind[1601]: Session 15 logged out. Waiting for processes to exit. Mar 3 13:37:42.809848 systemd[1]: sshd@14-95.217.157.231:22-20.161.92.111:57534.service: Deactivated successfully. 
Mar 3 13:37:42.813445 systemd[1]: session-15.scope: Deactivated successfully. Mar 3 13:37:42.815774 systemd-logind[1601]: Removed session 15. Mar 3 13:37:42.934518 systemd[1]: Started sshd@15-95.217.157.231:22-20.161.92.111:57542.service - OpenSSH per-connection server daemon (20.161.92.111:57542). Mar 3 13:37:43.623042 sshd[5856]: Accepted publickey for core from 20.161.92.111 port 57542 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:43.625399 sshd-session[5856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:43.634287 systemd-logind[1601]: New session 16 of user core. Mar 3 13:37:43.642251 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 3 13:37:44.163082 sshd[5859]: Connection closed by 20.161.92.111 port 57542 Mar 3 13:37:44.165313 sshd-session[5856]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:44.175087 systemd[1]: sshd@15-95.217.157.231:22-20.161.92.111:57542.service: Deactivated successfully. Mar 3 13:37:44.180840 systemd[1]: session-16.scope: Deactivated successfully. Mar 3 13:37:44.183623 systemd-logind[1601]: Session 16 logged out. Waiting for processes to exit. Mar 3 13:37:44.187296 systemd-logind[1601]: Removed session 16. Mar 3 13:37:44.306777 systemd[1]: Started sshd@16-95.217.157.231:22-20.161.92.111:57554.service - OpenSSH per-connection server daemon (20.161.92.111:57554). Mar 3 13:37:44.957976 sshd[5869]: Accepted publickey for core from 20.161.92.111 port 57554 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:44.959953 sshd-session[5869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:44.966031 systemd-logind[1601]: New session 17 of user core. Mar 3 13:37:44.971070 systemd[1]: Started session-17.scope - Session 17 of User core. 
Mar 3 13:37:45.392471 sshd[5920]: Connection closed by 20.161.92.111 port 57554 Mar 3 13:37:45.393163 sshd-session[5869]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:45.399466 systemd[1]: sshd@16-95.217.157.231:22-20.161.92.111:57554.service: Deactivated successfully. Mar 3 13:37:45.400251 systemd-logind[1601]: Session 17 logged out. Waiting for processes to exit. Mar 3 13:37:45.403308 systemd[1]: session-17.scope: Deactivated successfully. Mar 3 13:37:45.407400 systemd-logind[1601]: Removed session 17. Mar 3 13:37:50.522247 systemd[1]: Started sshd@17-95.217.157.231:22-20.161.92.111:39410.service - OpenSSH per-connection server daemon (20.161.92.111:39410). Mar 3 13:37:51.170592 sshd[5957]: Accepted publickey for core from 20.161.92.111 port 39410 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:51.173089 sshd-session[5957]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:51.182220 systemd-logind[1601]: New session 18 of user core. Mar 3 13:37:51.189507 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 3 13:37:51.625173 sshd[5960]: Connection closed by 20.161.92.111 port 39410 Mar 3 13:37:51.626298 sshd-session[5957]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:51.631663 systemd[1]: sshd@17-95.217.157.231:22-20.161.92.111:39410.service: Deactivated successfully. Mar 3 13:37:51.635000 systemd[1]: session-18.scope: Deactivated successfully. Mar 3 13:37:51.637169 systemd-logind[1601]: Session 18 logged out. Waiting for processes to exit. Mar 3 13:37:51.640118 systemd-logind[1601]: Removed session 18. Mar 3 13:37:56.761492 systemd[1]: Started sshd@18-95.217.157.231:22-20.161.92.111:39414.service - OpenSSH per-connection server daemon (20.161.92.111:39414). 
Mar 3 13:37:57.394269 sshd[5994]: Accepted publickey for core from 20.161.92.111 port 39414 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:57.395820 sshd-session[5994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:57.401002 systemd-logind[1601]: New session 19 of user core. Mar 3 13:37:57.405067 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 3 13:37:57.866209 sshd[5997]: Connection closed by 20.161.92.111 port 39414 Mar 3 13:37:57.866793 sshd-session[5994]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:57.870861 systemd-logind[1601]: Session 19 logged out. Waiting for processes to exit. Mar 3 13:37:57.871785 systemd[1]: sshd@18-95.217.157.231:22-20.161.92.111:39414.service: Deactivated successfully. Mar 3 13:37:57.874089 systemd[1]: session-19.scope: Deactivated successfully. Mar 3 13:37:57.875195 systemd-logind[1601]: Removed session 19. Mar 3 13:38:47.779216 systemd[1]: cri-containerd-64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386.scope: Deactivated successfully. Mar 3 13:38:47.780185 systemd[1]: cri-containerd-64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386.scope: Consumed 3.668s CPU time, 62.8M memory peak, 64K read from disk. Mar 3 13:38:47.785151 containerd[1629]: time="2026-03-03T13:38:47.785075479Z" level=info msg="received container exit event container_id:\"64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386\" id:\"64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386\" pid:2647 exit_status:1 exited_at:{seconds:1772545127 nanos:784617849}" Mar 3 13:38:47.821320 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386-rootfs.mount: Deactivated successfully. 
Mar 3 13:38:47.953763 kubelet[2794]: I0303 13:38:47.953696 2794 scope.go:117] "RemoveContainer" containerID="64e1e855e086e4e1da11c3a82238ec1c412598b8c5a8af91f669b5f89a18f386" Mar 3 13:38:47.956606 containerd[1629]: time="2026-03-03T13:38:47.956532782Z" level=info msg="CreateContainer within sandbox \"be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Mar 3 13:38:47.967973 containerd[1629]: time="2026-03-03T13:38:47.967799624Z" level=info msg="Container ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:38:47.982110 containerd[1629]: time="2026-03-03T13:38:47.982059144Z" level=info msg="CreateContainer within sandbox \"be840f150a21ea77628b92e0279f29c482b42fad7dcfd2043c22b97aca0ea243\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628\"" Mar 3 13:38:47.982957 containerd[1629]: time="2026-03-03T13:38:47.982629413Z" level=info msg="StartContainer for \"ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628\"" Mar 3 13:38:47.984508 containerd[1629]: time="2026-03-03T13:38:47.984467052Z" level=info msg="connecting to shim ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628" address="unix:///run/containerd/s/dd938fddc3d70fb126e1c39df700f62eb4490e05822b84d5290c03c066adc88c" protocol=ttrpc version=3 Mar 3 13:38:48.013060 systemd[1]: Started cri-containerd-ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628.scope - libcontainer container ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628. 
Mar 3 13:38:48.066994 containerd[1629]: time="2026-03-03T13:38:48.066731002Z" level=info msg="StartContainer for \"ff82a6e76989cbcd538e7682d2ff09afe1f11aaa8e89e917c248489efbbc1628\" returns successfully" Mar 3 13:38:48.230255 kubelet[2794]: E0303 13:38:48.230090 2794 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:42422->10.0.0.2:2379: read: connection timed out" Mar 3 13:38:48.255086 systemd[1]: cri-containerd-0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484.scope: Deactivated successfully. Mar 3 13:38:48.255398 systemd[1]: cri-containerd-0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484.scope: Consumed 8.890s CPU time, 146.5M memory peak, 880K read from disk. Mar 3 13:38:48.257761 containerd[1629]: time="2026-03-03T13:38:48.257740324Z" level=info msg="received container exit event container_id:\"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\" id:\"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\" pid:3119 exit_status:1 exited_at:{seconds:1772545128 nanos:256448734}" Mar 3 13:38:48.279716 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484-rootfs.mount: Deactivated successfully. 
Mar 3 13:38:48.957559 kubelet[2794]: I0303 13:38:48.957501 2794 scope.go:117] "RemoveContainer" containerID="0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484" Mar 3 13:38:48.959876 containerd[1629]: time="2026-03-03T13:38:48.959830734Z" level=info msg="CreateContainer within sandbox \"bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Mar 3 13:38:48.970480 containerd[1629]: time="2026-03-03T13:38:48.970443387Z" level=info msg="Container f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:38:48.975449 containerd[1629]: time="2026-03-03T13:38:48.975397464Z" level=info msg="CreateContainer within sandbox \"bbdf6954b98a2719e6bc610b7681a1ea78d90a97b8c06c6a4aae3d3aa6f0f686\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309\"" Mar 3 13:38:48.975782 containerd[1629]: time="2026-03-03T13:38:48.975770333Z" level=info msg="StartContainer for \"f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309\"" Mar 3 13:38:48.976699 containerd[1629]: time="2026-03-03T13:38:48.976615562Z" level=info msg="connecting to shim f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309" address="unix:///run/containerd/s/621d83b8f69a5b0bdc3c06d9def17c77797ffdb5b9804d0aae16f089ad672871" protocol=ttrpc version=3 Mar 3 13:38:48.978200 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1391933380.mount: Deactivated successfully. Mar 3 13:38:49.000005 systemd[1]: Started cri-containerd-f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309.scope - libcontainer container f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309. 
Mar 3 13:38:49.034305 containerd[1629]: time="2026-03-03T13:38:49.034260341Z" level=info msg="StartContainer for \"f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309\" returns successfully"
Mar 3 13:38:51.619640 kubelet[2794]: E0303 13:38:51.617147 2794 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:42234->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-7-599052a073.189958657602cde5 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-7-599052a073,UID:5fb806e12fed151e952edd9e40abf303,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-7-599052a073,},FirstTimestamp:2026-03-03 13:38:41.141837285 +0000 UTC m=+190.904955570,LastTimestamp:2026-03-03 13:38:41.141837285 +0000 UTC m=+190.904955570,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-7-599052a073,}"
Mar 3 13:38:53.153065 kubelet[2794]: I0303 13:38:53.153016 2794 status_manager.go:895] "Failed to get status for pod" podUID="0580c5d8bfde85509b022740c9e9a5b4" pod="kube-system/kube-controller-manager-ci-4459-2-4-7-599052a073" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:42334->10.0.0.2:2379: read: connection timed out"
Mar 3 13:38:53.526705 systemd[1]: cri-containerd-3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d.scope: Deactivated successfully.
Mar 3 13:38:53.527272 systemd[1]: cri-containerd-3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d.scope: Consumed 1.842s CPU time, 21.8M memory peak.
Mar 3 13:38:53.531345 containerd[1629]: time="2026-03-03T13:38:53.531174059Z" level=info msg="received container exit event container_id:\"3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d\" id:\"3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d\" pid:2622 exit_status:1 exited_at:{seconds:1772545133 nanos:530022060}"
Mar 3 13:38:53.576742 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d-rootfs.mount: Deactivated successfully.
Mar 3 13:38:53.978635 kubelet[2794]: I0303 13:38:53.978313 2794 scope.go:117] "RemoveContainer" containerID="3827b2d26b9e0685086a76f7c52c617a015be952bdb07c65ea6913172d57817d"
Mar 3 13:38:53.981181 containerd[1629]: time="2026-03-03T13:38:53.981142055Z" level=info msg="CreateContainer within sandbox \"738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 3 13:38:53.993044 containerd[1629]: time="2026-03-03T13:38:53.992977466Z" level=info msg="Container 690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:38:54.005694 containerd[1629]: time="2026-03-03T13:38:54.005582928Z" level=info msg="CreateContainer within sandbox \"738790978ae1921e080eb5c0d33effbd6cb435b3c9cf654db91b36bdc90da453\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba\""
Mar 3 13:38:54.006669 containerd[1629]: time="2026-03-03T13:38:54.006621317Z" level=info msg="StartContainer for \"690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba\""
Mar 3 13:38:54.008200 containerd[1629]: time="2026-03-03T13:38:54.008147066Z" level=info msg="connecting to shim 690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba" address="unix:///run/containerd/s/1a467ddfdd99ae933ead0d2a405f197da539f00084e93458368dddb3af09fad4" protocol=ttrpc version=3
Mar 3 13:38:54.037002 systemd[1]: Started cri-containerd-690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba.scope - libcontainer container 690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba.
Mar 3 13:38:54.091401 containerd[1629]: time="2026-03-03T13:38:54.091368650Z" level=info msg="StartContainer for \"690ced7c647d31fcb4e7080191470ae3a0bec4aaa32bd3a4ee848e542bfa60ba\" returns successfully"
Mar 3 13:38:58.230998 kubelet[2794]: E0303 13:38:58.230917 2794 controller.go:195] "Failed to update lease" err="Put \"https://95.217.157.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-7-599052a073?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 3 13:39:00.215004 systemd[1]: cri-containerd-f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309.scope: Deactivated successfully.
Mar 3 13:39:00.215680 containerd[1629]: time="2026-03-03T13:39:00.215067350Z" level=info msg="received container exit event container_id:\"f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309\" id:\"f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309\" pid:6255 exit_status:1 exited_at:{seconds:1772545140 nanos:214771789}"
Mar 3 13:39:00.215622 systemd[1]: cri-containerd-f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309.scope: Consumed 207ms CPU time, 38.6M memory peak, 1.3M read from disk.
Mar 3 13:39:00.252610 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309-rootfs.mount: Deactivated successfully.
Mar 3 13:39:01.000865 kubelet[2794]: I0303 13:39:01.000837 2794 scope.go:117] "RemoveContainer" containerID="0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484"
Mar 3 13:39:01.001593 kubelet[2794]: I0303 13:39:01.001413 2794 scope.go:117] "RemoveContainer" containerID="f738ae28cc4fac2c2a01da0b410bee7013d0a1d5d849aaf5e4edc81bf0b69309"
Mar 3 13:39:01.001593 kubelet[2794]: E0303 13:39:01.001541 2794 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6bf85f8dd-vgkz5_tigera-operator(d3509e65-77d3-4f0e-9d9a-8b47fdce9585)\"" pod="tigera-operator/tigera-operator-6bf85f8dd-vgkz5" podUID="d3509e65-77d3-4f0e-9d9a-8b47fdce9585"
Mar 3 13:39:01.003085 containerd[1629]: time="2026-03-03T13:39:01.003015526Z" level=info msg="RemoveContainer for \"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\""
Mar 3 13:39:01.010124 containerd[1629]: time="2026-03-03T13:39:01.010086314Z" level=info msg="RemoveContainer for \"0b1e4b35a8df3a681fca107deb7daa6a4a23ff0003598dd95357954b80fb9484\" returns successfully"
Mar 3 13:39:08.232442 kubelet[2794]: E0303 13:39:08.232164 2794 controller.go:195] "Failed to update lease" err="Put \"https://95.217.157.231:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-7-599052a073?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"