Mar 13 00:33:18.957008 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 12 22:08:29 -00 2026
Mar 13 00:33:18.957031 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:33:18.957039 kernel: BIOS-provided physical RAM map:
Mar 13 00:33:18.957045 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:33:18.957053 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 13 00:33:18.957058 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 13 00:33:18.957064 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 13 00:33:18.957069 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:33:18.957074 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:33:18.957079 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:33:18.957085 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 13 00:33:18.957090 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 13 00:33:18.957095 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:33:18.957102 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:33:18.957108 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:33:18.957114 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 13 00:33:18.957119 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 13 00:33:18.957125 kernel: NX (Execute Disable) protection: active
Mar 13 00:33:18.957132 kernel: APIC: Static calls initialized
Mar 13 00:33:18.957138 kernel: e820: update [mem 0x7dfae018-0x7dfb7a57] usable ==> usable
Mar 13 00:33:18.957143 kernel: e820: update [mem 0x7df72018-0x7dfad657] usable ==> usable
Mar 13 00:33:18.957149 kernel: e820: update [mem 0x7df36018-0x7df71657] usable ==> usable
Mar 13 00:33:18.957154 kernel: extended physical RAM map:
Mar 13 00:33:18.957159 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:33:18.957165 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df36017] usable
Mar 13 00:33:18.957170 kernel: reserve setup_data: [mem 0x000000007df36018-0x000000007df71657] usable
Mar 13 00:33:18.957175 kernel: reserve setup_data: [mem 0x000000007df71658-0x000000007df72017] usable
Mar 13 00:33:18.957181 kernel: reserve setup_data: [mem 0x000000007df72018-0x000000007dfad657] usable
Mar 13 00:33:18.957186 kernel: reserve setup_data: [mem 0x000000007dfad658-0x000000007dfae017] usable
Mar 13 00:33:18.957194 kernel: reserve setup_data: [mem 0x000000007dfae018-0x000000007dfb7a57] usable
Mar 13 00:33:18.957199 kernel: reserve setup_data: [mem 0x000000007dfb7a58-0x000000007ed3efff] usable
Mar 13 00:33:18.957204 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 13 00:33:18.957210 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 13 00:33:18.957215 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:33:18.957235 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:33:18.957240 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:33:18.957246 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 13 00:33:18.957251 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 13 00:33:18.957256 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:33:18.957262 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:33:18.957273 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:33:18.957279 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 13 00:33:18.957284 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 13 00:33:18.957290 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 13 00:33:18.957296 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e845198 RNG=0x7fb73018
Mar 13 00:33:18.957304 kernel: random: crng init done
Mar 13 00:33:18.957310 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 13 00:33:18.957316 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 13 00:33:18.957321 kernel: secureboot: Secure boot disabled
Mar 13 00:33:18.957327 kernel: SMBIOS 3.0.0 present.
Mar 13 00:33:18.957332 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 13 00:33:18.957338 kernel: DMI: Memory slots populated: 1/1
Mar 13 00:33:18.957343 kernel: Hypervisor detected: KVM
Mar 13 00:33:18.957349 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 13 00:33:18.957354 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 13 00:33:18.957360 kernel: kvm-clock: using sched offset of 14148697815 cycles
Mar 13 00:33:18.957368 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 13 00:33:18.957374 kernel: tsc: Detected 2399.998 MHz processor
Mar 13 00:33:18.957380 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 13 00:33:18.957386 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 13 00:33:18.957392 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 13 00:33:18.957398 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 13 00:33:18.957403 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 13 00:33:18.957409 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 13 00:33:18.957415 kernel: Using GB pages for direct mapping
Mar 13 00:33:18.957423 kernel: ACPI: Early table checksum verification disabled
Mar 13 00:33:18.957429 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 13 00:33:18.957434 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 13 00:33:18.957440 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.957446 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.957452 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 13 00:33:18.957458 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.957463 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.957469 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.957477 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.957483 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 13 00:33:18.957489 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 13 00:33:18.957495 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 13 00:33:18.957500 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 13 00:33:18.957506 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 13 00:33:18.957512 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 13 00:33:18.957517 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 13 00:33:18.957523 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 13 00:33:18.957531 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 13 00:33:18.957537 kernel: No NUMA configuration found
Mar 13 00:33:18.957543 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 13 00:33:18.957549 kernel: NODE_DATA(0) allocated [mem 0x179ff6dc0-0x179ffdfff]
Mar 13 00:33:18.957554 kernel: Zone ranges:
Mar 13 00:33:18.957560 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 13 00:33:18.957566 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 13 00:33:18.957571 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Mar 13 00:33:18.957577 kernel: Device empty
Mar 13 00:33:18.957585 kernel: Movable zone start for each node
Mar 13 00:33:18.957591 kernel: Early memory node ranges
Mar 13 00:33:18.957597 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 13 00:33:18.957602 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 13 00:33:18.957608 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 13 00:33:18.957614 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 13 00:33:18.957620 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 13 00:33:18.957626 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 13 00:33:18.957632 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 13 00:33:18.957638 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 13 00:33:18.957646 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 13 00:33:18.957651 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 13 00:33:18.957657 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 13 00:33:18.957663 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 13 00:33:18.957669 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 13 00:33:18.957675 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 13 00:33:18.957680 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 13 00:33:18.957686 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 13 00:33:18.957692 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 13 00:33:18.957700 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 13 00:33:18.957706 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 13 00:33:18.957711 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 13 00:33:18.957717 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 13 00:33:18.957723 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 13 00:33:18.957729 kernel: CPU topo: Max. logical packages: 1
Mar 13 00:33:18.957734 kernel: CPU topo: Max. logical dies: 1
Mar 13 00:33:18.957749 kernel: CPU topo: Max. dies per package: 1
Mar 13 00:33:18.957755 kernel: CPU topo: Max. threads per core: 1
Mar 13 00:33:18.957761 kernel: CPU topo: Num. cores per package: 2
Mar 13 00:33:18.957767 kernel: CPU topo: Num. threads per package: 2
Mar 13 00:33:18.957773 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 13 00:33:18.957782 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 13 00:33:18.957787 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 13 00:33:18.957793 kernel: Booting paravirtualized kernel on KVM
Mar 13 00:33:18.957800 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 13 00:33:18.957816 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 13 00:33:18.957822 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 13 00:33:18.957828 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 13 00:33:18.957833 kernel: pcpu-alloc: [0] 0 1
Mar 13 00:33:18.957839 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 13 00:33:18.957846 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:33:18.957852 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 13 00:33:18.957858 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 13 00:33:18.957864 kernel: Fallback order for Node 0: 0
Mar 13 00:33:18.957873 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792
Mar 13 00:33:18.957879 kernel: Policy zone: Normal
Mar 13 00:33:18.957885 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 13 00:33:18.957890 kernel: software IO TLB: area num 2.
Mar 13 00:33:18.957896 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 13 00:33:18.957903 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 13 00:33:18.957909 kernel: ftrace: allocated 157 pages with 5 groups
Mar 13 00:33:18.957915 kernel: Dynamic Preempt: voluntary
Mar 13 00:33:18.957921 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 13 00:33:18.957930 kernel: rcu: RCU event tracing is enabled.
Mar 13 00:33:18.957937 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 13 00:33:18.957943 kernel: Trampoline variant of Tasks RCU enabled.
Mar 13 00:33:18.957949 kernel: Rude variant of Tasks RCU enabled.
Mar 13 00:33:18.957955 kernel: Tracing variant of Tasks RCU enabled.
Mar 13 00:33:18.957961 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 13 00:33:18.957967 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 13 00:33:18.957974 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:33:18.957980 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:33:18.957988 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:33:18.957994 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 13 00:33:18.958000 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 13 00:33:18.958006 kernel: Console: colour dummy device 80x25
Mar 13 00:33:18.958012 kernel: printk: legacy console [tty0] enabled
Mar 13 00:33:18.958018 kernel: printk: legacy console [ttyS0] enabled
Mar 13 00:33:18.958024 kernel: ACPI: Core revision 20240827
Mar 13 00:33:18.958030 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 13 00:33:18.958036 kernel: APIC: Switch to symmetric I/O mode setup
Mar 13 00:33:18.958045 kernel: x2apic enabled
Mar 13 00:33:18.958051 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 13 00:33:18.958057 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 13 00:33:18.958063 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns
Mar 13 00:33:18.958069 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998)
Mar 13 00:33:18.958075 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 13 00:33:18.958081 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 13 00:33:18.958087 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 13 00:33:18.958094 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 13 00:33:18.958102 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 13 00:33:18.958108 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 13 00:33:18.958115 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 13 00:33:18.958121 kernel: active return thunk: srso_alias_return_thunk
Mar 13 00:33:18.958127 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 13 00:33:18.958133 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 13 00:33:18.958139 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 13 00:33:18.958145 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 13 00:33:18.958151 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 13 00:33:18.958159 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 13 00:33:18.958165 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 13 00:33:18.958172 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 13 00:33:18.958178 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 13 00:33:18.958184 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 13 00:33:18.958190 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 13 00:33:18.958196 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 13 00:33:18.958202 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 13 00:33:18.958208 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 13 00:33:18.958216 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 13 00:33:18.958236 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 13 00:33:18.958242 kernel: Freeing SMP alternatives memory: 32K
Mar 13 00:33:18.958248 kernel: pid_max: default: 32768 minimum: 301
Mar 13 00:33:18.958254 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 13 00:33:18.958260 kernel: landlock: Up and running.
Mar 13 00:33:18.958266 kernel: SELinux: Initializing.
Mar 13 00:33:18.958272 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:33:18.958278 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:33:18.958287 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 13 00:33:18.958293 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 13 00:33:18.958300 kernel: ... version: 0
Mar 13 00:33:18.958306 kernel: ... bit width: 48
Mar 13 00:33:18.958312 kernel: ... generic registers: 6
Mar 13 00:33:18.958318 kernel: ... value mask: 0000ffffffffffff
Mar 13 00:33:18.958324 kernel: ... max period: 00007fffffffffff
Mar 13 00:33:18.958330 kernel: ... fixed-purpose events: 0
Mar 13 00:33:18.958336 kernel: ... event mask: 000000000000003f
Mar 13 00:33:18.958345 kernel: signal: max sigframe size: 3376
Mar 13 00:33:18.958351 kernel: rcu: Hierarchical SRCU implementation.
Mar 13 00:33:18.958357 kernel: rcu: Max phase no-delay instances is 400.
Mar 13 00:33:18.958363 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 13 00:33:18.958369 kernel: smp: Bringing up secondary CPUs ...
Mar 13 00:33:18.958375 kernel: smpboot: x86: Booting SMP configuration:
Mar 13 00:33:18.958381 kernel: .... node #0, CPUs: #1
Mar 13 00:33:18.958387 kernel: smp: Brought up 1 node, 2 CPUs
Mar 13 00:33:18.958393 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS)
Mar 13 00:33:18.958402 kernel: Memory: 3848512K/4091168K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46200K init, 2560K bss, 237024K reserved, 0K cma-reserved)
Mar 13 00:33:18.958408 kernel: devtmpfs: initialized
Mar 13 00:33:18.958414 kernel: x86/mm: Memory block size: 128MB
Mar 13 00:33:18.958420 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 13 00:33:18.958426 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 13 00:33:18.958433 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 13 00:33:18.958439 kernel: pinctrl core: initialized pinctrl subsystem
Mar 13 00:33:18.958445 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 13 00:33:18.958451 kernel: audit: initializing netlink subsys (disabled)
Mar 13 00:33:18.958459 kernel: audit: type=2000 audit(1773361995.852:1): state=initialized audit_enabled=0 res=1
Mar 13 00:33:18.958465 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 13 00:33:18.958471 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 13 00:33:18.958477 kernel: cpuidle: using governor menu
Mar 13 00:33:18.958483 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 13 00:33:18.958489 kernel: dca service started, version 1.12.1
Mar 13 00:33:18.958495 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 13 00:33:18.958501 kernel: PCI: Using configuration type 1 for base access
Mar 13 00:33:18.958507 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 13 00:33:18.958516 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 13 00:33:18.958522 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 13 00:33:18.958528 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 13 00:33:18.958534 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 13 00:33:18.958540 kernel: ACPI: Added _OSI(Module Device)
Mar 13 00:33:18.958546 kernel: ACPI: Added _OSI(Processor Device)
Mar 13 00:33:18.958551 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 13 00:33:18.958557 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 13 00:33:18.958563 kernel: ACPI: Interpreter enabled
Mar 13 00:33:18.958572 kernel: ACPI: PM: (supports S0 S5)
Mar 13 00:33:18.958578 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 13 00:33:18.958584 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 13 00:33:18.958590 kernel: PCI: Using E820 reservations for host bridge windows
Mar 13 00:33:18.958596 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 13 00:33:18.958602 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 13 00:33:18.958780 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 13 00:33:18.958911 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 13 00:33:18.959018 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 13 00:33:18.959026 kernel: PCI host bridge to bus 0000:00
Mar 13 00:33:18.959136 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 13 00:33:18.959645 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 13 00:33:18.959764 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 13 00:33:18.959877 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 13 00:33:18.959977 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 13 00:33:18.960073 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 13 00:33:18.960169 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 13 00:33:18.960317 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 13 00:33:18.960439 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Mar 13 00:33:18.960546 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Mar 13 00:33:18.960651 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 13 00:33:18.960757 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff]
Mar 13 00:33:18.960871 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 13 00:33:18.960977 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 13 00:33:18.961092 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.961198 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff]
Mar 13 00:33:18.961319 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 13 00:33:18.961428 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 13 00:33:18.961532 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 13 00:33:18.961644 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.961748 kernel: pci 0000:00:02.1: BAR 0 [mem 0x81388000-0x81388fff]
Mar 13 00:33:18.961865 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 13 00:33:18.961970 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 13 00:33:18.962081 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.962188 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff]
Mar 13 00:33:18.962307 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 13 00:33:18.962411 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 13 00:33:18.962515 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 13 00:33:18.962629 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.962733 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff]
Mar 13 00:33:18.962847 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 13 00:33:18.962955 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 13 00:33:18.963066 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.963171 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff]
Mar 13 00:33:18.963289 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 13 00:33:18.963394 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 13 00:33:18.963498 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 13 00:33:18.963610 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.963720 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff]
Mar 13 00:33:18.963831 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 13 00:33:18.963936 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 13 00:33:18.964039 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 13 00:33:18.964150 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.964267 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff]
Mar 13 00:33:18.964371 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 13 00:33:18.964475 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 13 00:33:18.964582 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 13 00:33:18.964694 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.964820 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff]
Mar 13 00:33:18.964927 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 13 00:33:18.965381 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 13 00:33:18.965497 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 13 00:33:18.965614 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.965719 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff]
Mar 13 00:33:18.965834 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 13 00:33:18.965940 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 13 00:33:18.966044 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 13 00:33:18.966156 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 13 00:33:18.966279 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 13 00:33:18.966397 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 13 00:33:18.966501 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f]
Mar 13 00:33:18.966606 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff]
Mar 13 00:33:18.966719 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 13 00:33:18.966832 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f]
Mar 13 00:33:18.966952 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 13 00:33:18.967067 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff]
Mar 13 00:33:18.967178 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 13 00:33:18.969428 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 13 00:33:18.969555 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 13 00:33:18.969679 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 13 00:33:18.969792 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit]
Mar 13 00:33:18.969916 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 13 00:33:18.970041 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 13 00:33:18.970152 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff]
Mar 13 00:33:18.970278 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 13 00:33:18.970386 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 13 00:33:18.970506 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 13 00:33:18.970617 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 13 00:33:18.970725 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 13 00:33:18.970861 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 13 00:33:18.970973 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff]
Mar 13 00:33:18.971083 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 13 00:33:18.971190 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 13 00:33:18.973413 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 13 00:33:18.975023 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff]
Mar 13 00:33:18.975193 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 13 00:33:18.975332 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 13 00:33:18.975341 kernel: acpiphp: Slot [0] registered
Mar 13 00:33:18.975463 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 13 00:33:18.975575 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff]
Mar 13 00:33:18.975686 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 13 00:33:18.975798 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 13 00:33:18.975917 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 13 00:33:18.975930 kernel: acpiphp: Slot [0-2] registered
Mar 13 00:33:18.976037 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 13 00:33:18.976045 kernel: acpiphp: Slot [0-3] registered
Mar 13 00:33:18.976152 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 13 00:33:18.976163 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 13 00:33:18.976186 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 13 00:33:18.976195 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 13 00:33:18.976202 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 13 00:33:18.976210 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 13 00:33:18.976217 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 13 00:33:18.979938 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 13 00:33:18.979951 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 13 00:33:18.979959 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 13 00:33:18.979965 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 13 00:33:18.979972 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 13 00:33:18.979978 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 13 00:33:18.979985 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 13 00:33:18.979998 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 13 00:33:18.980004 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 13 00:33:18.980013 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 13 00:33:18.980020 kernel: iommu: Default domain type: Translated
Mar 13 00:33:18.980027 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 13 00:33:18.980035 kernel: efivars: Registered efivars operations
Mar 13 00:33:18.980042 kernel: PCI: Using ACPI for IRQ routing
Mar 13 00:33:18.980048 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 13 00:33:18.980055 kernel: e820: reserve RAM buffer [mem 0x7df36018-0x7fffffff]
Mar 13 00:33:18.980061 kernel: e820: reserve RAM buffer [mem 0x7df72018-0x7fffffff]
Mar 13 00:33:18.980068 kernel: e820: reserve RAM buffer [mem 0x7dfae018-0x7fffffff]
Mar 13 00:33:18.980074 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 13 00:33:18.980080 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 13 00:33:18.980087 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 13 00:33:18.980096 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 13 00:33:18.980275 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 13 00:33:18.980391 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 13 00:33:18.980500 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 13 00:33:18.980508 kernel: vgaarb: loaded
Mar 13 00:33:18.980515 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 13 00:33:18.980522 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 13 00:33:18.980529 kernel: clocksource: Switched to clocksource kvm-clock
Mar 13 00:33:18.980535 kernel: VFS: Disk quotas dquot_6.6.0
Mar 13 00:33:18.980546 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 13 00:33:18.980553 kernel: pnp: PnP ACPI init
Mar 13 00:33:18.980674 kernel: 
system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Mar 13 00:33:18.980684 kernel: pnp: PnP ACPI: found 5 devices Mar 13 00:33:18.980691 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 13 00:33:18.980697 kernel: NET: Registered PF_INET protocol family Mar 13 00:33:18.980704 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 13 00:33:18.980710 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 13 00:33:18.980719 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 13 00:33:18.980726 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 13 00:33:18.980732 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 13 00:33:18.980739 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 13 00:33:18.980745 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:33:18.980752 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:33:18.980758 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 13 00:33:18.980765 kernel: NET: Registered PF_XDP protocol family Mar 13 00:33:18.980896 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:33:18.981020 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:33:18.981131 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 13 00:33:18.981269 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 13 00:33:18.981381 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 13 00:33:18.981489 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Mar 13 00:33:18.981595 kernel: pci 0000:00:02.7: bridge 
window [io 0x2000-0x2fff]: assigned Mar 13 00:33:18.981702 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Mar 13 00:33:18.981895 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Mar 13 00:33:18.982010 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Mar 13 00:33:18.982118 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Mar 13 00:33:18.982285 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 13 00:33:18.982403 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Mar 13 00:33:18.982510 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Mar 13 00:33:18.982619 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Mar 13 00:33:18.982730 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Mar 13 00:33:18.982853 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 13 00:33:18.982971 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Mar 13 00:33:18.983079 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 13 00:33:18.983188 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Mar 13 00:33:18.984487 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Mar 13 00:33:18.984613 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 13 00:33:18.984726 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Mar 13 00:33:18.984861 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Mar 13 00:33:18.984969 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 13 00:33:18.985086 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Mar 13 00:33:18.985201 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Mar 13 00:33:18.988612 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Mar 13 00:33:18.988745 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Mar 13 
00:33:18.988866 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 13 00:33:18.988981 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Mar 13 00:33:18.989097 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Mar 13 00:33:18.989205 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Mar 13 00:33:18.989360 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 13 00:33:18.989474 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Mar 13 00:33:18.989583 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Mar 13 00:33:18.989691 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Mar 13 00:33:18.989799 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 13 00:33:18.989919 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 13 00:33:18.990020 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 13 00:33:18.990123 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 13 00:33:18.990275 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Mar 13 00:33:18.990380 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Mar 13 00:33:18.990484 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Mar 13 00:33:18.990600 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Mar 13 00:33:18.990706 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 13 00:33:18.990834 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Mar 13 00:33:18.990950 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Mar 13 00:33:18.991056 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 13 00:33:18.991172 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 13 00:33:18.991302 kernel: pci_bus 0000:05: resource 1 [mem 
0x80f00000-0x80ffffff] Mar 13 00:33:18.991409 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 13 00:33:18.991522 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Mar 13 00:33:18.991631 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 13 00:33:18.991744 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Mar 13 00:33:18.991858 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Mar 13 00:33:18.991962 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 13 00:33:18.992074 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Mar 13 00:33:18.992179 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Mar 13 00:33:18.994351 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 13 00:33:18.994499 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Mar 13 00:33:18.994607 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Mar 13 00:33:18.994713 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 13 00:33:18.994722 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 13 00:33:18.994729 kernel: PCI: CLS 0 bytes, default 64 Mar 13 00:33:18.994736 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 13 00:33:18.994743 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 13 00:33:18.994750 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Mar 13 00:33:18.994759 kernel: Initialise system trusted keyrings Mar 13 00:33:18.994766 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 13 00:33:18.994773 kernel: Key type asymmetric registered Mar 13 00:33:18.994779 kernel: Asymmetric key parser 'x509' registered Mar 13 00:33:18.994785 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 13 00:33:18.994792 kernel: io scheduler 
mq-deadline registered Mar 13 00:33:18.994798 kernel: io scheduler kyber registered Mar 13 00:33:18.994817 kernel: io scheduler bfq registered Mar 13 00:33:18.994938 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 13 00:33:18.995052 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 13 00:33:18.995165 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 13 00:33:18.995295 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 13 00:33:18.995407 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 13 00:33:18.995516 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 13 00:33:18.995628 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 13 00:33:18.995739 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 13 00:33:18.995859 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 13 00:33:18.995972 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 13 00:33:18.996086 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 13 00:33:18.996196 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 13 00:33:18.998200 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 13 00:33:18.998343 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 13 00:33:18.998459 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 13 00:33:18.998570 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 13 00:33:18.998584 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 13 00:33:18.998695 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 13 00:33:18.998816 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 13 00:33:18.998824 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 13 00:33:18.998831 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 13 00:33:18.998838 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 13 00:33:18.998845 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 13 
00:33:18.998854 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 13 00:33:18.998861 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 13 00:33:18.998868 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 13 00:33:18.998874 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 13 00:33:18.998992 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 13 00:33:18.999099 kernel: rtc_cmos 00:03: registered as rtc0 Mar 13 00:33:18.999203 kernel: rtc_cmos 00:03: setting system clock to 2026-03-13T00:33:18 UTC (1773361998) Mar 13 00:33:18.999319 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 13 00:33:18.999331 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. Mar 13 00:33:18.999339 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 13 00:33:18.999345 kernel: efifb: probing for efifb Mar 13 00:33:18.999352 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Mar 13 00:33:18.999359 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 13 00:33:18.999365 kernel: efifb: scrolling: redraw Mar 13 00:33:18.999372 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 13 00:33:18.999379 kernel: Console: switching to colour frame buffer device 160x50 Mar 13 00:33:18.999388 kernel: fb0: EFI VGA frame buffer device Mar 13 00:33:18.999395 kernel: pstore: Using crash dump compression: deflate Mar 13 00:33:18.999401 kernel: pstore: Registered efi_pstore as persistent store backend Mar 13 00:33:18.999408 kernel: NET: Registered PF_INET6 protocol family Mar 13 00:33:18.999415 kernel: Segment Routing with IPv6 Mar 13 00:33:18.999421 kernel: In-situ OAM (IOAM) with IPv6 Mar 13 00:33:18.999428 kernel: NET: Registered PF_PACKET protocol family Mar 13 00:33:18.999435 kernel: Key type dns_resolver registered Mar 13 00:33:18.999441 
kernel: IPI shorthand broadcast: enabled Mar 13 00:33:18.999448 kernel: sched_clock: Marking stable (3078011261, 296655761)->(3427064555, -52397533) Mar 13 00:33:18.999457 kernel: registered taskstats version 1 Mar 13 00:33:18.999464 kernel: Loading compiled-in X.509 certificates Mar 13 00:33:18.999471 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 5aff49df330f42445474818d085d5033fee752d8' Mar 13 00:33:18.999477 kernel: Demotion targets for Node 0: null Mar 13 00:33:18.999484 kernel: Key type .fscrypt registered Mar 13 00:33:18.999490 kernel: Key type fscrypt-provisioning registered Mar 13 00:33:18.999497 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 13 00:33:18.999503 kernel: ima: Allocated hash algorithm: sha1 Mar 13 00:33:18.999512 kernel: ima: No architecture policies found Mar 13 00:33:18.999519 kernel: clk: Disabling unused clocks Mar 13 00:33:18.999526 kernel: Warning: unable to open an initial console. Mar 13 00:33:18.999533 kernel: Freeing unused kernel image (initmem) memory: 46200K Mar 13 00:33:18.999539 kernel: Write protecting the kernel read-only data: 40960k Mar 13 00:33:18.999546 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 13 00:33:18.999552 kernel: Run /init as init process Mar 13 00:33:18.999559 kernel: with arguments: Mar 13 00:33:18.999566 kernel: /init Mar 13 00:33:18.999575 kernel: with environment: Mar 13 00:33:18.999582 kernel: HOME=/ Mar 13 00:33:18.999588 kernel: TERM=linux Mar 13 00:33:18.999596 systemd[1]: Successfully made /usr/ read-only. 
Mar 13 00:33:18.999606 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 13 00:33:18.999613 systemd[1]: Detected virtualization kvm. Mar 13 00:33:18.999620 systemd[1]: Detected architecture x86-64. Mar 13 00:33:18.999627 systemd[1]: Running in initrd. Mar 13 00:33:18.999635 systemd[1]: No hostname configured, using default hostname. Mar 13 00:33:18.999643 systemd[1]: Hostname set to . Mar 13 00:33:18.999650 systemd[1]: Initializing machine ID from VM UUID. Mar 13 00:33:18.999657 systemd[1]: Queued start job for default target initrd.target. Mar 13 00:33:18.999663 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 13 00:33:18.999670 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 13 00:33:18.999677 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 13 00:33:18.999685 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 13 00:33:18.999694 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 13 00:33:18.999701 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 13 00:33:18.999709 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 13 00:33:18.999716 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 13 00:33:18.999723 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Mar 13 00:33:18.999730 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 13 00:33:18.999737 systemd[1]: Reached target paths.target - Path Units. Mar 13 00:33:18.999746 systemd[1]: Reached target slices.target - Slice Units. Mar 13 00:33:18.999753 systemd[1]: Reached target swap.target - Swaps. Mar 13 00:33:18.999760 systemd[1]: Reached target timers.target - Timer Units. Mar 13 00:33:18.999768 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 13 00:33:18.999775 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 13 00:33:18.999783 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 13 00:33:18.999790 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 13 00:33:18.999797 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 13 00:33:18.999813 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 13 00:33:18.999820 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 13 00:33:18.999827 systemd[1]: Reached target sockets.target - Socket Units. Mar 13 00:33:18.999834 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 13 00:33:18.999840 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 13 00:33:18.999847 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 13 00:33:18.999854 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 13 00:33:18.999861 systemd[1]: Starting systemd-fsck-usr.service... Mar 13 00:33:18.999868 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 13 00:33:18.999878 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 13 00:33:18.999885 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:18.999892 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 13 00:33:18.999899 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 13 00:33:18.999909 systemd[1]: Finished systemd-fsck-usr.service. Mar 13 00:33:18.999917 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 13 00:33:18.999950 systemd-journald[199]: Collecting audit messages is disabled. Mar 13 00:33:18.999969 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 13 00:33:18.999978 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 13 00:33:18.999987 systemd-journald[199]: Journal started Mar 13 00:33:19.000002 systemd-journald[199]: Runtime Journal (/run/log/journal/d246c5c1fbd24b728708f8dd09a8d503) is 8M, max 76.1M, 68.1M free. Mar 13 00:33:18.975118 systemd-modules-load[200]: Inserted module 'overlay' Mar 13 00:33:19.003716 systemd[1]: Started systemd-journald.service - Journal Service. Mar 13 00:33:19.006100 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:33:19.011086 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 13 00:33:19.017380 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 13 00:33:19.019118 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 13 00:33:19.027667 systemd-modules-load[200]: Inserted module 'br_netfilter' Mar 13 00:33:19.028244 kernel: Bridge firewalling registered Mar 13 00:33:19.029504 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 13 00:33:19.031241 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 13 00:33:19.035648 systemd-tmpfiles[218]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 13 00:33:19.036381 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 13 00:33:19.052359 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 13 00:33:19.053555 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 13 00:33:19.058688 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 13 00:33:19.061042 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 13 00:33:19.065389 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 13 00:33:19.084658 dracut-cmdline[235]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d Mar 13 00:33:19.116332 systemd-resolved[238]: Positive Trust Anchors: Mar 13 00:33:19.117048 systemd-resolved[238]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 13 00:33:19.117073 systemd-resolved[238]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 13 00:33:19.122783 systemd-resolved[238]: Defaulting to hostname 'linux'. Mar 13 00:33:19.124259 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 13 00:33:19.125068 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 13 00:33:19.182275 kernel: SCSI subsystem initialized Mar 13 00:33:19.192284 kernel: Loading iSCSI transport class v2.0-870. Mar 13 00:33:19.205263 kernel: iscsi: registered transport (tcp) Mar 13 00:33:19.230370 kernel: iscsi: registered transport (qla4xxx) Mar 13 00:33:19.230450 kernel: QLogic iSCSI HBA Driver Mar 13 00:33:19.252110 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 13 00:33:19.269820 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 13 00:33:19.272751 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 13 00:33:19.326569 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 13 00:33:19.328802 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Mar 13 00:33:19.382287 kernel: raid6: avx512x4 gen() 39521 MB/s Mar 13 00:33:19.399270 kernel: raid6: avx512x2 gen() 41166 MB/s Mar 13 00:33:19.417269 kernel: raid6: avx512x1 gen() 37070 MB/s Mar 13 00:33:19.435271 kernel: raid6: avx2x4 gen() 41311 MB/s Mar 13 00:33:19.453266 kernel: raid6: avx2x2 gen() 43261 MB/s Mar 13 00:33:19.472735 kernel: raid6: avx2x1 gen() 34579 MB/s Mar 13 00:33:19.472825 kernel: raid6: using algorithm avx2x2 gen() 43261 MB/s Mar 13 00:33:19.491637 kernel: raid6: .... xor() 32734 MB/s, rmw enabled Mar 13 00:33:19.491743 kernel: raid6: using avx512x2 recovery algorithm Mar 13 00:33:19.512281 kernel: xor: automatically using best checksumming function avx Mar 13 00:33:19.678277 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 13 00:33:19.688333 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 13 00:33:19.691048 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 13 00:33:19.722500 systemd-udevd[448]: Using default interface naming scheme 'v255'. Mar 13 00:33:19.730371 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 13 00:33:19.736385 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 13 00:33:19.763617 dracut-pre-trigger[459]: rd.md=0: removing MD RAID activation Mar 13 00:33:19.802025 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 13 00:33:19.805283 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 13 00:33:19.900534 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 13 00:33:19.903364 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Mar 13 00:33:19.995598 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Mar 13 00:33:20.001199 kernel: ACPI: bus type USB registered Mar 13 00:33:20.001216 kernel: usbcore: registered new interface driver usbfs Mar 13 00:33:20.008255 kernel: scsi host0: Virtio SCSI HBA Mar 13 00:33:20.016256 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 13 00:33:20.021244 kernel: usbcore: registered new interface driver hub Mar 13 00:33:20.038264 kernel: cryptd: max_cpu_qlen set to 1000 Mar 13 00:33:20.048251 kernel: usbcore: registered new device driver usb Mar 13 00:33:20.070968 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:33:20.071122 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:33:20.074436 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:20.077535 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:20.079652 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 13 00:33:20.096050 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:33:20.142187 kernel: sd 0:0:0:0: Power-on or device reset occurred Mar 13 00:33:20.142514 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Mar 13 00:33:20.142728 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 13 00:33:20.142937 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Mar 13 00:33:20.143128 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 13 00:33:20.143350 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Mar 13 00:33:20.143365 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. 
Mar 13 00:33:20.143379 kernel: GPT:17805311 != 160006143 Mar 13 00:33:20.143392 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 13 00:33:20.143405 kernel: GPT:17805311 != 160006143 Mar 13 00:33:20.143418 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 13 00:33:20.143435 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 13 00:33:20.143448 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 13 00:33:20.143641 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 13 00:33:20.143842 kernel: libata version 3.00 loaded. Mar 13 00:33:20.143856 kernel: AES CTR mode by8 optimization enabled Mar 13 00:33:20.096214 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:33:20.106543 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:20.171542 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 13 00:33:20.171758 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 13 00:33:20.171908 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 13 00:33:20.172045 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 13 00:33:20.172174 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 13 00:33:20.176418 kernel: hub 1-0:1.0: USB hub found Mar 13 00:33:20.191341 kernel: hub 1-0:1.0: 4 ports detected Mar 13 00:33:20.195260 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 13 00:33:20.198261 kernel: hub 2-0:1.0: USB hub found Mar 13 00:33:20.203769 kernel: ahci 0000:00:1f.2: version 3.0 Mar 13 00:33:20.204006 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 13 00:33:20.206262 kernel: hub 2-0:1.0: 4 ports detected Mar 13 00:33:20.209443 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 13 00:33:20.227578 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Mar 13 00:33:20.227846 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Mar 13 00:33:20.227985 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 13 00:33:20.254246 kernel: scsi host1: ahci Mar 13 00:33:20.258238 kernel: scsi host2: ahci Mar 13 00:33:20.260420 kernel: scsi host3: ahci Mar 13 00:33:20.262412 kernel: scsi host4: ahci Mar 13 00:33:20.264146 kernel: scsi host5: ahci Mar 13 00:33:20.264578 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 13 00:33:20.277717 kernel: scsi host6: ahci Mar 13 00:33:20.277952 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 51 lpm-pol 1 Mar 13 00:33:20.277972 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 51 lpm-pol 1 Mar 13 00:33:20.277981 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 51 lpm-pol 1 Mar 13 00:33:20.277989 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 51 lpm-pol 1 Mar 13 00:33:20.277998 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 51 lpm-pol 1 Mar 13 00:33:20.281004 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 51 lpm-pol 1 Mar 13 00:33:20.291324 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Mar 13 00:33:20.298776 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 13 00:33:20.304947 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Mar 13 00:33:20.305402 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 13 00:33:20.307783 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 13 00:33:20.323439 disk-uuid[648]: Primary Header is updated. 
Mar 13 00:33:20.323439 disk-uuid[648]: Secondary Entries is updated.
Mar 13 00:33:20.323439 disk-uuid[648]: Secondary Header is updated.
Mar 13 00:33:20.330246 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:20.345254 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:20.433985 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 13 00:33:20.570263 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 13 00:33:20.600255 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.600349 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 13 00:33:20.600364 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.600378 kernel: ata1.00: LPM support broken, forcing max_power
Mar 13 00:33:20.600391 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 13 00:33:20.600420 kernel: ata1.00: applying bridge limits
Mar 13 00:33:20.604497 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.604548 kernel: ata1.00: LPM support broken, forcing max_power
Mar 13 00:33:20.607875 kernel: ata1.00: configured for UDMA/100
Mar 13 00:33:20.610449 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.611388 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.611417 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 13 00:33:20.640325 kernel: usbcore: registered new interface driver usbhid
Mar 13 00:33:20.640397 kernel: usbhid: USB HID core driver
Mar 13 00:33:20.651634 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Mar 13 00:33:20.651688 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 13 00:33:20.662551 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 13 00:33:20.663031 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 13 00:33:20.674248 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 13 00:33:21.020564 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:33:21.021726 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:33:21.022540 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:33:21.023258 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:33:21.025368 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 13 00:33:21.051344 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:33:21.350807 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:21.354261 disk-uuid[649]: The operation has completed successfully.
Mar 13 00:33:21.413310 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 13 00:33:21.413451 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 13 00:33:21.454302 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 13 00:33:21.466961 sh[681]: Success
Mar 13 00:33:21.491437 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 13 00:33:21.491524 kernel: device-mapper: uevent: version 1.0.3
Mar 13 00:33:21.492282 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 13 00:33:21.510268 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 13 00:33:21.556750 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 13 00:33:21.560318 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 13 00:33:21.571629 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 13 00:33:21.582250 kernel: BTRFS: device fsid 503642f8-c59c-4168-97a8-9c3603183fa3 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (693)
Mar 13 00:33:21.588980 kernel: BTRFS info (device dm-0): first mount of filesystem 503642f8-c59c-4168-97a8-9c3603183fa3
Mar 13 00:33:21.589049 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:21.602514 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations
Mar 13 00:33:21.602589 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 13 00:33:21.602601 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 13 00:33:21.607898 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 13 00:33:21.608942 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:33:21.609556 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 13 00:33:21.610444 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 13 00:33:21.613861 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 13 00:33:21.639294 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (724)
Mar 13 00:33:21.643977 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:21.644023 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:21.654477 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:33:21.654543 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:33:21.654558 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:33:21.663257 kernel: BTRFS info (device sda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:21.665172 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 13 00:33:21.666924 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 13 00:33:21.770691 ignition[781]: Ignition 2.22.0
Mar 13 00:33:21.770703 ignition[781]: Stage: fetch-offline
Mar 13 00:33:21.770738 ignition[781]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:21.770747 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:21.770823 ignition[781]: parsed url from cmdline: ""
Mar 13 00:33:21.774904 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:33:21.770840 ignition[781]: no config URL provided
Mar 13 00:33:21.770845 ignition[781]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.770854 ignition[781]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.770859 ignition[781]: failed to fetch config: resource requires networking
Mar 13 00:33:21.778369 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:33:21.770999 ignition[781]: Ignition finished successfully
Mar 13 00:33:21.781007 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:33:21.821944 systemd-networkd[867]: lo: Link UP
Mar 13 00:33:21.821955 systemd-networkd[867]: lo: Gained carrier
Mar 13 00:33:21.824693 systemd-networkd[867]: Enumeration completed
Mar 13 00:33:21.825139 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:33:21.825580 systemd-networkd[867]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.825584 systemd-networkd[867]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:33:21.826404 systemd-networkd[867]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.826409 systemd-networkd[867]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:33:21.826729 systemd-networkd[867]: eth0: Link UP
Mar 13 00:33:21.826861 systemd-networkd[867]: eth1: Link UP
Mar 13 00:33:21.827158 systemd[1]: Reached target network.target - Network.
Mar 13 00:33:21.828705 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 13 00:33:21.829418 systemd-networkd[867]: eth0: Gained carrier
Mar 13 00:33:21.829431 systemd-networkd[867]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.831841 systemd-networkd[867]: eth1: Gained carrier
Mar 13 00:33:21.831851 systemd-networkd[867]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.855640 ignition[870]: Ignition 2.22.0
Mar 13 00:33:21.855652 ignition[870]: Stage: fetch
Mar 13 00:33:21.855928 ignition[870]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:21.855939 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:21.856032 ignition[870]: parsed url from cmdline: ""
Mar 13 00:33:21.856036 ignition[870]: no config URL provided
Mar 13 00:33:21.856058 ignition[870]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.856067 ignition[870]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.856089 ignition[870]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 13 00:33:21.856297 ignition[870]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 13 00:33:21.861300 systemd-networkd[867]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 13 00:33:21.883312 systemd-networkd[867]: eth0: DHCPv4 address 157.180.95.181/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 13 00:33:22.056853 ignition[870]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 13 00:33:22.062430 ignition[870]: GET result: OK
Mar 13 00:33:22.062602 ignition[870]: parsing config with SHA512: ba9afa6554f809e7da73973df585a8f9471cb747931dba5926518b525e685961e35a5973ff435722a464cc760d30ef91dfee5a88d12fbffdca2bfde429c605cf
Mar 13 00:33:22.073323 unknown[870]: fetched base config from "system"
Mar 13 00:33:22.073347 unknown[870]: fetched base config from "system"
Mar 13 00:33:22.073796 ignition[870]: fetch: fetch complete
Mar 13 00:33:22.073359 unknown[870]: fetched user config from "hetzner"
Mar 13 00:33:22.073808 ignition[870]: fetch: fetch passed
Mar 13 00:33:22.073908 ignition[870]: Ignition finished successfully
Mar 13 00:33:22.080756 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 13 00:33:22.084424 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 13 00:33:22.138857 ignition[878]: Ignition 2.22.0
Mar 13 00:33:22.138869 ignition[878]: Stage: kargs
Mar 13 00:33:22.139004 ignition[878]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.139015 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.143278 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 13 00:33:22.139723 ignition[878]: kargs: kargs passed
Mar 13 00:33:22.139769 ignition[878]: Ignition finished successfully
Mar 13 00:33:22.145757 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 13 00:33:22.186268 ignition[885]: Ignition 2.22.0
Mar 13 00:33:22.187285 ignition[885]: Stage: disks
Mar 13 00:33:22.187447 ignition[885]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.187458 ignition[885]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.188093 ignition[885]: disks: disks passed
Mar 13 00:33:22.188144 ignition[885]: Ignition finished successfully
Mar 13 00:33:22.190428 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 13 00:33:22.192185 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 13 00:33:22.193935 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 13 00:33:22.194636 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:33:22.195871 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 13 00:33:22.197067 systemd[1]: Reached target basic.target - Basic System.
Mar 13 00:33:22.199820 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 13 00:33:22.242476 systemd-fsck[893]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 13 00:33:22.246117 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 13 00:33:22.249203 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 13 00:33:22.372268 kernel: EXT4-fs (sda9): mounted filesystem 26348f72-0225-4c06-aedc-823e61beebc6 r/w with ordered data mode. Quota mode: none.
Mar 13 00:33:22.373095 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 13 00:33:22.374024 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:33:22.376258 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:33:22.379154 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 13 00:33:22.380977 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 13 00:33:22.383305 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 13 00:33:22.384078 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:33:22.394464 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 13 00:33:22.397303 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 13 00:33:22.411273 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (901)
Mar 13 00:33:22.417252 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:22.421264 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:22.439778 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:33:22.439868 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:33:22.439884 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:33:22.445253 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:33:22.455926 coreos-metadata[903]: Mar 13 00:33:22.455 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 13 00:33:22.457651 coreos-metadata[903]: Mar 13 00:33:22.457 INFO Fetch successful
Mar 13 00:33:22.459199 coreos-metadata[903]: Mar 13 00:33:22.459 INFO wrote hostname ci-4459-2-4-n-7393fd8643 to /sysroot/etc/hostname
Mar 13 00:33:22.461653 initrd-setup-root[928]: cut: /sysroot/etc/passwd: No such file or directory
Mar 13 00:33:22.461255 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 13 00:33:22.466565 initrd-setup-root[936]: cut: /sysroot/etc/group: No such file or directory
Mar 13 00:33:22.471306 initrd-setup-root[943]: cut: /sysroot/etc/shadow: No such file or directory
Mar 13 00:33:22.474950 initrd-setup-root[950]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 13 00:33:22.564190 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 13 00:33:22.565760 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 13 00:33:22.567459 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 13 00:33:22.582886 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 13 00:33:22.586246 kernel: BTRFS info (device sda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:22.601565 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 13 00:33:22.623258 ignition[1019]: INFO : Ignition 2.22.0
Mar 13 00:33:22.623258 ignition[1019]: INFO : Stage: mount
Mar 13 00:33:22.623258 ignition[1019]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.623258 ignition[1019]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.623258 ignition[1019]: INFO : mount: mount passed
Mar 13 00:33:22.626421 ignition[1019]: INFO : Ignition finished successfully
Mar 13 00:33:22.627743 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 13 00:33:22.629306 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 13 00:33:22.660027 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:33:22.680255 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1029)
Mar 13 00:33:22.680319 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:22.683346 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:22.692872 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:33:22.692949 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:33:22.692961 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:33:22.697209 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:33:22.727016 ignition[1045]: INFO : Ignition 2.22.0
Mar 13 00:33:22.727016 ignition[1045]: INFO : Stage: files
Mar 13 00:33:22.728082 ignition[1045]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.728082 ignition[1045]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.728082 ignition[1045]: DEBUG : files: compiled without relabeling support, skipping
Mar 13 00:33:22.729203 ignition[1045]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 13 00:33:22.729203 ignition[1045]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 13 00:33:22.730953 ignition[1045]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 13 00:33:22.731439 ignition[1045]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 13 00:33:22.732198 unknown[1045]: wrote ssh authorized keys file for user: core
Mar 13 00:33:22.732835 ignition[1045]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 13 00:33:22.735437 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:33:22.735437 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 13 00:33:22.989737 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 13 00:33:23.301334 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:33:23.303053 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:33:23.309593 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:33:23.309593 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:33:23.309593 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 13 00:33:23.309593 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 13 00:33:23.309593 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 13 00:33:23.309593 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-x86-64.raw: attempt #1
Mar 13 00:33:23.662627 systemd-networkd[867]: eth1: Gained IPv6LL
Mar 13 00:33:23.854497 systemd-networkd[867]: eth0: Gained IPv6LL
Mar 13 00:33:23.877075 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 13 00:33:25.056838 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 13 00:33:25.056838 ignition[1045]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 13 00:33:25.060215 ignition[1045]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:33:25.060215 ignition[1045]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:33:25.060215 ignition[1045]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:33:25.064634 ignition[1045]: INFO : files: files passed
Mar 13 00:33:25.064634 ignition[1045]: INFO : Ignition finished successfully
Mar 13 00:33:25.063517 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 13 00:33:25.066366 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 13 00:33:25.070025 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 13 00:33:25.087052 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 13 00:33:25.088785 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 13 00:33:25.096403 initrd-setup-root-after-ignition[1076]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:33:25.096403 initrd-setup-root-after-ignition[1076]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:33:25.099490 initrd-setup-root-after-ignition[1079]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:33:25.103409 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:33:25.104727 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 13 00:33:25.106734 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 13 00:33:25.174600 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 13 00:33:25.174789 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 13 00:33:25.176437 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 13 00:33:25.177338 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 13 00:33:25.178138 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 13 00:33:25.180781 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 13 00:33:25.214011 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:33:25.217807 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 13 00:33:25.248561 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:33:25.249701 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:33:25.251109 systemd[1]: Stopped target timers.target - Timer Units.
Mar 13 00:33:25.252431 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 13 00:33:25.252653 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:33:25.254208 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 13 00:33:25.255343 systemd[1]: Stopped target basic.target - Basic System.
Mar 13 00:33:25.256467 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 13 00:33:25.257424 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:33:25.258669 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 13 00:33:25.259850 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:33:25.261041 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 13 00:33:25.262304 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:33:25.263490 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 13 00:33:25.264586 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 13 00:33:25.265940 systemd[1]: Stopped target swap.target - Swaps.
Mar 13 00:33:25.267013 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 13 00:33:25.267201 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:33:25.268746 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:33:25.269978 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:33:25.271025 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 13 00:33:25.271392 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:33:25.272726 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 13 00:33:25.272972 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:33:25.274536 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 13 00:33:25.274770 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:33:25.276181 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 13 00:33:25.276395 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 13 00:33:25.277412 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 13 00:33:25.277616 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 13 00:33:25.280383 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 13 00:33:25.281159 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 13 00:33:25.281424 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:33:25.284356 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 13 00:33:25.286388 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 13 00:33:25.287365 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:33:25.288304 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 13 00:33:25.288749 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 13 00:33:25.295102 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 13 00:33:25.295628 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 13 00:33:25.313272 ignition[1100]: INFO : Ignition 2.22.0
Mar 13 00:33:25.313272 ignition[1100]: INFO : Stage: umount
Mar 13 00:33:25.313272 ignition[1100]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:25.313272 ignition[1100]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:25.313272 ignition[1100]: INFO : umount: umount passed
Mar 13 00:33:25.313272 ignition[1100]: INFO : Ignition finished successfully
Mar 13 00:33:25.312923 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 13 00:33:25.316661 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 13 00:33:25.316795 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 13 00:33:25.318416 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 13 00:33:25.318555 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 13 00:33:25.320323 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 13 00:33:25.320432 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 13 00:33:25.321539 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 13 00:33:25.321599 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 13 00:33:25.322536 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 13 00:33:25.322592 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 13 00:33:25.323434 systemd[1]: Stopped target network.target - Network.
Mar 13 00:33:25.324253 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 13 00:33:25.324324 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:33:25.325124 systemd[1]: Stopped target paths.target - Path Units.
Mar 13 00:33:25.325966 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 13 00:33:25.330296 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:33:25.330813 systemd[1]: Stopped target slices.target - Slice Units.
Mar 13 00:33:25.331687 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 13 00:33:25.332582 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 13 00:33:25.332646 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 13 00:33:25.333423 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 13 00:33:25.333480 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 13 00:33:25.334213 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 13 00:33:25.334318 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 13 00:33:25.335090 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 13 00:33:25.335142 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 13 00:33:25.335887 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 13 00:33:25.335952 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 13 00:33:25.336922 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 13 00:33:25.337665 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 13 00:33:25.340909 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 13 00:33:25.341046 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 13 00:33:25.344391 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 13 00:33:25.345205 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 13 00:33:25.345288 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:33:25.347453 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:33:25.349054 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 13 00:33:25.349169 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 13 00:33:25.351159 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 13 00:33:25.351774 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 13 00:33:25.352885 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 13 00:33:25.352936 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:33:25.354744 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 13 00:33:25.355198 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 13 00:33:25.355333 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:33:25.356270 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 13 00:33:25.356325 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:33:25.358894 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 13 00:33:25.358941 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:33:25.360122 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:33:25.361956 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 13 00:33:25.381849 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 13 00:33:25.382529 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:33:25.383544 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 13 00:33:25.383613 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:33:25.384034 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 13 00:33:25.384064 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:33:25.386347 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 13 00:33:25.386414 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 13 00:33:25.387551 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 13 00:33:25.387608 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 13 00:33:25.388802 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 13 00:33:25.388855 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 13 00:33:25.390954 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 13 00:33:25.391502 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 13 00:33:25.391561 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:33:25.392504 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 13 00:33:25.392562 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:33:25.393907 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:33:25.393960 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:25.395670 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 13 00:33:25.395797 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 13 00:33:25.413268 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 13 00:33:25.413409 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 13 00:33:25.414632 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 13 00:33:25.416124 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 13 00:33:25.434099 systemd[1]: Switching root.
Mar 13 00:33:25.475919 systemd-journald[199]: Journal stopped
Mar 13 00:33:26.882011 systemd-journald[199]: Received SIGTERM from PID 1 (systemd).
Mar 13 00:33:26.882129 kernel: SELinux: policy capability network_peer_controls=1
Mar 13 00:33:26.882147 kernel: SELinux: policy capability open_perms=1
Mar 13 00:33:26.882159 kernel: SELinux: policy capability extended_socket_class=1
Mar 13 00:33:26.882172 kernel: SELinux: policy capability always_check_network=0
Mar 13 00:33:26.882184 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 13 00:33:26.882208 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 13 00:33:26.883421 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 13 00:33:26.883463 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 13 00:33:26.883480 kernel: SELinux: policy capability userspace_initial_context=0
Mar 13 00:33:26.883497 kernel: audit: type=1403 audit(1773362005.673:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 13 00:33:26.883519 systemd[1]: Successfully loaded SELinux policy in 72.942ms.
Mar 13 00:33:26.883548 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.662ms.
Mar 13 00:33:26.883563 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 13 00:33:26.883582 systemd[1]: Detected virtualization kvm.
Mar 13 00:33:26.883597 systemd[1]: Detected architecture x86-64.
Mar 13 00:33:26.883610 systemd[1]: Detected first boot.
Mar 13 00:33:26.883625 systemd[1]: Hostname set to .
Mar 13 00:33:26.883639 systemd[1]: Initializing machine ID from VM UUID.
Mar 13 00:33:26.883654 zram_generator::config[1145]: No configuration found.
Mar 13 00:33:26.883669 kernel: Guest personality initialized and is inactive
Mar 13 00:33:26.883684 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 13 00:33:26.883700 kernel: Initialized host personality
Mar 13 00:33:26.883714 kernel: NET: Registered PF_VSOCK protocol family
Mar 13 00:33:26.883727 systemd[1]: Populated /etc with preset unit settings.
Mar 13 00:33:26.883743 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 13 00:33:26.883758 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 13 00:33:26.883772 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 13 00:33:26.883786 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 13 00:33:26.883800 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 13 00:33:26.883815 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 13 00:33:26.883833 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 13 00:33:26.883851 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 13 00:33:26.883866 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 13 00:33:26.883892 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 13 00:33:26.883906 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 13 00:33:26.883917 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 13 00:33:26.883929 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:33:26.883940 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:33:26.883955 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 13 00:33:26.883971 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 13 00:33:26.883986 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 13 00:33:26.884002 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 13 00:33:26.884016 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 13 00:33:26.884031 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:33:26.884049 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:33:26.884063 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 13 00:33:26.884079 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 13 00:33:26.884094 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:33:26.884108 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 13 00:33:26.884122 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:33:26.884137 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:33:26.884151 systemd[1]: Reached target slices.target - Slice Units.
Mar 13 00:33:26.884170 systemd[1]: Reached target swap.target - Swaps.
Mar 13 00:33:26.884186 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 13 00:33:26.884201 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 13 00:33:26.884213 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 13 00:33:26.887181 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:33:26.887305 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:33:26.887324 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:33:26.887338 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 13 00:33:26.887352 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 13 00:33:26.887366 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 13 00:33:26.887387 systemd[1]: Mounting media.mount - External Media Directory...
Mar 13 00:33:26.887401 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.887414 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 13 00:33:26.887428 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 13 00:33:26.887442 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 13 00:33:26.887457 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 13 00:33:26.887471 systemd[1]: Reached target machines.target - Containers.
Mar 13 00:33:26.887485 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 13 00:33:26.887502 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:26.887515 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 13 00:33:26.887529 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 13 00:33:26.887542 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:33:26.887556 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 13 00:33:26.887571 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:33:26.887585 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 13 00:33:26.887598 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:33:26.887613 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 13 00:33:26.887629 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 13 00:33:26.887644 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 13 00:33:26.887657 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 13 00:33:26.887671 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 13 00:33:26.887686 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:26.887702 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 13 00:33:26.887716 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 13 00:33:26.887733 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 13 00:33:26.887751 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 13 00:33:26.887765 kernel: loop: module loaded
Mar 13 00:33:26.887786 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 13 00:33:26.887801 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 13 00:33:26.887815 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 13 00:33:26.887828 systemd[1]: Stopped verity-setup.service.
Mar 13 00:33:26.887841 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.887855 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 13 00:33:26.887869 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 13 00:33:26.887898 systemd[1]: Mounted media.mount - External Media Directory.
Mar 13 00:33:26.887911 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 13 00:33:26.887927 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 13 00:33:26.887940 kernel: fuse: init (API version 7.41)
Mar 13 00:33:26.887953 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 13 00:33:26.887966 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 13 00:33:26.887980 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:33:26.887993 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 13 00:33:26.888006 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 13 00:33:26.888019 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:33:26.888039 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:33:26.888052 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:33:26.888066 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:33:26.888078 kernel: ACPI: bus type drm_connector registered
Mar 13 00:33:26.888092 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 13 00:33:26.888106 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 13 00:33:26.888120 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 13 00:33:26.888134 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 13 00:33:26.888212 systemd-journald[1233]: Collecting audit messages is disabled.
Mar 13 00:33:26.888311 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:33:26.888328 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:33:26.888344 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:33:26.888361 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:33:26.888377 systemd-journald[1233]: Journal started
Mar 13 00:33:26.888402 systemd-journald[1233]: Runtime Journal (/run/log/journal/d246c5c1fbd24b728708f8dd09a8d503) is 8M, max 76.1M, 68.1M free.
Mar 13 00:33:26.399842 systemd[1]: Queued start job for default target multi-user.target.
Mar 13 00:33:26.425998 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 13 00:33:26.426720 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 13 00:33:26.893248 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 13 00:33:26.896400 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 13 00:33:26.897542 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 13 00:33:26.914300 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 13 00:33:26.920395 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 13 00:33:26.924342 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 13 00:33:26.926973 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 13 00:33:26.927027 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:33:26.929057 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 13 00:33:26.938512 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 13 00:33:26.941029 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:26.943516 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 13 00:33:26.949473 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 13 00:33:26.951951 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:33:26.956929 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 13 00:33:26.957637 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:33:26.960529 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 13 00:33:26.964575 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 13 00:33:26.972627 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 13 00:33:26.977998 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 13 00:33:26.978788 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 13 00:33:27.002582 systemd-journald[1233]: Time spent on flushing to /var/log/journal/d246c5c1fbd24b728708f8dd09a8d503 is 100.096ms for 1242 entries.
Mar 13 00:33:27.002582 systemd-journald[1233]: System Journal (/var/log/journal/d246c5c1fbd24b728708f8dd09a8d503) is 8M, max 584.8M, 576.8M free.
Mar 13 00:33:27.129577 systemd-journald[1233]: Received client request to flush runtime journal.
Mar 13 00:33:27.129635 kernel: loop0: detected capacity change from 0 to 128560
Mar 13 00:33:27.129657 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 13 00:33:27.129679 kernel: loop1: detected capacity change from 0 to 8
Mar 13 00:33:27.014538 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:33:27.016686 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 13 00:33:27.018938 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 13 00:33:27.027063 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 13 00:33:27.081725 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:33:27.086276 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 13 00:33:27.090514 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 13 00:33:27.123606 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 13 00:33:27.132510 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 13 00:33:27.145668 kernel: loop2: detected capacity change from 0 to 217752
Mar 13 00:33:27.153651 systemd-tmpfiles[1285]: ACLs are not supported, ignoring.
Mar 13 00:33:27.154420 systemd-tmpfiles[1285]: ACLs are not supported, ignoring.
Mar 13 00:33:27.166760 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:33:27.193262 kernel: loop3: detected capacity change from 0 to 110984
Mar 13 00:33:27.238376 kernel: loop4: detected capacity change from 0 to 128560
Mar 13 00:33:27.260320 kernel: loop5: detected capacity change from 0 to 8
Mar 13 00:33:27.266276 kernel: loop6: detected capacity change from 0 to 217752
Mar 13 00:33:27.291252 kernel: loop7: detected capacity change from 0 to 110984
Mar 13 00:33:27.311542 (sd-merge)[1296]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 13 00:33:27.312733 (sd-merge)[1296]: Merged extensions into '/usr'.
Mar 13 00:33:27.319600 systemd[1]: Reload requested from client PID 1270 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 13 00:33:27.319737 systemd[1]: Reloading...
Mar 13 00:33:27.423258 zram_generator::config[1322]: No configuration found.
Mar 13 00:33:27.538013 ldconfig[1265]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 13 00:33:27.621536 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 13 00:33:27.622024 systemd[1]: Reloading finished in 301 ms.
Mar 13 00:33:27.655187 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 13 00:33:27.656102 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 13 00:33:27.657053 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 13 00:33:27.670010 systemd[1]: Starting ensure-sysext.service...
Mar 13 00:33:27.673357 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 13 00:33:27.679393 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:33:27.696329 systemd-tmpfiles[1367]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 13 00:33:27.696638 systemd[1]: Reload requested from client PID 1366 ('systemctl') (unit ensure-sysext.service)...
Mar 13 00:33:27.696650 systemd[1]: Reloading...
Mar 13 00:33:27.696865 systemd-tmpfiles[1367]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 13 00:33:27.697244 systemd-tmpfiles[1367]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 13 00:33:27.697558 systemd-tmpfiles[1367]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 13 00:33:27.698509 systemd-tmpfiles[1367]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 13 00:33:27.698816 systemd-tmpfiles[1367]: ACLs are not supported, ignoring.
Mar 13 00:33:27.698941 systemd-tmpfiles[1367]: ACLs are not supported, ignoring.
Mar 13 00:33:27.703411 systemd-tmpfiles[1367]: Detected autofs mount point /boot during canonicalization of boot.
Mar 13 00:33:27.703554 systemd-tmpfiles[1367]: Skipping /boot
Mar 13 00:33:27.714925 systemd-tmpfiles[1367]: Detected autofs mount point /boot during canonicalization of boot.
Mar 13 00:33:27.714940 systemd-tmpfiles[1367]: Skipping /boot
Mar 13 00:33:27.768050 systemd-udevd[1368]: Using default interface naming scheme 'v255'.
Mar 13 00:33:27.802280 zram_generator::config[1397]: No configuration found.
Mar 13 00:33:28.089248 kernel: mousedev: PS/2 mouse device common for all mice
Mar 13 00:33:28.100263 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Mar 13 00:33:28.121451 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 13 00:33:28.121969 systemd[1]: Reloading finished in 424 ms.
Mar 13 00:33:28.135180 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:33:28.137615 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:33:28.145295 kernel: ACPI: button: Power Button [PWRF]
Mar 13 00:33:28.163178 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 13 00:33:28.168467 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 13 00:33:28.173538 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 13 00:33:28.176662 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:33:28.191204 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 13 00:33:28.193781 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 13 00:33:28.200702 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 13 00:33:28.204754 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.204942 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:28.207594 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:33:28.214085 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:33:28.219249 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:33:28.219845 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:28.219995 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:28.220069 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.226192 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.227459 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:28.227612 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:28.227676 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:28.227739 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.233212 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.234289 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:28.237767 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 13 00:33:28.238346 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:28.238435 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:28.238543 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.249205 systemd[1]: Finished ensure-sysext.service.
Mar 13 00:33:28.255528 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 13 00:33:28.257443 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 13 00:33:28.257688 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 13 00:33:28.272701 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:33:28.273672 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:33:28.274668 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:33:28.296308 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 13 00:33:28.307873 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:33:28.308181 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:33:28.309247 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:33:28.309489 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:33:28.311209 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:33:28.318830 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 13 00:33:28.322843 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 13 00:33:28.344876 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 13 00:33:28.350963 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 13 00:33:28.380134 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 13 00:33:28.387858 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 13 00:33:28.399530 augenrules[1528]: No rules
Mar 13 00:33:28.401592 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 13 00:33:28.404311 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 13 00:33:28.491648 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 13 00:33:28.491718 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.491862 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:28.494138 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:33:28.505561 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:33:28.516583 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:33:28.517297 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:28.517350 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:28.517384 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 13 00:33:28.517400 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:28.530670 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:33:28.530979 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:33:28.556481 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Mar 13 00:33:28.556827 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 13 00:33:28.557024 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 13 00:33:28.587983 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:33:28.589293 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:33:28.599910 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 13 00:33:28.606605 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 13 00:33:28.607311 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:33:28.613173 systemd-networkd[1483]: lo: Link UP
Mar 13 00:33:28.619046 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 13 00:33:28.621716 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:33:28.621753 systemd-networkd[1483]: lo: Gained carrier
Mar 13 00:33:28.623520 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:33:28.624180 systemd[1]: Reached target time-set.target - System Time Set.
Mar 13 00:33:28.626340 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:33:28.634395 systemd-timesyncd[1499]: No network connectivity, watching for changes.
Mar 13 00:33:28.635043 systemd-networkd[1483]: Enumeration completed
Mar 13 00:33:28.636322 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:33:28.640353 systemd-networkd[1483]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 13 00:33:28.640361 systemd-networkd[1483]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 13 00:33:28.641595 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 13 00:33:28.645761 systemd-networkd[1483]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 13 00:33:28.645770 systemd-networkd[1483]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 13 00:33:28.646445 systemd-resolved[1484]: Positive Trust Anchors: Mar 13 00:33:28.646460 systemd-resolved[1484]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 13 00:33:28.646496 systemd-resolved[1484]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 13 00:33:28.648792 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 13 00:33:28.648937 systemd-networkd[1483]: eth0: Link UP Mar 13 00:33:28.649106 systemd-networkd[1483]: eth0: Gained carrier Mar 13 00:33:28.649130 systemd-networkd[1483]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 13 00:33:28.658396 systemd-networkd[1483]: eth1: Link UP Mar 13 00:33:28.659068 systemd-networkd[1483]: eth1: Gained carrier Mar 13 00:33:28.659093 systemd-networkd[1483]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 13 00:33:28.660825 systemd-resolved[1484]: Using system hostname 'ci-4459-2-4-n-7393fd8643'. Mar 13 00:33:28.667648 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 13 00:33:28.669774 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 13 00:33:28.670499 systemd[1]: Reached target network.target - Network. Mar 13 00:33:28.670847 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 13 00:33:28.671668 systemd[1]: Reached target sysinit.target - System Initialization. Mar 13 00:33:28.672144 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 13 00:33:28.674590 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 13 00:33:28.674955 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Mar 13 00:33:28.675543 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 13 00:33:28.675985 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 13 00:33:28.676382 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 13 00:33:28.676717 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 13 00:33:28.676745 systemd[1]: Reached target paths.target - Path Units. Mar 13 00:33:28.677084 systemd[1]: Reached target timers.target - Timer Units. 
Mar 13 00:33:28.679048 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 13 00:33:28.682089 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 13 00:33:28.683342 systemd-networkd[1483]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Mar 13 00:33:28.685350 systemd-timesyncd[1499]: Network configuration changed, trying to establish connection. Mar 13 00:33:28.686545 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 13 00:33:28.687125 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 13 00:33:28.687946 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 13 00:33:28.698861 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 13 00:33:28.699743 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 13 00:33:28.703775 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 13 00:33:28.705122 systemd[1]: Reached target sockets.target - Socket Units. Mar 13 00:33:28.705527 systemd[1]: Reached target basic.target - Basic System. Mar 13 00:33:28.705924 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 13 00:33:28.705948 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 13 00:33:28.708132 systemd[1]: Starting containerd.service - containerd container runtime... Mar 13 00:33:28.710053 systemd-networkd[1483]: eth0: DHCPv4 address 157.180.95.181/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 13 00:33:28.710851 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 13 00:33:28.713690 systemd-timesyncd[1499]: Network configuration changed, trying to establish connection. Mar 13 00:33:28.717479 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Mar 13 00:33:28.720452 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 13 00:33:28.725195 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 13 00:33:28.728456 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 13 00:33:28.730308 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 13 00:33:28.735566 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Mar 13 00:33:28.741073 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 13 00:33:28.746057 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 13 00:33:28.758150 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 13 00:33:28.766747 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 13 00:33:28.770256 jq[1574]: false Mar 13 00:33:28.778559 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 13 00:33:28.787070 coreos-metadata[1571]: Mar 13 00:33:28.785 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 13 00:33:28.788830 coreos-metadata[1571]: Mar 13 00:33:28.788 INFO Fetch successful Mar 13 00:33:28.788830 coreos-metadata[1571]: Mar 13 00:33:28.788 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Mar 13 00:33:28.791110 coreos-metadata[1571]: Mar 13 00:33:28.790 INFO Fetch successful Mar 13 00:33:28.791879 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 13 00:33:28.794245 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 13 00:33:28.796771 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. 
See cgroup-compat debug messages for details. Mar 13 00:33:28.802863 systemd[1]: Starting update-engine.service - Update Engine... Mar 13 00:33:28.811455 google_oslogin_nss_cache[1576]: oslogin_cache_refresh[1576]: Refreshing passwd entry cache Mar 13 00:33:28.810701 oslogin_cache_refresh[1576]: Refreshing passwd entry cache Mar 13 00:33:28.813429 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 13 00:33:28.817312 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 13 00:33:28.821516 google_oslogin_nss_cache[1576]: oslogin_cache_refresh[1576]: Failure getting users, quitting Mar 13 00:33:28.821508 oslogin_cache_refresh[1576]: Failure getting users, quitting Mar 13 00:33:28.821647 google_oslogin_nss_cache[1576]: oslogin_cache_refresh[1576]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 13 00:33:28.821647 google_oslogin_nss_cache[1576]: oslogin_cache_refresh[1576]: Refreshing group entry cache Mar 13 00:33:28.821535 oslogin_cache_refresh[1576]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 13 00:33:28.821613 oslogin_cache_refresh[1576]: Refreshing group entry cache Mar 13 00:33:28.829804 google_oslogin_nss_cache[1576]: oslogin_cache_refresh[1576]: Failure getting groups, quitting Mar 13 00:33:28.829804 google_oslogin_nss_cache[1576]: oslogin_cache_refresh[1576]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 13 00:33:28.826011 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 13 00:33:28.823434 oslogin_cache_refresh[1576]: Failure getting groups, quitting Mar 13 00:33:28.827764 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 13 00:33:28.823449 oslogin_cache_refresh[1576]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Mar 13 00:33:28.828074 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 13 00:33:28.831371 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Mar 13 00:33:28.832438 extend-filesystems[1575]: Found /dev/sda6 Mar 13 00:33:28.831653 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Mar 13 00:33:28.852108 extend-filesystems[1575]: Found /dev/sda9 Mar 13 00:33:28.865375 systemd[1]: motdgen.service: Deactivated successfully. Mar 13 00:33:28.866053 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 13 00:33:28.879098 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 13 00:33:28.879755 extend-filesystems[1575]: Checking size of /dev/sda9 Mar 13 00:33:28.884522 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 13 00:33:28.919078 tar[1600]: linux-amd64/LICENSE Mar 13 00:33:28.919691 tar[1600]: linux-amd64/helm Mar 13 00:33:28.923874 extend-filesystems[1575]: Resized partition /dev/sda9 Mar 13 00:33:28.931611 jq[1595]: true Mar 13 00:33:28.940531 extend-filesystems[1623]: resize2fs 1.47.3 (8-Jul-2025) Mar 13 00:33:28.951446 update_engine[1594]: I20260313 00:33:28.948861 1594 main.cc:92] Flatcar Update Engine starting Mar 13 00:33:28.953098 (ntainerd)[1620]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 13 00:33:28.954767 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:28.972459 sshd_keygen[1604]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 13 00:33:28.996404 jq[1624]: true Mar 13 00:33:29.004056 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks Mar 13 00:33:29.003035 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 13 00:33:29.006932 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Mar 13 00:33:29.007167 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:33:29.013948 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 13 00:33:29.020855 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:29.026979 dbus-daemon[1572]: [system] SELinux support is enabled Mar 13 00:33:29.027245 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 13 00:33:29.030848 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 13 00:33:29.030871 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 13 00:33:29.031959 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 13 00:33:29.031981 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 13 00:33:29.056191 systemd[1]: Started update-engine.service - Update Engine. Mar 13 00:33:29.058300 update_engine[1594]: I20260313 00:33:29.056865 1594 update_check_scheduler.cc:74] Next update check in 10m6s Mar 13 00:33:29.060405 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 13 00:33:29.086323 kernel: EDAC MC: Ver: 3.0.0 Mar 13 00:33:29.091618 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 13 00:33:29.094013 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 13 00:33:29.100026 systemd[1]: issuegen.service: Deactivated successfully. Mar 13 00:33:29.101069 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 13 00:33:29.103782 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Mar 13 00:33:29.134887 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 13 00:33:29.137005 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 13 00:33:29.139874 bash[1657]: Updated "/home/core/.ssh/authorized_keys" Mar 13 00:33:29.139444 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 13 00:33:29.139927 systemd[1]: Reached target getty.target - Login Prompts. Mar 13 00:33:29.142085 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 13 00:33:29.148412 systemd[1]: Starting sshkeys.service... Mar 13 00:33:29.216457 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 13 00:33:29.219865 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 13 00:33:29.251555 kernel: EXT4-fs (sda9): resized filesystem to 19393531 Mar 13 00:33:29.283511 extend-filesystems[1623]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 13 00:33:29.283511 extend-filesystems[1623]: old_desc_blocks = 1, new_desc_blocks = 10 Mar 13 00:33:29.283511 extend-filesystems[1623]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long. Mar 13 00:33:29.290491 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Mar 13 00:33:29.290545 extend-filesystems[1575]: Resized filesystem in /dev/sda9 Mar 13 00:33:29.290555 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 13 00:33:29.290861 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Mar 13 00:33:29.297075 kernel: Console: switching to colour dummy device 80x25 Mar 13 00:33:29.300624 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Mar 13 00:33:29.306169 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 13 00:33:29.306232 kernel: [drm] features: -context_init Mar 13 00:33:29.306269 coreos-metadata[1677]: Mar 13 00:33:29.304 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Mar 13 00:33:29.309144 coreos-metadata[1677]: Mar 13 00:33:29.309 INFO Fetch successful Mar 13 00:33:29.310274 unknown[1677]: wrote ssh authorized keys file for user: core Mar 13 00:33:29.314254 kernel: [drm] number of scanouts: 1 Mar 13 00:33:29.317549 kernel: [drm] number of cap sets: 0 Mar 13 00:33:29.322252 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Mar 13 00:33:29.329139 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 13 00:33:29.329298 kernel: Console: switching to colour frame buffer device 160x50 Mar 13 00:33:29.339406 containerd[1620]: time="2026-03-13T00:33:29Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 13 00:33:29.340261 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 13 00:33:29.352644 containerd[1620]: time="2026-03-13T00:33:29.352567211Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 13 00:33:29.363719 locksmithd[1645]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 13 00:33:29.373772 containerd[1620]: time="2026-03-13T00:33:29.373713719Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.43µs" Mar 13 00:33:29.373772 containerd[1620]: time="2026-03-13T00:33:29.373757659Z" level=info msg="loading plugin" 
id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 13 00:33:29.373772 containerd[1620]: time="2026-03-13T00:33:29.373777419Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 13 00:33:29.374037 containerd[1620]: time="2026-03-13T00:33:29.374017439Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 13 00:33:29.374058 containerd[1620]: time="2026-03-13T00:33:29.374038659Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 13 00:33:29.376243 containerd[1620]: time="2026-03-13T00:33:29.374074189Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376243 containerd[1620]: time="2026-03-13T00:33:29.374159959Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376243 containerd[1620]: time="2026-03-13T00:33:29.374184869Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376426 containerd[1620]: time="2026-03-13T00:33:29.376390941Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376426 containerd[1620]: time="2026-03-13T00:33:29.376420591Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376462 containerd[1620]: time="2026-03-13T00:33:29.376433211Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 13 
00:33:29.376462 containerd[1620]: time="2026-03-13T00:33:29.376446501Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376575 containerd[1620]: time="2026-03-13T00:33:29.376554161Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376815 containerd[1620]: time="2026-03-13T00:33:29.376783812Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376857 containerd[1620]: time="2026-03-13T00:33:29.376837382Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 13 00:33:29.376857 containerd[1620]: time="2026-03-13T00:33:29.376851982Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 13 00:33:29.376925 containerd[1620]: time="2026-03-13T00:33:29.376907582Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 13 00:33:29.383313 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 13 00:33:29.392661 containerd[1620]: time="2026-03-13T00:33:29.392604165Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 13 00:33:29.392761 containerd[1620]: time="2026-03-13T00:33:29.392748715Z" level=info msg="metadata content store policy set" policy=shared Mar 13 00:33:29.396507 containerd[1620]: time="2026-03-13T00:33:29.396458168Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396525748Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396544888Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396555388Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396567718Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396577138Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396588348Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 13 00:33:29.396599 containerd[1620]: time="2026-03-13T00:33:29.396599808Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 13 00:33:29.396707 containerd[1620]: time="2026-03-13T00:33:29.396616298Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 13 00:33:29.396707 containerd[1620]: 
time="2026-03-13T00:33:29.396625578Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 13 00:33:29.396707 containerd[1620]: time="2026-03-13T00:33:29.396633458Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 13 00:33:29.396707 containerd[1620]: time="2026-03-13T00:33:29.396644328Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 13 00:33:29.396804 containerd[1620]: time="2026-03-13T00:33:29.396778488Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 13 00:33:29.396822 containerd[1620]: time="2026-03-13T00:33:29.396804898Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 13 00:33:29.396844 containerd[1620]: time="2026-03-13T00:33:29.396825898Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 13 00:33:29.396844 containerd[1620]: time="2026-03-13T00:33:29.396842078Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 13 00:33:29.396875 containerd[1620]: time="2026-03-13T00:33:29.396851958Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 13 00:33:29.396875 containerd[1620]: time="2026-03-13T00:33:29.396861688Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 13 00:33:29.396875 containerd[1620]: time="2026-03-13T00:33:29.396872548Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 13 00:33:29.396939 containerd[1620]: time="2026-03-13T00:33:29.396881658Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 13 00:33:29.396939 containerd[1620]: time="2026-03-13T00:33:29.396904788Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 13 00:33:29.396939 containerd[1620]: time="2026-03-13T00:33:29.396914288Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 13 00:33:29.396939 containerd[1620]: time="2026-03-13T00:33:29.396923378Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 13 00:33:29.396995 containerd[1620]: time="2026-03-13T00:33:29.396967068Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 13 00:33:29.396995 containerd[1620]: time="2026-03-13T00:33:29.396979588Z" level=info msg="Start snapshots syncer" Mar 13 00:33:29.397033 containerd[1620]: time="2026-03-13T00:33:29.397013008Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 13 00:33:29.401945 containerd[1620]: time="2026-03-13T00:33:29.401865302Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 13 00:33:29.401945 containerd[1620]: time="2026-03-13T00:33:29.401953133Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 13 00:33:29.402168 containerd[1620]: time="2026-03-13T00:33:29.402013333Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 13 00:33:29.402188 containerd[1620]: time="2026-03-13T00:33:29.402178723Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 13 00:33:29.402305 containerd[1620]: time="2026-03-13T00:33:29.402196473Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 13 00:33:29.402305 containerd[1620]: time="2026-03-13T00:33:29.402206793Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 13 00:33:29.402305 containerd[1620]: time="2026-03-13T00:33:29.402215273Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 13 00:33:29.405329 containerd[1620]: time="2026-03-13T00:33:29.405280195Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 13 00:33:29.405329 containerd[1620]: time="2026-03-13T00:33:29.405322485Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 13 00:33:29.405329 containerd[1620]: time="2026-03-13T00:33:29.405335535Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405366025Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405375675Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405398125Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405446385Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405460505Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405468515Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405477145Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405484175Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 13 00:33:29.405497 containerd[1620]: time="2026-03-13T00:33:29.405492576Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 13 00:33:29.405934 containerd[1620]: time="2026-03-13T00:33:29.405509126Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 13 00:33:29.405934 containerd[1620]: time="2026-03-13T00:33:29.405526806Z" level=info msg="runtime interface created" Mar 13 00:33:29.405934 containerd[1620]: time="2026-03-13T00:33:29.405532106Z" level=info msg="created NRI interface" Mar 13 00:33:29.405934 containerd[1620]: time="2026-03-13T00:33:29.405539596Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 13 00:33:29.405934 containerd[1620]: time="2026-03-13T00:33:29.405555066Z" level=info msg="Connect containerd service" Mar 13 00:33:29.405934 containerd[1620]: time="2026-03-13T00:33:29.405572816Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 13 00:33:29.407376 systemd[1]: 
systemd-vconsole-setup.service: Deactivated successfully. Mar 13 00:33:29.407542 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 13 00:33:29.409728 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:29.417203 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 13 00:33:29.420321 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 13 00:33:29.421621 containerd[1620]: time="2026-03-13T00:33:29.420825718Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 13 00:33:29.434872 update-ssh-keys[1694]: Updated "/home/core/.ssh/authorized_keys" Mar 13 00:33:29.437987 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 13 00:33:29.446184 systemd[1]: Finished sshkeys.service. Mar 13 00:33:29.501327 systemd-logind[1591]: New seat seat0. Mar 13 00:33:29.504482 systemd-logind[1591]: Watching system buttons on /dev/input/event3 (Power Button) Mar 13 00:33:29.504510 systemd-logind[1591]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 13 00:33:29.504812 systemd[1]: Started systemd-logind.service - User Login Management. Mar 13 00:33:29.572511 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 13 00:33:29.628539 containerd[1620]: time="2026-03-13T00:33:29.628412881Z" level=info msg="Start subscribing containerd event" Mar 13 00:33:29.630383 containerd[1620]: time="2026-03-13T00:33:29.630321393Z" level=info msg="Start recovering state" Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630669063Z" level=info msg="Start event monitor" Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630685293Z" level=info msg="Start cni network conf syncer for default" Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630718293Z" level=info msg="Start streaming server" Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630733653Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630740313Z" level=info msg="runtime interface starting up..." Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630745653Z" level=info msg="starting plugins..." Mar 13 00:33:29.631085 containerd[1620]: time="2026-03-13T00:33:29.630759793Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 13 00:33:29.632216 containerd[1620]: time="2026-03-13T00:33:29.631058713Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 13 00:33:29.632216 containerd[1620]: time="2026-03-13T00:33:29.632102134Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 13 00:33:29.632532 systemd[1]: Started containerd.service - containerd container runtime. Mar 13 00:33:29.635972 containerd[1620]: time="2026-03-13T00:33:29.635297457Z" level=info msg="containerd successfully booted in 0.299189s" Mar 13 00:33:29.718475 tar[1600]: linux-amd64/README.md Mar 13 00:33:29.738362 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Mar 13 00:33:30.190511 systemd-networkd[1483]: eth0: Gained IPv6LL Mar 13 00:33:30.191589 systemd-timesyncd[1499]: Network configuration changed, trying to establish connection. Mar 13 00:33:30.194110 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 13 00:33:30.195059 systemd[1]: Reached target network-online.target - Network is Online. Mar 13 00:33:30.198780 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:30.201449 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 13 00:33:30.235429 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 13 00:33:30.318432 systemd-networkd[1483]: eth1: Gained IPv6LL Mar 13 00:33:30.319028 systemd-timesyncd[1499]: Network configuration changed, trying to establish connection. Mar 13 00:33:31.071354 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:31.072713 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 13 00:33:31.076487 systemd[1]: Startup finished in 3.144s (kernel) + 6.995s (initrd) + 5.474s (userspace) = 15.614s. Mar 13 00:33:31.082734 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:33:31.588190 kubelet[1739]: E0313 00:33:31.588112 1739 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:33:31.592816 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:33:31.593058 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:33:31.593677 systemd[1]: kubelet.service: Consumed 889ms CPU time, 254.4M memory peak. 
Mar 13 00:33:32.935590 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 13 00:33:32.939497 systemd[1]: Started sshd@0-157.180.95.181:22-4.153.228.146:41792.service - OpenSSH per-connection server daemon (4.153.228.146:41792). Mar 13 00:33:33.616530 sshd[1752]: Accepted publickey for core from 4.153.228.146 port 41792 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:33.620527 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:33.630587 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 13 00:33:33.632611 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 13 00:33:33.646063 systemd-logind[1591]: New session 1 of user core. Mar 13 00:33:33.662452 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 13 00:33:33.667531 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 13 00:33:33.686480 (systemd)[1757]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 13 00:33:33.690615 systemd-logind[1591]: New session c1 of user core. Mar 13 00:33:33.850363 systemd[1757]: Queued start job for default target default.target. Mar 13 00:33:33.860653 systemd[1757]: Created slice app.slice - User Application Slice. Mar 13 00:33:33.860691 systemd[1757]: Reached target paths.target - Paths. Mar 13 00:33:33.860838 systemd[1757]: Reached target timers.target - Timers. Mar 13 00:33:33.862463 systemd[1757]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 13 00:33:33.898927 systemd[1757]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 13 00:33:33.899422 systemd[1757]: Reached target sockets.target - Sockets. Mar 13 00:33:33.899573 systemd[1757]: Reached target basic.target - Basic System. Mar 13 00:33:33.899693 systemd[1]: Started user@500.service - User Manager for UID 500. 
Mar 13 00:33:33.900493 systemd[1757]: Reached target default.target - Main User Target. Mar 13 00:33:33.900558 systemd[1757]: Startup finished in 200ms. Mar 13 00:33:33.903486 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 13 00:33:34.284708 systemd[1]: Started sshd@1-157.180.95.181:22-4.153.228.146:41802.service - OpenSSH per-connection server daemon (4.153.228.146:41802). Mar 13 00:33:34.938267 sshd[1768]: Accepted publickey for core from 4.153.228.146 port 41802 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:34.940194 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:34.948330 systemd-logind[1591]: New session 2 of user core. Mar 13 00:33:34.962600 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 13 00:33:35.311370 sshd[1771]: Connection closed by 4.153.228.146 port 41802 Mar 13 00:33:35.313559 sshd-session[1768]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:35.323855 systemd[1]: sshd@1-157.180.95.181:22-4.153.228.146:41802.service: Deactivated successfully. Mar 13 00:33:35.327910 systemd[1]: session-2.scope: Deactivated successfully. Mar 13 00:33:35.329665 systemd-logind[1591]: Session 2 logged out. Waiting for processes to exit. Mar 13 00:33:35.332639 systemd-logind[1591]: Removed session 2. Mar 13 00:33:35.455049 systemd[1]: Started sshd@2-157.180.95.181:22-4.153.228.146:41804.service - OpenSSH per-connection server daemon (4.153.228.146:41804). Mar 13 00:33:36.106328 sshd[1777]: Accepted publickey for core from 4.153.228.146 port 41804 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:36.108086 sshd-session[1777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:36.113106 systemd-logind[1591]: New session 3 of user core. Mar 13 00:33:36.120425 systemd[1]: Started session-3.scope - Session 3 of User core. 
Mar 13 00:33:36.469065 sshd[1780]: Connection closed by 4.153.228.146 port 41804 Mar 13 00:33:36.470616 sshd-session[1777]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:36.474904 systemd[1]: sshd@2-157.180.95.181:22-4.153.228.146:41804.service: Deactivated successfully. Mar 13 00:33:36.476937 systemd[1]: session-3.scope: Deactivated successfully. Mar 13 00:33:36.479459 systemd-logind[1591]: Session 3 logged out. Waiting for processes to exit. Mar 13 00:33:36.481450 systemd-logind[1591]: Removed session 3. Mar 13 00:33:36.608573 systemd[1]: Started sshd@3-157.180.95.181:22-4.153.228.146:41808.service - OpenSSH per-connection server daemon (4.153.228.146:41808). Mar 13 00:33:37.280335 sshd[1786]: Accepted publickey for core from 4.153.228.146 port 41808 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:37.282152 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:37.288604 systemd-logind[1591]: New session 4 of user core. Mar 13 00:33:37.294491 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 13 00:33:37.645316 sshd[1789]: Connection closed by 4.153.228.146 port 41808 Mar 13 00:33:37.647497 sshd-session[1786]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:37.651740 systemd[1]: sshd@3-157.180.95.181:22-4.153.228.146:41808.service: Deactivated successfully. Mar 13 00:33:37.653913 systemd[1]: session-4.scope: Deactivated successfully. Mar 13 00:33:37.656134 systemd-logind[1591]: Session 4 logged out. Waiting for processes to exit. Mar 13 00:33:37.657628 systemd-logind[1591]: Removed session 4. Mar 13 00:33:37.776172 systemd[1]: Started sshd@4-157.180.95.181:22-4.153.228.146:41820.service - OpenSSH per-connection server daemon (4.153.228.146:41820). 
Mar 13 00:33:38.431615 sshd[1795]: Accepted publickey for core from 4.153.228.146 port 41820 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:38.434501 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:38.441831 systemd-logind[1591]: New session 5 of user core. Mar 13 00:33:38.450753 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 13 00:33:38.690046 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 13 00:33:38.690466 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:38.708412 sudo[1799]: pam_unix(sudo:session): session closed for user root Mar 13 00:33:38.828772 sshd[1798]: Connection closed by 4.153.228.146 port 41820 Mar 13 00:33:38.830519 sshd-session[1795]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:38.835664 systemd-logind[1591]: Session 5 logged out. Waiting for processes to exit. Mar 13 00:33:38.836572 systemd[1]: sshd@4-157.180.95.181:22-4.153.228.146:41820.service: Deactivated successfully. Mar 13 00:33:38.839000 systemd[1]: session-5.scope: Deactivated successfully. Mar 13 00:33:38.841811 systemd-logind[1591]: Removed session 5. Mar 13 00:33:38.963084 systemd[1]: Started sshd@5-157.180.95.181:22-4.153.228.146:37942.service - OpenSSH per-connection server daemon (4.153.228.146:37942). Mar 13 00:33:39.617788 sshd[1805]: Accepted publickey for core from 4.153.228.146 port 37942 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:39.619259 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:39.623816 systemd-logind[1591]: New session 6 of user core. Mar 13 00:33:39.629349 systemd[1]: Started session-6.scope - Session 6 of User core. 
Mar 13 00:33:39.862639 sudo[1810]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 13 00:33:39.862935 sudo[1810]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:39.868139 sudo[1810]: pam_unix(sudo:session): session closed for user root Mar 13 00:33:39.874498 sudo[1809]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 13 00:33:39.874794 sudo[1809]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:39.886199 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 13 00:33:39.937660 augenrules[1832]: No rules Mar 13 00:33:39.939589 systemd[1]: audit-rules.service: Deactivated successfully. Mar 13 00:33:39.939864 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 13 00:33:39.941263 sudo[1809]: pam_unix(sudo:session): session closed for user root Mar 13 00:33:40.061137 sshd[1808]: Connection closed by 4.153.228.146 port 37942 Mar 13 00:33:40.062122 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:40.066850 systemd[1]: sshd@5-157.180.95.181:22-4.153.228.146:37942.service: Deactivated successfully. Mar 13 00:33:40.069011 systemd[1]: session-6.scope: Deactivated successfully. Mar 13 00:33:40.070402 systemd-logind[1591]: Session 6 logged out. Waiting for processes to exit. Mar 13 00:33:40.071710 systemd-logind[1591]: Removed session 6. Mar 13 00:33:40.194437 systemd[1]: Started sshd@6-157.180.95.181:22-4.153.228.146:37950.service - OpenSSH per-connection server daemon (4.153.228.146:37950). 
Mar 13 00:33:40.847065 sshd[1841]: Accepted publickey for core from 4.153.228.146 port 37950 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:40.847715 sshd-session[1841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:40.852284 systemd-logind[1591]: New session 7 of user core. Mar 13 00:33:40.860396 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 13 00:33:41.089410 sudo[1845]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 13 00:33:41.089786 sudo[1845]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:41.392550 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 13 00:33:41.409700 (dockerd)[1863]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 13 00:33:41.621920 dockerd[1863]: time="2026-03-13T00:33:41.621851313Z" level=info msg="Starting up" Mar 13 00:33:41.622677 dockerd[1863]: time="2026-03-13T00:33:41.622649113Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 13 00:33:41.627994 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 13 00:33:41.630860 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:41.645798 dockerd[1863]: time="2026-03-13T00:33:41.644348831Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 13 00:33:41.678194 systemd[1]: var-lib-docker-metacopy\x2dcheck3281838696-merged.mount: Deactivated successfully. Mar 13 00:33:41.706684 dockerd[1863]: time="2026-03-13T00:33:41.706633053Z" level=info msg="Loading containers: start." 
Mar 13 00:33:41.720957 kernel: Initializing XFRM netlink socket Mar 13 00:33:41.802967 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:41.811522 (kubelet)[1935]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:33:41.851291 kubelet[1935]: E0313 00:33:41.851237 1935 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:33:41.856603 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:33:41.856774 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:33:41.857337 systemd[1]: kubelet.service: Consumed 163ms CPU time, 108.1M memory peak. Mar 13 00:33:41.934272 systemd-timesyncd[1499]: Network configuration changed, trying to establish connection. Mar 13 00:33:41.975699 systemd-networkd[1483]: docker0: Link UP Mar 13 00:33:41.980065 dockerd[1863]: time="2026-03-13T00:33:41.980022931Z" level=info msg="Loading containers: done." 
Mar 13 00:33:41.996082 dockerd[1863]: time="2026-03-13T00:33:41.996044274Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 13 00:33:41.996217 dockerd[1863]: time="2026-03-13T00:33:41.996110354Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 13 00:33:41.996217 dockerd[1863]: time="2026-03-13T00:33:41.996179424Z" level=info msg="Initializing buildkit" Mar 13 00:33:42.023651 dockerd[1863]: time="2026-03-13T00:33:42.023613277Z" level=info msg="Completed buildkit initialization" Mar 13 00:33:42.029255 dockerd[1863]: time="2026-03-13T00:33:42.029215922Z" level=info msg="Daemon has completed initialization" Mar 13 00:33:42.029474 dockerd[1863]: time="2026-03-13T00:33:42.029366022Z" level=info msg="API listen on /run/docker.sock" Mar 13 00:33:42.029480 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 13 00:33:42.830450 systemd-resolved[1484]: Clock change detected. Flushing caches. Mar 13 00:33:42.830705 systemd-timesyncd[1499]: Contacted time server 172.104.154.182:123 (2.flatcar.pool.ntp.org). Mar 13 00:33:42.830761 systemd-timesyncd[1499]: Initial clock synchronization to Fri 2026-03-13 00:33:42.830049 UTC. Mar 13 00:33:43.031297 containerd[1620]: time="2026-03-13T00:33:43.031236295Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\"" Mar 13 00:33:43.682890 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1777996092.mount: Deactivated successfully. 
Mar 13 00:33:44.840587 containerd[1620]: time="2026-03-13T00:33:44.840532882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:44.842877 containerd[1620]: time="2026-03-13T00:33:44.842843594Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=27696567" Mar 13 00:33:44.844203 containerd[1620]: time="2026-03-13T00:33:44.844163385Z" level=info msg="ImageCreate event name:\"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:44.846540 containerd[1620]: time="2026-03-13T00:33:44.846480977Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:44.847445 containerd[1620]: time="2026-03-13T00:33:44.847140677Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"27693066\" in 1.815848512s" Mar 13 00:33:44.847445 containerd[1620]: time="2026-03-13T00:33:44.847170967Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\"" Mar 13 00:33:44.847903 containerd[1620]: time="2026-03-13T00:33:44.847878948Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\"" Mar 13 00:33:46.051541 containerd[1620]: time="2026-03-13T00:33:46.051472891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:46.052942 containerd[1620]: time="2026-03-13T00:33:46.052673252Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=21450722" Mar 13 00:33:46.054069 containerd[1620]: time="2026-03-13T00:33:46.054037183Z" level=info msg="ImageCreate event name:\"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:46.056729 containerd[1620]: time="2026-03-13T00:33:46.056682235Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:46.057872 containerd[1620]: time="2026-03-13T00:33:46.057822436Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"23142311\" in 1.209918418s" Mar 13 00:33:46.057999 containerd[1620]: time="2026-03-13T00:33:46.057984526Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\"" Mar 13 00:33:46.058549 containerd[1620]: time="2026-03-13T00:33:46.058517356Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\"" Mar 13 00:33:47.064962 containerd[1620]: time="2026-03-13T00:33:47.063921394Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.065530 containerd[1620]: time="2026-03-13T00:33:47.065032985Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=15548451" Mar 13 00:33:47.066297 containerd[1620]: time="2026-03-13T00:33:47.066222356Z" level=info msg="ImageCreate event name:\"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.069461 containerd[1620]: time="2026-03-13T00:33:47.069403939Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.070792 containerd[1620]: time="2026-03-13T00:33:47.070654630Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"17240058\" in 1.012105453s" Mar 13 00:33:47.070792 containerd[1620]: time="2026-03-13T00:33:47.070691180Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\"" Mar 13 00:33:47.072028 containerd[1620]: time="2026-03-13T00:33:47.072006291Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\"" Mar 13 00:33:48.116588 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1747202270.mount: Deactivated successfully. 
Mar 13 00:33:48.429050 containerd[1620]: time="2026-03-13T00:33:48.428749951Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:48.429896 containerd[1620]: time="2026-03-13T00:33:48.429864172Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=25685340" Mar 13 00:33:48.430574 containerd[1620]: time="2026-03-13T00:33:48.430532143Z" level=info msg="ImageCreate event name:\"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:48.432188 containerd[1620]: time="2026-03-13T00:33:48.432165424Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:48.432841 containerd[1620]: time="2026-03-13T00:33:48.432676314Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"25684331\" in 1.360523983s" Mar 13 00:33:48.432841 containerd[1620]: time="2026-03-13T00:33:48.432709044Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\"" Mar 13 00:33:48.433363 containerd[1620]: time="2026-03-13T00:33:48.433207245Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\"" Mar 13 00:33:48.944681 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount589870835.mount: Deactivated successfully. 
Mar 13 00:33:49.837649 containerd[1620]: time="2026-03-13T00:33:49.837584335Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:49.838854 containerd[1620]: time="2026-03-13T00:33:49.838623176Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=23556642" Mar 13 00:33:49.839713 containerd[1620]: time="2026-03-13T00:33:49.839685716Z" level=info msg="ImageCreate event name:\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:49.841876 containerd[1620]: time="2026-03-13T00:33:49.841846668Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:49.842728 containerd[1620]: time="2026-03-13T00:33:49.842704219Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"23553139\" in 1.409471014s" Mar 13 00:33:49.842785 containerd[1620]: time="2026-03-13T00:33:49.842731709Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\"" Mar 13 00:33:49.843274 containerd[1620]: time="2026-03-13T00:33:49.843205819Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 13 00:33:50.324314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount73192721.mount: Deactivated successfully. 
Mar 13 00:33:50.329590 containerd[1620]: time="2026-03-13T00:33:50.329510235Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.331134 containerd[1620]: time="2026-03-13T00:33:50.330992106Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321240" Mar 13 00:33:50.331483 containerd[1620]: time="2026-03-13T00:33:50.331403436Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.334159 containerd[1620]: time="2026-03-13T00:33:50.334049468Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.335776 containerd[1620]: time="2026-03-13T00:33:50.335252619Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 492.02214ms" Mar 13 00:33:50.335776 containerd[1620]: time="2026-03-13T00:33:50.335307399Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Mar 13 00:33:50.335929 containerd[1620]: time="2026-03-13T00:33:50.335787690Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\"" Mar 13 00:33:50.846269 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1865335502.mount: Deactivated successfully. 
Mar 13 00:33:51.551257 containerd[1620]: time="2026-03-13T00:33:51.550312022Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:51.551257 containerd[1620]: time="2026-03-13T00:33:51.551226732Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=23630398" Mar 13 00:33:51.551846 containerd[1620]: time="2026-03-13T00:33:51.551815453Z" level=info msg="ImageCreate event name:\"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:51.553560 containerd[1620]: time="2026-03-13T00:33:51.553535844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:51.554242 containerd[1620]: time="2026-03-13T00:33:51.554212985Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"23641797\" in 1.218401215s" Mar 13 00:33:51.554284 containerd[1620]: time="2026-03-13T00:33:51.554242685Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\"" Mar 13 00:33:52.518201 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 13 00:33:52.524660 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:52.533874 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 13 00:33:52.534199 systemd[1]: kubelet.service: Failed with result 'signal'. 
Mar 13 00:33:52.534649 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:52.537349 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:52.570931 systemd[1]: Reload requested from client PID 2311 ('systemctl') (unit session-7.scope)... Mar 13 00:33:52.570953 systemd[1]: Reloading... Mar 13 00:33:52.712331 zram_generator::config[2355]: No configuration found. Mar 13 00:33:52.939428 systemd[1]: Reloading finished in 367 ms. Mar 13 00:33:53.008928 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 13 00:33:53.009067 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 13 00:33:53.009810 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:53.009870 systemd[1]: kubelet.service: Consumed 134ms CPU time, 98.2M memory peak. Mar 13 00:33:53.012172 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:53.175738 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:53.183554 (kubelet)[2409]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:33:53.219544 kubelet[2409]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 00:33:53.345719 kubelet[2409]: I0313 00:33:53.345664 2409 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 13 00:33:53.347163 kubelet[2409]: I0313 00:33:53.345841 2409 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:33:53.347163 kubelet[2409]: I0313 00:33:53.345867 2409 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 13 00:33:53.347163 kubelet[2409]: I0313 00:33:53.345872 2409 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 13 00:33:53.347163 kubelet[2409]: I0313 00:33:53.346073 2409 server.go:951] "Client rotation is on, will bootstrap in background" Mar 13 00:33:53.374407 kubelet[2409]: I0313 00:33:53.374370 2409 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:33:53.376451 kubelet[2409]: E0313 00:33:53.376422 2409 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://157.180.95.181:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.95.181:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 13 00:33:53.384028 kubelet[2409]: I0313 00:33:53.383495 2409 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:33:53.390547 kubelet[2409]: I0313 00:33:53.390503 2409 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 13 00:33:53.392313 kubelet[2409]: I0313 00:33:53.392258 2409 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:33:53.392454 kubelet[2409]: I0313 00:33:53.392295 2409 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-7393fd8643","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:33:53.392454 kubelet[2409]: I0313 00:33:53.392446 2409 topology_manager.go:143] "Creating topology manager with none policy" Mar 13 
00:33:53.392454 kubelet[2409]: I0313 00:33:53.392454 2409 container_manager_linux.go:308] "Creating device plugin manager" Mar 13 00:33:53.392672 kubelet[2409]: I0313 00:33:53.392541 2409 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 13 00:33:53.395573 kubelet[2409]: I0313 00:33:53.395137 2409 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 13 00:33:53.395573 kubelet[2409]: I0313 00:33:53.395342 2409 kubelet.go:482] "Attempting to sync node with API server" Mar 13 00:33:53.395573 kubelet[2409]: I0313 00:33:53.395358 2409 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:33:53.396608 kubelet[2409]: I0313 00:33:53.395740 2409 kubelet.go:394] "Adding apiserver pod source" Mar 13 00:33:53.396608 kubelet[2409]: I0313 00:33:53.395754 2409 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:33:53.399300 kubelet[2409]: I0313 00:33:53.398868 2409 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:33:53.403886 kubelet[2409]: I0313 00:33:53.400783 2409 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:33:53.403886 kubelet[2409]: I0313 00:33:53.400809 2409 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 13 00:33:53.403886 kubelet[2409]: W0313 00:33:53.400862 2409 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 13 00:33:53.403886 kubelet[2409]: I0313 00:33:53.403446 2409 server.go:1257] "Started kubelet" Mar 13 00:33:53.414886 kubelet[2409]: I0313 00:33:53.412327 2409 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 13 00:33:53.418414 kubelet[2409]: E0313 00:33:53.415204 2409 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.95.181:6443/api/v1/namespaces/default/events\": dial tcp 157.180.95.181:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-7393fd8643.189c3f5fab2ce9d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-7393fd8643,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-7393fd8643,},FirstTimestamp:2026-03-13 00:33:53.403423185 +0000 UTC m=+0.214582609,LastTimestamp:2026-03-13 00:33:53.403423185 +0000 UTC m=+0.214582609,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-7393fd8643,}" Mar 13 00:33:53.420125 kubelet[2409]: I0313 00:33:53.419074 2409 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:33:53.420284 kubelet[2409]: I0313 00:33:53.420273 2409 server.go:317] "Adding debug handlers to kubelet server" Mar 13 00:33:53.424649 kubelet[2409]: I0313 00:33:53.423778 2409 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 13 00:33:53.424649 kubelet[2409]: E0313 00:33:53.423993 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:53.427717 kubelet[2409]: I0313 00:33:53.426698 2409 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:33:53.427717 kubelet[2409]: I0313 00:33:53.426761 2409 server_v1.go:49] "podresources" 
method="list" useActivePods=true Mar 13 00:33:53.427717 kubelet[2409]: I0313 00:33:53.426935 2409 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:33:53.427717 kubelet[2409]: I0313 00:33:53.427089 2409 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 00:33:53.427717 kubelet[2409]: I0313 00:33:53.427454 2409 reconciler.go:29] "Reconciler: start to sync state" Mar 13 00:33:53.428003 kubelet[2409]: I0313 00:33:53.427977 2409 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:33:53.433399 kubelet[2409]: E0313 00:33:53.432708 2409 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.95.181:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-7393fd8643?timeout=10s\": dial tcp 157.180.95.181:6443: connect: connection refused" interval="200ms" Mar 13 00:33:53.433399 kubelet[2409]: I0313 00:33:53.432955 2409 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:33:53.433399 kubelet[2409]: I0313 00:33:53.433033 2409 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 00:33:53.437188 kubelet[2409]: I0313 00:33:53.436142 2409 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:33:53.446480 kubelet[2409]: I0313 00:33:53.446280 2409 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 13 00:33:53.448709 kubelet[2409]: I0313 00:33:53.448663 2409 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 13 00:33:53.448709 kubelet[2409]: I0313 00:33:53.448690 2409 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 13 00:33:53.448709 kubelet[2409]: I0313 00:33:53.448715 2409 kubelet.go:2501] "Starting kubelet main sync loop" Mar 13 00:33:53.448853 kubelet[2409]: E0313 00:33:53.448773 2409 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:33:53.466151 kubelet[2409]: E0313 00:33:53.465882 2409 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:33:53.477668 kubelet[2409]: I0313 00:33:53.477022 2409 cpu_manager.go:225] "Starting" policy="none" Mar 13 00:33:53.477668 kubelet[2409]: I0313 00:33:53.477040 2409 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 00:33:53.477668 kubelet[2409]: I0313 00:33:53.477060 2409 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 13 00:33:53.492009 kubelet[2409]: I0313 00:33:53.491607 2409 policy_none.go:50] "Start" Mar 13 00:33:53.492009 kubelet[2409]: I0313 00:33:53.491642 2409 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 13 00:33:53.492009 kubelet[2409]: I0313 00:33:53.491656 2409 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 13 00:33:53.493403 kubelet[2409]: I0313 00:33:53.493373 2409 policy_none.go:44] "Start" Mar 13 00:33:53.498563 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 13 00:33:53.508887 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 13 00:33:53.512084 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 13 00:33:53.523079 kubelet[2409]: E0313 00:33:53.523046 2409 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:33:53.524503 kubelet[2409]: E0313 00:33:53.524048 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:53.524503 kubelet[2409]: I0313 00:33:53.524277 2409 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 13 00:33:53.525002 kubelet[2409]: I0313 00:33:53.524286 2409 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:33:53.526295 kubelet[2409]: E0313 00:33:53.526273 2409 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:33:53.526346 kubelet[2409]: E0313 00:33:53.526304 2409 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:53.526374 kubelet[2409]: I0313 00:33:53.526362 2409 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 13 00:33:53.561265 systemd[1]: Created slice kubepods-burstable-pod6e7dd5f523e82d58e4391e484d4a43f5.slice - libcontainer container kubepods-burstable-pod6e7dd5f523e82d58e4391e484d4a43f5.slice. Mar 13 00:33:53.582411 kubelet[2409]: E0313 00:33:53.582359 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.585897 systemd[1]: Created slice kubepods-burstable-pod76807beaf181ab0c9f75be8d8d850de3.slice - libcontainer container kubepods-burstable-pod76807beaf181ab0c9f75be8d8d850de3.slice. 
Mar 13 00:33:53.588479 kubelet[2409]: E0313 00:33:53.588434 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.603097 systemd[1]: Created slice kubepods-burstable-pod5b8b1a67ae94557ca60791d4e96c593f.slice - libcontainer container kubepods-burstable-pod5b8b1a67ae94557ca60791d4e96c593f.slice. Mar 13 00:33:53.606075 kubelet[2409]: E0313 00:33:53.606029 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629015 kubelet[2409]: I0313 00:33:53.628910 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629459 kubelet[2409]: I0313 00:33:53.629080 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629546 kubelet[2409]: I0313 00:33:53.629467 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 
00:33:53.629546 kubelet[2409]: I0313 00:33:53.629493 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629546 kubelet[2409]: I0313 00:33:53.629525 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5b8b1a67ae94557ca60791d4e96c593f-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-7393fd8643\" (UID: \"5b8b1a67ae94557ca60791d4e96c593f\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629672 kubelet[2409]: I0313 00:33:53.629547 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6e7dd5f523e82d58e4391e484d4a43f5-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" (UID: \"6e7dd5f523e82d58e4391e484d4a43f5\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629672 kubelet[2409]: I0313 00:33:53.629572 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629672 kubelet[2409]: I0313 00:33:53.629595 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6e7dd5f523e82d58e4391e484d4a43f5-k8s-certs\") pod 
\"kube-apiserver-ci-4459-2-4-n-7393fd8643\" (UID: \"6e7dd5f523e82d58e4391e484d4a43f5\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629672 kubelet[2409]: I0313 00:33:53.629618 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6e7dd5f523e82d58e4391e484d4a43f5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" (UID: \"6e7dd5f523e82d58e4391e484d4a43f5\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.629672 kubelet[2409]: I0313 00:33:53.629358 2409 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.630399 kubelet[2409]: E0313 00:33:53.630351 2409 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://157.180.95.181:6443/api/v1/nodes\": dial tcp 157.180.95.181:6443: connect: connection refused" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.634130 kubelet[2409]: E0313 00:33:53.634056 2409 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.95.181:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-7393fd8643?timeout=10s\": dial tcp 157.180.95.181:6443: connect: connection refused" interval="400ms" Mar 13 00:33:53.834663 kubelet[2409]: I0313 00:33:53.834184 2409 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.835333 kubelet[2409]: E0313 00:33:53.835274 2409 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://157.180.95.181:6443/api/v1/nodes\": dial tcp 157.180.95.181:6443: connect: connection refused" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:53.886137 containerd[1620]: time="2026-03-13T00:33:53.886060967Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-7393fd8643,Uid:6e7dd5f523e82d58e4391e484d4a43f5,Namespace:kube-system,Attempt:0,}" Mar 13 00:33:53.896298 containerd[1620]: time="2026-03-13T00:33:53.896250816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-7393fd8643,Uid:76807beaf181ab0c9f75be8d8d850de3,Namespace:kube-system,Attempt:0,}" Mar 13 00:33:53.909684 containerd[1620]: time="2026-03-13T00:33:53.909290577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-7393fd8643,Uid:5b8b1a67ae94557ca60791d4e96c593f,Namespace:kube-system,Attempt:0,}" Mar 13 00:33:54.035034 kubelet[2409]: E0313 00:33:54.034946 2409 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.95.181:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-7393fd8643?timeout=10s\": dial tcp 157.180.95.181:6443: connect: connection refused" interval="800ms" Mar 13 00:33:54.238133 kubelet[2409]: I0313 00:33:54.238040 2409 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:54.238771 kubelet[2409]: E0313 00:33:54.238611 2409 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://157.180.95.181:6443/api/v1/nodes\": dial tcp 157.180.95.181:6443: connect: connection refused" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:54.390076 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount588656672.mount: Deactivated successfully. 
Mar 13 00:33:54.400690 containerd[1620]: time="2026-03-13T00:33:54.400616226Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:54.405367 containerd[1620]: time="2026-03-13T00:33:54.405295940Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Mar 13 00:33:54.406364 containerd[1620]: time="2026-03-13T00:33:54.406314441Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:54.408076 containerd[1620]: time="2026-03-13T00:33:54.407983242Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:54.409787 containerd[1620]: time="2026-03-13T00:33:54.409737814Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:54.410960 containerd[1620]: time="2026-03-13T00:33:54.410871615Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 13 00:33:54.412133 containerd[1620]: time="2026-03-13T00:33:54.411735405Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 13 00:33:54.414264 containerd[1620]: time="2026-03-13T00:33:54.413049236Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 
00:33:54.415389 containerd[1620]: time="2026-03-13T00:33:54.415333248Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 517.424771ms" Mar 13 00:33:54.416514 containerd[1620]: time="2026-03-13T00:33:54.416476499Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 528.37218ms" Mar 13 00:33:54.422043 containerd[1620]: time="2026-03-13T00:33:54.421965894Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 510.178705ms" Mar 13 00:33:54.467979 containerd[1620]: time="2026-03-13T00:33:54.467791612Z" level=info msg="connecting to shim 0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd" address="unix:///run/containerd/s/7a18c8e172315a4fd36af06976cfc2be4081a82c6bea9e605b3180beede49f52" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:33:54.472700 containerd[1620]: time="2026-03-13T00:33:54.472660016Z" level=info msg="connecting to shim eba1da6b241ca24069929c47905ca1582ab016e11e96c072df45cfca83eb427d" address="unix:///run/containerd/s/18a43a1e3362e6b88a3daeaac4bad170b5631da2232f895c5e3d446f1e925765" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:33:54.480309 containerd[1620]: time="2026-03-13T00:33:54.480270152Z" level=info msg="connecting to shim 
6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405" address="unix:///run/containerd/s/1eb4258441f488811d405c6f3ffac03d853ee99c4e915ef71e1ac2903a1f2685" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:33:54.498257 systemd[1]: Started cri-containerd-eba1da6b241ca24069929c47905ca1582ab016e11e96c072df45cfca83eb427d.scope - libcontainer container eba1da6b241ca24069929c47905ca1582ab016e11e96c072df45cfca83eb427d. Mar 13 00:33:54.508297 systemd[1]: Started cri-containerd-0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd.scope - libcontainer container 0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd. Mar 13 00:33:54.529353 systemd[1]: Started cri-containerd-6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405.scope - libcontainer container 6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405. Mar 13 00:33:54.566235 containerd[1620]: time="2026-03-13T00:33:54.566187714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-7393fd8643,Uid:6e7dd5f523e82d58e4391e484d4a43f5,Namespace:kube-system,Attempt:0,} returns sandbox id \"eba1da6b241ca24069929c47905ca1582ab016e11e96c072df45cfca83eb427d\"" Mar 13 00:33:54.572236 containerd[1620]: time="2026-03-13T00:33:54.572205989Z" level=info msg="CreateContainer within sandbox \"eba1da6b241ca24069929c47905ca1582ab016e11e96c072df45cfca83eb427d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 13 00:33:54.583898 containerd[1620]: time="2026-03-13T00:33:54.583864089Z" level=info msg="Container 6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:33:54.592182 containerd[1620]: time="2026-03-13T00:33:54.592134006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-7393fd8643,Uid:76807beaf181ab0c9f75be8d8d850de3,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd\"" Mar 13 00:33:54.594603 containerd[1620]: time="2026-03-13T00:33:54.594557758Z" level=info msg="CreateContainer within sandbox \"eba1da6b241ca24069929c47905ca1582ab016e11e96c072df45cfca83eb427d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9\"" Mar 13 00:33:54.595655 containerd[1620]: time="2026-03-13T00:33:54.595618928Z" level=info msg="StartContainer for \"6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9\"" Mar 13 00:33:54.596517 containerd[1620]: time="2026-03-13T00:33:54.596478179Z" level=info msg="connecting to shim 6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9" address="unix:///run/containerd/s/18a43a1e3362e6b88a3daeaac4bad170b5631da2232f895c5e3d446f1e925765" protocol=ttrpc version=3 Mar 13 00:33:54.599857 containerd[1620]: time="2026-03-13T00:33:54.599145511Z" level=info msg="CreateContainer within sandbox \"0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 13 00:33:54.615789 containerd[1620]: time="2026-03-13T00:33:54.615742945Z" level=info msg="Container f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:33:54.620410 systemd[1]: Started cri-containerd-6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9.scope - libcontainer container 6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9. 
Mar 13 00:33:54.623226 containerd[1620]: time="2026-03-13T00:33:54.623087071Z" level=info msg="CreateContainer within sandbox \"0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd\"" Mar 13 00:33:54.623831 containerd[1620]: time="2026-03-13T00:33:54.623804522Z" level=info msg="StartContainer for \"f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd\"" Mar 13 00:33:54.625921 containerd[1620]: time="2026-03-13T00:33:54.625887134Z" level=info msg="connecting to shim f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd" address="unix:///run/containerd/s/7a18c8e172315a4fd36af06976cfc2be4081a82c6bea9e605b3180beede49f52" protocol=ttrpc version=3 Mar 13 00:33:54.630400 containerd[1620]: time="2026-03-13T00:33:54.629631797Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-7393fd8643,Uid:5b8b1a67ae94557ca60791d4e96c593f,Namespace:kube-system,Attempt:0,} returns sandbox id \"6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405\"" Mar 13 00:33:54.637588 containerd[1620]: time="2026-03-13T00:33:54.637512573Z" level=info msg="CreateContainer within sandbox \"6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 13 00:33:54.645128 containerd[1620]: time="2026-03-13T00:33:54.644956430Z" level=info msg="Container 88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:33:54.656247 systemd[1]: Started cri-containerd-f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd.scope - libcontainer container f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd. 
Mar 13 00:33:54.664688 containerd[1620]: time="2026-03-13T00:33:54.664651066Z" level=info msg="CreateContainer within sandbox \"6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e\"" Mar 13 00:33:54.665617 containerd[1620]: time="2026-03-13T00:33:54.665575627Z" level=info msg="StartContainer for \"88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e\"" Mar 13 00:33:54.667376 containerd[1620]: time="2026-03-13T00:33:54.667345778Z" level=info msg="connecting to shim 88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e" address="unix:///run/containerd/s/1eb4258441f488811d405c6f3ffac03d853ee99c4e915ef71e1ac2903a1f2685" protocol=ttrpc version=3 Mar 13 00:33:54.700373 systemd[1]: Started cri-containerd-88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e.scope - libcontainer container 88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e. 
Mar 13 00:33:54.707602 containerd[1620]: time="2026-03-13T00:33:54.707557042Z" level=info msg="StartContainer for \"6e9707d9c08bec3e480506e4491631c439d4950789348a7ab2ef9a2dbdaccfc9\" returns successfully" Mar 13 00:33:54.755893 containerd[1620]: time="2026-03-13T00:33:54.755060351Z" level=info msg="StartContainer for \"f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd\" returns successfully" Mar 13 00:33:54.806167 containerd[1620]: time="2026-03-13T00:33:54.805088453Z" level=info msg="StartContainer for \"88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e\" returns successfully" Mar 13 00:33:55.042928 kubelet[2409]: I0313 00:33:55.042807 2409 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:55.478892 kubelet[2409]: E0313 00:33:55.478845 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:55.483861 kubelet[2409]: E0313 00:33:55.483820 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:55.488447 kubelet[2409]: E0313 00:33:55.488409 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:55.637447 kubelet[2409]: E0313 00:33:55.637366 2409 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:55.735741 kubelet[2409]: I0313 00:33:55.735602 2409 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:55.735741 kubelet[2409]: E0313 00:33:55.735652 2409 kubelet_node_status.go:474] "Error 
updating node status, will retry" err="error getting node \"ci-4459-2-4-n-7393fd8643\": node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:55.745740 kubelet[2409]: E0313 00:33:55.745685 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:55.845991 kubelet[2409]: E0313 00:33:55.845931 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:55.946781 kubelet[2409]: E0313 00:33:55.946720 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.047714 kubelet[2409]: E0313 00:33:56.047567 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.147749 kubelet[2409]: E0313 00:33:56.147650 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.248641 kubelet[2409]: E0313 00:33:56.248597 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.349489 kubelet[2409]: E0313 00:33:56.349430 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.449752 kubelet[2409]: E0313 00:33:56.449695 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.488049 kubelet[2409]: E0313 00:33:56.487983 2409 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:56.488545 kubelet[2409]: E0313 00:33:56.488489 2409 kubelet.go:3336] "No need to create a mirror pod, 
since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-7393fd8643\" not found" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:56.550977 kubelet[2409]: E0313 00:33:56.550740 2409 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:56.625677 kubelet[2409]: I0313 00:33:56.625033 2409 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:56.637405 kubelet[2409]: I0313 00:33:56.637315 2409 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:56.643840 kubelet[2409]: I0313 00:33:56.643782 2409 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:57.400124 kubelet[2409]: I0313 00:33:57.400043 2409 apiserver.go:52] "Watching apiserver" Mar 13 00:33:57.428058 kubelet[2409]: I0313 00:33:57.427979 2409 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 00:33:57.488280 kubelet[2409]: I0313 00:33:57.488232 2409 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:57.495457 kubelet[2409]: E0313 00:33:57.495229 2409 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:58.153576 systemd[1]: Reload requested from client PID 2697 ('systemctl') (unit session-7.scope)... Mar 13 00:33:58.153608 systemd[1]: Reloading... Mar 13 00:33:58.254270 zram_generator::config[2741]: No configuration found. Mar 13 00:33:58.492135 systemd[1]: Reloading finished in 337 ms. Mar 13 00:33:58.522237 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 13 00:33:58.542518 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 00:33:58.542798 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:58.542859 systemd[1]: kubelet.service: Consumed 599ms CPU time, 125.3M memory peak. Mar 13 00:33:58.545390 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:58.742131 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:58.753623 (kubelet)[2792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:33:58.808853 kubelet[2792]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:33:58.821158 kubelet[2792]: I0313 00:33:58.820689 2792 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 13 00:33:58.821158 kubelet[2792]: I0313 00:33:58.820740 2792 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:33:58.821158 kubelet[2792]: I0313 00:33:58.820760 2792 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 13 00:33:58.821158 kubelet[2792]: I0313 00:33:58.820765 2792 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 13 00:33:58.821158 kubelet[2792]: I0313 00:33:58.821005 2792 server.go:951] "Client rotation is on, will bootstrap in background" Mar 13 00:33:58.822174 kubelet[2792]: I0313 00:33:58.822129 2792 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 13 00:33:58.823789 kubelet[2792]: I0313 00:33:58.823763 2792 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:33:58.828587 kubelet[2792]: I0313 00:33:58.828519 2792 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:33:58.832445 kubelet[2792]: I0313 00:33:58.832406 2792 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Mar 13 00:33:58.832667 kubelet[2792]: I0313 00:33:58.832622 2792 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:33:58.832805 kubelet[2792]: I0313 00:33:58.832651 2792 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459-2-4-n-7393fd8643","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:33:58.832805 kubelet[2792]: I0313 00:33:58.832796 2792 topology_manager.go:143] "Creating topology manager with none policy" Mar 13 00:33:58.832805 kubelet[2792]: I0313 00:33:58.832804 2792 container_manager_linux.go:308] "Creating device plugin manager" Mar 13 00:33:58.832960 kubelet[2792]: I0313 00:33:58.832848 2792 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 13 00:33:58.833069 kubelet[2792]: I0313 00:33:58.833017 2792 
state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 13 00:33:58.834038 kubelet[2792]: I0313 00:33:58.833258 2792 kubelet.go:482] "Attempting to sync node with API server" Mar 13 00:33:58.834038 kubelet[2792]: I0313 00:33:58.833280 2792 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:33:58.834038 kubelet[2792]: I0313 00:33:58.833393 2792 kubelet.go:394] "Adding apiserver pod source" Mar 13 00:33:58.834038 kubelet[2792]: I0313 00:33:58.833406 2792 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:33:58.834744 kubelet[2792]: I0313 00:33:58.834715 2792 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:33:58.836492 kubelet[2792]: I0313 00:33:58.836462 2792 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:33:58.836619 kubelet[2792]: I0313 00:33:58.836607 2792 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 13 00:33:58.840382 kubelet[2792]: I0313 00:33:58.840354 2792 server.go:1257] "Started kubelet" Mar 13 00:33:58.852645 kubelet[2792]: I0313 00:33:58.851898 2792 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 13 00:33:58.857085 kubelet[2792]: I0313 00:33:58.856985 2792 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:33:58.858372 kubelet[2792]: I0313 00:33:58.858354 2792 server.go:317] "Adding debug handlers to kubelet server" Mar 13 00:33:58.863835 kubelet[2792]: I0313 00:33:58.863654 2792 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:33:58.863835 kubelet[2792]: I0313 00:33:58.863837 2792 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 13 00:33:58.865056 
kubelet[2792]: I0313 00:33:58.864016 2792 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:33:58.865056 kubelet[2792]: I0313 00:33:58.864048 2792 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 00:33:58.865056 kubelet[2792]: I0313 00:33:58.864021 2792 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 13 00:33:58.865056 kubelet[2792]: E0313 00:33:58.864314 2792 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-7393fd8643\" not found" Mar 13 00:33:58.865056 kubelet[2792]: I0313 00:33:58.864781 2792 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:33:58.868267 kubelet[2792]: I0313 00:33:58.868233 2792 reconciler.go:29] "Reconciler: start to sync state" Mar 13 00:33:58.871800 kubelet[2792]: I0313 00:33:58.871757 2792 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:33:58.871935 kubelet[2792]: I0313 00:33:58.871862 2792 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 00:33:58.880421 kubelet[2792]: I0313 00:33:58.880379 2792 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:33:58.886502 kubelet[2792]: E0313 00:33:58.886391 2792 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:33:58.910149 kubelet[2792]: I0313 00:33:58.910052 2792 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 13 00:33:58.913394 kubelet[2792]: I0313 00:33:58.913355 2792 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 13 00:33:58.913610 kubelet[2792]: I0313 00:33:58.913553 2792 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 13 00:33:58.913610 kubelet[2792]: I0313 00:33:58.913582 2792 kubelet.go:2501] "Starting kubelet main sync loop" Mar 13 00:33:58.915373 kubelet[2792]: E0313 00:33:58.915268 2792 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953251 2792 cpu_manager.go:225] "Starting" policy="none" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953269 2792 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953297 2792 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953478 2792 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet="" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953500 2792 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={} Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953516 2792 policy_none.go:50] "Start" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953527 2792 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953538 2792 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953625 2792 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 13 00:33:58.953627 kubelet[2792]: I0313 00:33:58.953631 2792 policy_none.go:44] "Start" Mar 13 00:33:58.959967 kubelet[2792]: E0313 00:33:58.959927 2792 manager.go:525] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:33:58.960546 kubelet[2792]: I0313 00:33:58.960441 2792 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 13 00:33:58.960546 kubelet[2792]: I0313 00:33:58.960476 2792 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:33:58.962179 kubelet[2792]: I0313 00:33:58.962154 2792 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 13 00:33:58.963501 kubelet[2792]: E0313 00:33:58.963457 2792 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:33:59.016150 kubelet[2792]: I0313 00:33:59.015980 2792 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.017142 kubelet[2792]: I0313 00:33:59.017082 2792 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.017387 kubelet[2792]: I0313 00:33:59.017331 2792 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.024950 kubelet[2792]: E0313 00:33:59.024905 2792 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-7393fd8643\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.026046 kubelet[2792]: E0313 00:33:59.025987 2792 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.026194 kubelet[2792]: E0313 00:33:59.026076 2792 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" already exists" 
pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.063565 kubelet[2792]: I0313 00:33:59.063511 2792 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069142 kubelet[2792]: I0313 00:33:59.068894 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6e7dd5f523e82d58e4391e484d4a43f5-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" (UID: \"6e7dd5f523e82d58e4391e484d4a43f5\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069142 kubelet[2792]: I0313 00:33:59.068946 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069142 kubelet[2792]: I0313 00:33:59.068971 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069142 kubelet[2792]: I0313 00:33:59.068993 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5b8b1a67ae94557ca60791d4e96c593f-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-7393fd8643\" (UID: \"5b8b1a67ae94557ca60791d4e96c593f\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069142 
kubelet[2792]: I0313 00:33:59.069012 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6e7dd5f523e82d58e4391e484d4a43f5-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" (UID: \"6e7dd5f523e82d58e4391e484d4a43f5\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069802 kubelet[2792]: I0313 00:33:59.069041 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6e7dd5f523e82d58e4391e484d4a43f5-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" (UID: \"6e7dd5f523e82d58e4391e484d4a43f5\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069802 kubelet[2792]: I0313 00:33:59.069062 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069802 kubelet[2792]: I0313 00:33:59.069082 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: \"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.069802 kubelet[2792]: I0313 00:33:59.069623 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/76807beaf181ab0c9f75be8d8d850de3-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-7393fd8643\" (UID: 
\"76807beaf181ab0c9f75be8d8d850de3\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.071530 kubelet[2792]: I0313 00:33:59.071458 2792 kubelet_node_status.go:123] "Node was previously registered" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.071692 kubelet[2792]: I0313 00:33:59.071629 2792 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.841459 kubelet[2792]: I0313 00:33:59.841395 2792 apiserver.go:52] "Watching apiserver" Mar 13 00:33:59.864694 kubelet[2792]: I0313 00:33:59.864635 2792 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 00:33:59.913115 kubelet[2792]: I0313 00:33:59.913015 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-7393fd8643" podStartSLOduration=3.913001818 podStartE2EDuration="3.913001818s" podCreationTimestamp="2026-03-13 00:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:33:59.912841558 +0000 UTC m=+1.152316851" watchObservedRunningTime="2026-03-13 00:33:59.913001818 +0000 UTC m=+1.152477111" Mar 13 00:33:59.937427 kubelet[2792]: I0313 00:33:59.936883 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" podStartSLOduration=3.9368677180000002 podStartE2EDuration="3.936867718s" podCreationTimestamp="2026-03-13 00:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:33:59.923604667 +0000 UTC m=+1.163079960" watchObservedRunningTime="2026-03-13 00:33:59.936867718 +0000 UTC m=+1.176343001" Mar 13 00:33:59.949904 kubelet[2792]: I0313 00:33:59.949167 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" 
pod="kube-system/kube-controller-manager-ci-4459-2-4-n-7393fd8643" podStartSLOduration=3.949149088 podStartE2EDuration="3.949149088s" podCreationTimestamp="2026-03-13 00:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:33:59.936842138 +0000 UTC m=+1.176317431" watchObservedRunningTime="2026-03-13 00:33:59.949149088 +0000 UTC m=+1.188624381" Mar 13 00:33:59.952909 kubelet[2792]: I0313 00:33:59.952257 2792 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:33:59.959388 kubelet[2792]: E0313 00:33:59.959353 2792 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-7393fd8643\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-7393fd8643" Mar 13 00:34:04.495479 kubelet[2792]: I0313 00:34:04.495430 2792 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 13 00:34:04.495984 containerd[1620]: time="2026-03-13T00:34:04.495867376Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 13 00:34:04.496383 kubelet[2792]: I0313 00:34:04.496058 2792 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 13 00:34:05.238611 systemd[1]: Created slice kubepods-besteffort-podd5431513_c94a_4741_a862_d64506a81842.slice - libcontainer container kubepods-besteffort-podd5431513_c94a_4741_a862_d64506a81842.slice. 
Mar 13 00:34:05.309134 kubelet[2792]: I0313 00:34:05.309054 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d5431513-c94a-4741-a862-d64506a81842-kube-proxy\") pod \"kube-proxy-rsgfx\" (UID: \"d5431513-c94a-4741-a862-d64506a81842\") " pod="kube-system/kube-proxy-rsgfx" Mar 13 00:34:05.309535 kubelet[2792]: I0313 00:34:05.309506 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d5431513-c94a-4741-a862-d64506a81842-xtables-lock\") pod \"kube-proxy-rsgfx\" (UID: \"d5431513-c94a-4741-a862-d64506a81842\") " pod="kube-system/kube-proxy-rsgfx" Mar 13 00:34:05.309535 kubelet[2792]: I0313 00:34:05.309535 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5431513-c94a-4741-a862-d64506a81842-lib-modules\") pod \"kube-proxy-rsgfx\" (UID: \"d5431513-c94a-4741-a862-d64506a81842\") " pod="kube-system/kube-proxy-rsgfx" Mar 13 00:34:05.309660 kubelet[2792]: I0313 00:34:05.309595 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnsfz\" (UniqueName: \"kubernetes.io/projected/d5431513-c94a-4741-a862-d64506a81842-kube-api-access-lnsfz\") pod \"kube-proxy-rsgfx\" (UID: \"d5431513-c94a-4741-a862-d64506a81842\") " pod="kube-system/kube-proxy-rsgfx" Mar 13 00:34:05.550818 containerd[1620]: time="2026-03-13T00:34:05.550692975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rsgfx,Uid:d5431513-c94a-4741-a862-d64506a81842,Namespace:kube-system,Attempt:0,}" Mar 13 00:34:05.573787 containerd[1620]: time="2026-03-13T00:34:05.573639784Z" level=info msg="connecting to shim 7d461ad659129545e5369655cf9c650254f46506f97279a6380c9006bc6d3d04" 
address="unix:///run/containerd/s/cb0e844ae469b6e22606c2a291add0a2eaf06a5a4ba2197992984dcc345b4dce" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:05.610424 systemd[1]: Started cri-containerd-7d461ad659129545e5369655cf9c650254f46506f97279a6380c9006bc6d3d04.scope - libcontainer container 7d461ad659129545e5369655cf9c650254f46506f97279a6380c9006bc6d3d04. Mar 13 00:34:05.653017 containerd[1620]: time="2026-03-13T00:34:05.652905190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rsgfx,Uid:d5431513-c94a-4741-a862-d64506a81842,Namespace:kube-system,Attempt:0,} returns sandbox id \"7d461ad659129545e5369655cf9c650254f46506f97279a6380c9006bc6d3d04\"" Mar 13 00:34:05.660591 containerd[1620]: time="2026-03-13T00:34:05.660535496Z" level=info msg="CreateContainer within sandbox \"7d461ad659129545e5369655cf9c650254f46506f97279a6380c9006bc6d3d04\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 13 00:34:05.674404 containerd[1620]: time="2026-03-13T00:34:05.674253588Z" level=info msg="Container 655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:05.686410 containerd[1620]: time="2026-03-13T00:34:05.686299248Z" level=info msg="CreateContainer within sandbox \"7d461ad659129545e5369655cf9c650254f46506f97279a6380c9006bc6d3d04\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72\"" Mar 13 00:34:05.687311 containerd[1620]: time="2026-03-13T00:34:05.687282689Z" level=info msg="StartContainer for \"655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72\"" Mar 13 00:34:05.688992 containerd[1620]: time="2026-03-13T00:34:05.688955720Z" level=info msg="connecting to shim 655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72" address="unix:///run/containerd/s/cb0e844ae469b6e22606c2a291add0a2eaf06a5a4ba2197992984dcc345b4dce" protocol=ttrpc version=3 Mar 13 
00:34:05.710317 systemd[1]: Started cri-containerd-655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72.scope - libcontainer container 655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72. Mar 13 00:34:05.778561 systemd[1]: Created slice kubepods-besteffort-pod0938926d_a751_4f1e_9ecb_13b724848847.slice - libcontainer container kubepods-besteffort-pod0938926d_a751_4f1e_9ecb_13b724848847.slice. Mar 13 00:34:05.807779 containerd[1620]: time="2026-03-13T00:34:05.807663719Z" level=info msg="StartContainer for \"655829b641ff7d71cb2fd641ef6d97210ac09fe584285fe0f1663d534c966f72\" returns successfully" Mar 13 00:34:05.813978 kubelet[2792]: I0313 00:34:05.813881 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0938926d-a751-4f1e-9ecb-13b724848847-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-679h9\" (UID: \"0938926d-a751-4f1e-9ecb-13b724848847\") " pod="tigera-operator/tigera-operator-6cf4cccc57-679h9" Mar 13 00:34:05.813978 kubelet[2792]: I0313 00:34:05.813929 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fszhp\" (UniqueName: \"kubernetes.io/projected/0938926d-a751-4f1e-9ecb-13b724848847-kube-api-access-fszhp\") pod \"tigera-operator-6cf4cccc57-679h9\" (UID: \"0938926d-a751-4f1e-9ecb-13b724848847\") " pod="tigera-operator/tigera-operator-6cf4cccc57-679h9" Mar 13 00:34:06.084766 containerd[1620]: time="2026-03-13T00:34:06.084644510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-679h9,Uid:0938926d-a751-4f1e-9ecb-13b724848847,Namespace:tigera-operator,Attempt:0,}" Mar 13 00:34:06.099703 containerd[1620]: time="2026-03-13T00:34:06.099626742Z" level=info msg="connecting to shim cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e" 
address="unix:///run/containerd/s/afc51b8c363341cab57fc7c55c7b33b71b46e462a6da4f29d5d26a8a5830b7f1" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:06.122375 systemd[1]: Started cri-containerd-cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e.scope - libcontainer container cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e. Mar 13 00:34:06.174135 containerd[1620]: time="2026-03-13T00:34:06.174017264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-679h9,Uid:0938926d-a751-4f1e-9ecb-13b724848847,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e\"" Mar 13 00:34:06.176512 containerd[1620]: time="2026-03-13T00:34:06.176412716Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 13 00:34:08.139091 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2858308825.mount: Deactivated successfully. Mar 13 00:34:09.075008 containerd[1620]: time="2026-03-13T00:34:09.074931081Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:09.076091 containerd[1620]: time="2026-03-13T00:34:09.075695121Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 13 00:34:09.076862 containerd[1620]: time="2026-03-13T00:34:09.076829212Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:09.079158 containerd[1620]: time="2026-03-13T00:34:09.079077674Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:09.079909 containerd[1620]: time="2026-03-13T00:34:09.079871655Z" level=info 
msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.903224479s" Mar 13 00:34:09.079992 containerd[1620]: time="2026-03-13T00:34:09.079978585Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 13 00:34:09.085441 containerd[1620]: time="2026-03-13T00:34:09.085374009Z" level=info msg="CreateContainer within sandbox \"cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 13 00:34:09.097887 containerd[1620]: time="2026-03-13T00:34:09.096475629Z" level=info msg="Container 6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:09.103815 containerd[1620]: time="2026-03-13T00:34:09.103730385Z" level=info msg="CreateContainer within sandbox \"cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\"" Mar 13 00:34:09.104743 containerd[1620]: time="2026-03-13T00:34:09.104580605Z" level=info msg="StartContainer for \"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\"" Mar 13 00:34:09.106677 containerd[1620]: time="2026-03-13T00:34:09.106651817Z" level=info msg="connecting to shim 6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8" address="unix:///run/containerd/s/afc51b8c363341cab57fc7c55c7b33b71b46e462a6da4f29d5d26a8a5830b7f1" protocol=ttrpc version=3 Mar 13 00:34:09.134507 systemd[1]: Started 
cri-containerd-6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8.scope - libcontainer container 6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8. Mar 13 00:34:09.180329 containerd[1620]: time="2026-03-13T00:34:09.180213788Z" level=info msg="StartContainer for \"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\" returns successfully" Mar 13 00:34:09.992504 kubelet[2792]: I0313 00:34:09.992342 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-rsgfx" podStartSLOduration=4.992314635 podStartE2EDuration="4.992314635s" podCreationTimestamp="2026-03-13 00:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:05.978266371 +0000 UTC m=+7.217741664" watchObservedRunningTime="2026-03-13 00:34:09.992314635 +0000 UTC m=+11.231789968" Mar 13 00:34:10.370015 kubelet[2792]: I0313 00:34:10.369950 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-679h9" podStartSLOduration=2.46523693 podStartE2EDuration="5.36992996s" podCreationTimestamp="2026-03-13 00:34:05 +0000 UTC" firstStartedPulling="2026-03-13 00:34:06.176059796 +0000 UTC m=+7.415535089" lastFinishedPulling="2026-03-13 00:34:09.080752826 +0000 UTC m=+10.320228119" observedRunningTime="2026-03-13 00:34:09.992849625 +0000 UTC m=+11.232324958" watchObservedRunningTime="2026-03-13 00:34:10.36992996 +0000 UTC m=+11.609405243" Mar 13 00:34:14.771219 sudo[1845]: pam_unix(sudo:session): session closed for user root Mar 13 00:34:14.892147 sshd[1844]: Connection closed by 4.153.228.146 port 37950 Mar 13 00:34:14.893036 sshd-session[1841]: pam_unix(sshd:session): session closed for user core Mar 13 00:34:14.907923 systemd[1]: sshd@6-157.180.95.181:22-4.153.228.146:37950.service: Deactivated successfully. 
Mar 13 00:34:14.916347 systemd[1]: session-7.scope: Deactivated successfully. Mar 13 00:34:14.916941 systemd[1]: session-7.scope: Consumed 3.138s CPU time, 229M memory peak. Mar 13 00:34:14.921414 systemd-logind[1591]: Session 7 logged out. Waiting for processes to exit. Mar 13 00:34:14.927862 systemd-logind[1591]: Removed session 7. Mar 13 00:34:14.958753 update_engine[1594]: I20260313 00:34:14.958149 1594 update_attempter.cc:509] Updating boot flags... Mar 13 00:34:17.677188 systemd[1]: Created slice kubepods-besteffort-pod55ceb9e5_b1ab_4ede_bccb_6e35bb3c4132.slice - libcontainer container kubepods-besteffort-pod55ceb9e5_b1ab_4ede_bccb_6e35bb3c4132.slice. Mar 13 00:34:17.693124 kubelet[2792]: I0313 00:34:17.693001 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqhq\" (UniqueName: \"kubernetes.io/projected/55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132-kube-api-access-lhqhq\") pod \"calico-typha-76b95db45d-m22tn\" (UID: \"55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132\") " pod="calico-system/calico-typha-76b95db45d-m22tn" Mar 13 00:34:17.693124 kubelet[2792]: I0313 00:34:17.693045 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132-typha-certs\") pod \"calico-typha-76b95db45d-m22tn\" (UID: \"55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132\") " pod="calico-system/calico-typha-76b95db45d-m22tn" Mar 13 00:34:17.693124 kubelet[2792]: I0313 00:34:17.693063 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132-tigera-ca-bundle\") pod \"calico-typha-76b95db45d-m22tn\" (UID: \"55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132\") " pod="calico-system/calico-typha-76b95db45d-m22tn" Mar 13 00:34:17.775542 systemd[1]: Created slice 
kubepods-besteffort-pod11896047_a44e_411d_abb3_eac7d1f52e7c.slice - libcontainer container kubepods-besteffort-pod11896047_a44e_411d_abb3_eac7d1f52e7c.slice. Mar 13 00:34:17.795206 kubelet[2792]: I0313 00:34:17.794262 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-flexvol-driver-host\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795206 kubelet[2792]: I0313 00:34:17.794298 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhnh\" (UniqueName: \"kubernetes.io/projected/11896047-a44e-411d-abb3-eac7d1f52e7c-kube-api-access-sbhnh\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795206 kubelet[2792]: I0313 00:34:17.794313 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-bpffs\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795206 kubelet[2792]: I0313 00:34:17.794328 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-cni-log-dir\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795206 kubelet[2792]: I0313 00:34:17.794340 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-cni-net-dir\") pod \"calico-node-x2kz7\" 
(UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795415 kubelet[2792]: I0313 00:34:17.794360 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-lib-modules\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795415 kubelet[2792]: I0313 00:34:17.794371 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-nodeproc\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795415 kubelet[2792]: I0313 00:34:17.794391 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-policysync\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795415 kubelet[2792]: I0313 00:34:17.794404 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-sys-fs\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795415 kubelet[2792]: I0313 00:34:17.794415 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-xtables-lock\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795516 
kubelet[2792]: I0313 00:34:17.794427 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-cni-bin-dir\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795516 kubelet[2792]: I0313 00:34:17.794438 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/11896047-a44e-411d-abb3-eac7d1f52e7c-node-certs\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795516 kubelet[2792]: I0313 00:34:17.794450 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11896047-a44e-411d-abb3-eac7d1f52e7c-tigera-ca-bundle\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795516 kubelet[2792]: I0313 00:34:17.794463 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-var-lib-calico\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.795516 kubelet[2792]: I0313 00:34:17.794476 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/11896047-a44e-411d-abb3-eac7d1f52e7c-var-run-calico\") pod \"calico-node-x2kz7\" (UID: \"11896047-a44e-411d-abb3-eac7d1f52e7c\") " pod="calico-system/calico-node-x2kz7" Mar 13 00:34:17.882075 kubelet[2792]: E0313 00:34:17.882037 2792 pod_workers.go:1324] "Error syncing 
pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:17.895731 kubelet[2792]: I0313 00:34:17.894958 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zqk5\" (UniqueName: \"kubernetes.io/projected/be715fcd-c3e2-47e4-b475-20bdc4ec1391-kube-api-access-2zqk5\") pod \"csi-node-driver-r7g7s\" (UID: \"be715fcd-c3e2-47e4-b475-20bdc4ec1391\") " pod="calico-system/csi-node-driver-r7g7s" Mar 13 00:34:17.897376 kubelet[2792]: I0313 00:34:17.897274 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/be715fcd-c3e2-47e4-b475-20bdc4ec1391-varrun\") pod \"csi-node-driver-r7g7s\" (UID: \"be715fcd-c3e2-47e4-b475-20bdc4ec1391\") " pod="calico-system/csi-node-driver-r7g7s" Mar 13 00:34:17.897444 kubelet[2792]: I0313 00:34:17.897362 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be715fcd-c3e2-47e4-b475-20bdc4ec1391-kubelet-dir\") pod \"csi-node-driver-r7g7s\" (UID: \"be715fcd-c3e2-47e4-b475-20bdc4ec1391\") " pod="calico-system/csi-node-driver-r7g7s" Mar 13 00:34:17.897589 kubelet[2792]: I0313 00:34:17.897513 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be715fcd-c3e2-47e4-b475-20bdc4ec1391-socket-dir\") pod \"csi-node-driver-r7g7s\" (UID: \"be715fcd-c3e2-47e4-b475-20bdc4ec1391\") " pod="calico-system/csi-node-driver-r7g7s" Mar 13 00:34:17.897589 kubelet[2792]: I0313 00:34:17.897552 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be715fcd-c3e2-47e4-b475-20bdc4ec1391-registration-dir\") pod \"csi-node-driver-r7g7s\" (UID: \"be715fcd-c3e2-47e4-b475-20bdc4ec1391\") " pod="calico-system/csi-node-driver-r7g7s" Mar 13 00:34:17.899842 kubelet[2792]: E0313 00:34:17.899804 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.899842 kubelet[2792]: W0313 00:34:17.899819 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.900216 kubelet[2792]: E0313 00:34:17.900129 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.901457 kubelet[2792]: E0313 00:34:17.901427 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.901539 kubelet[2792]: W0313 00:34:17.901440 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.901603 kubelet[2792]: E0313 00:34:17.901594 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.902875 kubelet[2792]: E0313 00:34:17.901875 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.902875 kubelet[2792]: W0313 00:34:17.901884 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.903451 kubelet[2792]: E0313 00:34:17.901893 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.903949 kubelet[2792]: E0313 00:34:17.903938 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.904114 kubelet[2792]: W0313 00:34:17.903997 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.904114 kubelet[2792]: E0313 00:34:17.904075 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.904444 kubelet[2792]: E0313 00:34:17.904410 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.904444 kubelet[2792]: W0313 00:34:17.904420 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.904583 kubelet[2792]: E0313 00:34:17.904429 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.905292 kubelet[2792]: E0313 00:34:17.905222 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.905376 kubelet[2792]: W0313 00:34:17.905357 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.905454 kubelet[2792]: E0313 00:34:17.905444 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.915377 kubelet[2792]: E0313 00:34:17.915329 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.918152 kubelet[2792]: W0313 00:34:17.915561 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.918152 kubelet[2792]: E0313 00:34:17.915822 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.926620 kubelet[2792]: E0313 00:34:17.926592 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.926796 kubelet[2792]: W0313 00:34:17.926781 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.926893 kubelet[2792]: E0313 00:34:17.926861 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.986578 containerd[1620]: time="2026-03-13T00:34:17.986480018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-76b95db45d-m22tn,Uid:55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:17.999209 kubelet[2792]: E0313 00:34:17.999158 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.999449 kubelet[2792]: W0313 00:34:17.999361 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.999449 kubelet[2792]: E0313 00:34:17.999386 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.000026 kubelet[2792]: E0313 00:34:18.000004 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.000370 kubelet[2792]: W0313 00:34:18.000137 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.000370 kubelet[2792]: E0313 00:34:18.000152 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.001559 kubelet[2792]: E0313 00:34:18.001491 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.001559 kubelet[2792]: W0313 00:34:18.001504 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.001559 kubelet[2792]: E0313 00:34:18.001518 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.002610 kubelet[2792]: E0313 00:34:18.002591 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.002810 kubelet[2792]: W0313 00:34:18.002692 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.002810 kubelet[2792]: E0313 00:34:18.002707 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.003514 kubelet[2792]: E0313 00:34:18.003459 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.003514 kubelet[2792]: W0313 00:34:18.003470 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.003779 kubelet[2792]: E0313 00:34:18.003481 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.004616 kubelet[2792]: E0313 00:34:18.004370 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.004791 kubelet[2792]: W0313 00:34:18.004756 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.004868 kubelet[2792]: E0313 00:34:18.004839 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.005476 kubelet[2792]: E0313 00:34:18.005455 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.006030 kubelet[2792]: W0313 00:34:18.005575 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.006030 kubelet[2792]: E0313 00:34:18.005589 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.007883 kubelet[2792]: E0313 00:34:18.007354 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.007883 kubelet[2792]: W0313 00:34:18.007368 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.007883 kubelet[2792]: E0313 00:34:18.007486 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.009145 kubelet[2792]: E0313 00:34:18.009002 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.009145 kubelet[2792]: W0313 00:34:18.009013 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.009145 kubelet[2792]: E0313 00:34:18.009023 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.009811 kubelet[2792]: E0313 00:34:18.009767 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.009811 kubelet[2792]: W0313 00:34:18.009777 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.009811 kubelet[2792]: E0313 00:34:18.009786 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.011409 kubelet[2792]: E0313 00:34:18.011323 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.011409 kubelet[2792]: W0313 00:34:18.011335 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.011409 kubelet[2792]: E0313 00:34:18.011344 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.013506 kubelet[2792]: E0313 00:34:18.013271 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.013506 kubelet[2792]: W0313 00:34:18.013293 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.013506 kubelet[2792]: E0313 00:34:18.013305 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.014287 kubelet[2792]: E0313 00:34:18.013618 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.014287 kubelet[2792]: W0313 00:34:18.013628 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.014287 kubelet[2792]: E0313 00:34:18.013636 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.014287 kubelet[2792]: E0313 00:34:18.014259 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.014287 kubelet[2792]: W0313 00:34:18.014267 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.014287 kubelet[2792]: E0313 00:34:18.014275 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.014749 kubelet[2792]: E0313 00:34:18.014737 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.014814 kubelet[2792]: W0313 00:34:18.014803 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.014868 kubelet[2792]: E0313 00:34:18.014859 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.018286 kubelet[2792]: E0313 00:34:18.016088 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.018286 kubelet[2792]: W0313 00:34:18.016130 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.018286 kubelet[2792]: E0313 00:34:18.016139 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.018684 kubelet[2792]: E0313 00:34:18.018555 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.018684 kubelet[2792]: W0313 00:34:18.018567 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.018684 kubelet[2792]: E0313 00:34:18.018579 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.020334 kubelet[2792]: E0313 00:34:18.020293 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.020384 kubelet[2792]: W0313 00:34:18.020333 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.020384 kubelet[2792]: E0313 00:34:18.020365 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.022307 kubelet[2792]: E0313 00:34:18.022270 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.022307 kubelet[2792]: W0313 00:34:18.022302 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.022395 kubelet[2792]: E0313 00:34:18.022327 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.022752 kubelet[2792]: E0313 00:34:18.022683 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.022752 kubelet[2792]: W0313 00:34:18.022700 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.022752 kubelet[2792]: E0313 00:34:18.022717 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.024524 kubelet[2792]: E0313 00:34:18.024489 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.024524 kubelet[2792]: W0313 00:34:18.024520 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.024616 kubelet[2792]: E0313 00:34:18.024544 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.025485 kubelet[2792]: E0313 00:34:18.025446 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.025485 kubelet[2792]: W0313 00:34:18.025471 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.025568 kubelet[2792]: E0313 00:34:18.025490 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.028753 kubelet[2792]: E0313 00:34:18.028408 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.028753 kubelet[2792]: W0313 00:34:18.028452 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.028753 kubelet[2792]: E0313 00:34:18.028493 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.030142 kubelet[2792]: E0313 00:34:18.029710 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.030142 kubelet[2792]: W0313 00:34:18.029742 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.030142 kubelet[2792]: E0313 00:34:18.029765 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.032118 kubelet[2792]: E0313 00:34:18.031465 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.032118 kubelet[2792]: W0313 00:34:18.031491 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.032118 kubelet[2792]: E0313 00:34:18.031515 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:18.039953 containerd[1620]: time="2026-03-13T00:34:18.039906397Z" level=info msg="connecting to shim 57ff2fb645de15d24b3899a24bfdc16c771c11eeb85c450443a8d69402eb5842" address="unix:///run/containerd/s/f2a0d4841a1b542537d4ec4030eced305bd053c484661e0c8076761b1a8268ad" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:18.043071 kubelet[2792]: E0313 00:34:18.043027 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:18.043071 kubelet[2792]: W0313 00:34:18.043063 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:18.044822 kubelet[2792]: E0313 00:34:18.043091 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:18.090278 containerd[1620]: time="2026-03-13T00:34:18.090192466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x2kz7,Uid:11896047-a44e-411d-abb3-eac7d1f52e7c,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:18.112090 systemd[1]: Started cri-containerd-57ff2fb645de15d24b3899a24bfdc16c771c11eeb85c450443a8d69402eb5842.scope - libcontainer container 57ff2fb645de15d24b3899a24bfdc16c771c11eeb85c450443a8d69402eb5842. Mar 13 00:34:18.139020 containerd[1620]: time="2026-03-13T00:34:18.138949997Z" level=info msg="connecting to shim 8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510" address="unix:///run/containerd/s/a98bd5bee0c75ad22eaf3379f60638bccd299ac76445f4bd02fd8ae4af7659c0" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:18.188663 systemd[1]: Started cri-containerd-8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510.scope - libcontainer container 8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510. 
Mar 13 00:34:18.264339 containerd[1620]: time="2026-03-13T00:34:18.264212988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-76b95db45d-m22tn,Uid:55ceb9e5-b1ab-4ede-bccb-6e35bb3c4132,Namespace:calico-system,Attempt:0,} returns sandbox id \"57ff2fb645de15d24b3899a24bfdc16c771c11eeb85c450443a8d69402eb5842\"" Mar 13 00:34:18.270862 containerd[1620]: time="2026-03-13T00:34:18.270713071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 13 00:34:18.310843 containerd[1620]: time="2026-03-13T00:34:18.310788154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x2kz7,Uid:11896047-a44e-411d-abb3-eac7d1f52e7c,Namespace:calico-system,Attempt:0,} returns sandbox id \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\"" Mar 13 00:34:19.914572 kubelet[2792]: E0313 00:34:19.914515 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:19.926688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1780594895.mount: Deactivated successfully. 
Mar 13 00:34:20.531855 containerd[1620]: time="2026-03-13T00:34:20.531776390Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:20.533002 containerd[1620]: time="2026-03-13T00:34:20.532775474Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 13 00:34:20.533902 containerd[1620]: time="2026-03-13T00:34:20.533869338Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:20.536402 containerd[1620]: time="2026-03-13T00:34:20.536270303Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:20.537007 containerd[1620]: time="2026-03-13T00:34:20.536979948Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 2.265951708s" Mar 13 00:34:20.537129 containerd[1620]: time="2026-03-13T00:34:20.537097748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 13 00:34:20.538650 containerd[1620]: time="2026-03-13T00:34:20.538626888Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 13 00:34:20.558960 containerd[1620]: time="2026-03-13T00:34:20.558919692Z" level=info msg="CreateContainer within sandbox \"57ff2fb645de15d24b3899a24bfdc16c771c11eeb85c450443a8d69402eb5842\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 13 00:34:20.571293 containerd[1620]: time="2026-03-13T00:34:20.571244266Z" level=info msg="Container 19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:20.580853 containerd[1620]: time="2026-03-13T00:34:20.580779807Z" level=info msg="CreateContainer within sandbox \"57ff2fb645de15d24b3899a24bfdc16c771c11eeb85c450443a8d69402eb5842\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a\"" Mar 13 00:34:20.582025 containerd[1620]: time="2026-03-13T00:34:20.581973659Z" level=info msg="StartContainer for \"19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a\"" Mar 13 00:34:20.583746 containerd[1620]: time="2026-03-13T00:34:20.583715808Z" level=info msg="connecting to shim 19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a" address="unix:///run/containerd/s/f2a0d4841a1b542537d4ec4030eced305bd053c484661e0c8076761b1a8268ad" protocol=ttrpc version=3 Mar 13 00:34:20.607400 systemd[1]: Started cri-containerd-19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a.scope - libcontainer container 19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a. Mar 13 00:34:20.671266 containerd[1620]: time="2026-03-13T00:34:20.671152526Z" level=info msg="StartContainer for \"19a27d9deabb165c6bdc6334cda9a75865034b202160354e0753c9621c7ce65a\" returns successfully" Mar 13 00:34:20.881195 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3456191118.mount: Deactivated successfully. 
Mar 13 00:34:21.009154 kubelet[2792]: E0313 00:34:21.009038 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.009154 kubelet[2792]: W0313 00:34:21.009066 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.009154 kubelet[2792]: E0313 00:34:21.009088 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.010619 kubelet[2792]: E0313 00:34:21.010590 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.010619 kubelet[2792]: W0313 00:34:21.010609 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.010709 kubelet[2792]: E0313 00:34:21.010626 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.010936 kubelet[2792]: E0313 00:34:21.010900 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.011024 kubelet[2792]: W0313 00:34:21.010985 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.011132 kubelet[2792]: E0313 00:34:21.011070 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.011660 kubelet[2792]: E0313 00:34:21.011587 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.011660 kubelet[2792]: W0313 00:34:21.011598 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.011660 kubelet[2792]: E0313 00:34:21.011610 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.012137 kubelet[2792]: E0313 00:34:21.012065 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.012137 kubelet[2792]: W0313 00:34:21.012075 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.012377 kubelet[2792]: E0313 00:34:21.012095 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.012675 kubelet[2792]: E0313 00:34:21.012626 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.012675 kubelet[2792]: W0313 00:34:21.012637 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.012849 kubelet[2792]: E0313 00:34:21.012646 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.013072 kubelet[2792]: E0313 00:34:21.013062 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.013244 kubelet[2792]: W0313 00:34:21.013141 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.013244 kubelet[2792]: E0313 00:34:21.013151 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.013747 kubelet[2792]: E0313 00:34:21.013622 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.013747 kubelet[2792]: W0313 00:34:21.013663 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.013747 kubelet[2792]: E0313 00:34:21.013672 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.014241 kubelet[2792]: E0313 00:34:21.014163 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.014241 kubelet[2792]: W0313 00:34:21.014186 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.014241 kubelet[2792]: E0313 00:34:21.014196 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.014637 kubelet[2792]: E0313 00:34:21.014626 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.014724 kubelet[2792]: W0313 00:34:21.014700 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.014724 kubelet[2792]: E0313 00:34:21.014714 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.014922 kubelet[2792]: E0313 00:34:21.014906 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.014922 kubelet[2792]: W0313 00:34:21.014915 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.015091 kubelet[2792]: E0313 00:34:21.014923 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.015307 kubelet[2792]: E0313 00:34:21.015283 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.015307 kubelet[2792]: W0313 00:34:21.015294 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.015307 kubelet[2792]: E0313 00:34:21.015304 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.015511 kubelet[2792]: E0313 00:34:21.015501 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.015551 kubelet[2792]: W0313 00:34:21.015510 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.015551 kubelet[2792]: E0313 00:34:21.015519 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.015778 kubelet[2792]: E0313 00:34:21.015755 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.015778 kubelet[2792]: W0313 00:34:21.015771 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.015852 kubelet[2792]: E0313 00:34:21.015785 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.016024 kubelet[2792]: E0313 00:34:21.016005 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.016024 kubelet[2792]: W0313 00:34:21.016016 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.016095 kubelet[2792]: E0313 00:34:21.016026 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.020133 kubelet[2792]: I0313 00:34:21.019989 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-76b95db45d-m22tn" podStartSLOduration=1.7511508120000001 podStartE2EDuration="4.019974961s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:18.26964958 +0000 UTC m=+19.509124873" lastFinishedPulling="2026-03-13 00:34:20.538473729 +0000 UTC m=+21.777949022" observedRunningTime="2026-03-13 00:34:21.018849748 +0000 UTC m=+22.258325041" watchObservedRunningTime="2026-03-13 00:34:21.019974961 +0000 UTC m=+22.259450254" Mar 13 00:34:21.038038 kubelet[2792]: E0313 00:34:21.037989 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.038038 kubelet[2792]: W0313 00:34:21.038019 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.038038 kubelet[2792]: E0313 00:34:21.038045 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.038491 kubelet[2792]: E0313 00:34:21.038454 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.038491 kubelet[2792]: W0313 00:34:21.038470 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.038491 kubelet[2792]: E0313 00:34:21.038489 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.038773 kubelet[2792]: E0313 00:34:21.038750 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.038773 kubelet[2792]: W0313 00:34:21.038765 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.038854 kubelet[2792]: E0313 00:34:21.038776 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.039153 kubelet[2792]: E0313 00:34:21.039134 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.039153 kubelet[2792]: W0313 00:34:21.039145 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.039267 kubelet[2792]: E0313 00:34:21.039155 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.039396 kubelet[2792]: E0313 00:34:21.039373 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.039487 kubelet[2792]: W0313 00:34:21.039384 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.039487 kubelet[2792]: E0313 00:34:21.039462 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.039815 kubelet[2792]: E0313 00:34:21.039766 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.039815 kubelet[2792]: W0313 00:34:21.039777 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.039815 kubelet[2792]: E0313 00:34:21.039787 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.040297 kubelet[2792]: E0313 00:34:21.040279 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.040297 kubelet[2792]: W0313 00:34:21.040292 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.040366 kubelet[2792]: E0313 00:34:21.040301 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.041168 kubelet[2792]: E0313 00:34:21.041147 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.041168 kubelet[2792]: W0313 00:34:21.041161 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.041254 kubelet[2792]: E0313 00:34:21.041186 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.041448 kubelet[2792]: E0313 00:34:21.041422 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.041448 kubelet[2792]: W0313 00:34:21.041437 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.041520 kubelet[2792]: E0313 00:34:21.041450 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.041728 kubelet[2792]: E0313 00:34:21.041707 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.041728 kubelet[2792]: W0313 00:34:21.041720 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.041787 kubelet[2792]: E0313 00:34:21.041731 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.041950 kubelet[2792]: E0313 00:34:21.041933 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.041950 kubelet[2792]: W0313 00:34:21.041946 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.041995 kubelet[2792]: E0313 00:34:21.041980 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.042281 kubelet[2792]: E0313 00:34:21.042259 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.042281 kubelet[2792]: W0313 00:34:21.042271 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.042281 kubelet[2792]: E0313 00:34:21.042281 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.042796 kubelet[2792]: E0313 00:34:21.042779 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.042796 kubelet[2792]: W0313 00:34:21.042793 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.042850 kubelet[2792]: E0313 00:34:21.042803 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.043389 kubelet[2792]: E0313 00:34:21.043367 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.043389 kubelet[2792]: W0313 00:34:21.043382 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.043449 kubelet[2792]: E0313 00:34:21.043391 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.043610 kubelet[2792]: E0313 00:34:21.043593 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.043610 kubelet[2792]: W0313 00:34:21.043604 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.043662 kubelet[2792]: E0313 00:34:21.043641 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.043921 kubelet[2792]: E0313 00:34:21.043908 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.043921 kubelet[2792]: W0313 00:34:21.043920 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.043975 kubelet[2792]: E0313 00:34:21.043930 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.044539 kubelet[2792]: E0313 00:34:21.044516 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.044539 kubelet[2792]: W0313 00:34:21.044528 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.044539 kubelet[2792]: E0313 00:34:21.044538 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.044791 kubelet[2792]: E0313 00:34:21.044765 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.044791 kubelet[2792]: W0313 00:34:21.044779 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.044791 kubelet[2792]: E0313 00:34:21.044792 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:21.915092 kubelet[2792]: E0313 00:34:21.915006 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:22.008496 kubelet[2792]: I0313 00:34:22.008431 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:22.023459 kubelet[2792]: E0313 00:34:22.023199 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.023459 kubelet[2792]: W0313 00:34:22.023231 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.023459 kubelet[2792]: E0313 00:34:22.023267 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.024041 kubelet[2792]: E0313 00:34:22.023638 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.024041 kubelet[2792]: W0313 00:34:22.023650 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.024041 kubelet[2792]: E0313 00:34:22.023666 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.024041 kubelet[2792]: E0313 00:34:22.023963 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.024041 kubelet[2792]: W0313 00:34:22.023974 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.024041 kubelet[2792]: E0313 00:34:22.023987 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.024771 kubelet[2792]: E0313 00:34:22.024742 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.024771 kubelet[2792]: W0313 00:34:22.024762 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.024833 kubelet[2792]: E0313 00:34:22.024779 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.025826 kubelet[2792]: E0313 00:34:22.025780 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.025944 kubelet[2792]: W0313 00:34:22.025806 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.025944 kubelet[2792]: E0313 00:34:22.025868 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.026403 kubelet[2792]: E0313 00:34:22.026332 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.026403 kubelet[2792]: W0313 00:34:22.026352 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.026403 kubelet[2792]: E0313 00:34:22.026367 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.026886 kubelet[2792]: E0313 00:34:22.026763 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.026886 kubelet[2792]: W0313 00:34:22.026778 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.026886 kubelet[2792]: E0313 00:34:22.026792 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.027533 kubelet[2792]: E0313 00:34:22.027484 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.027533 kubelet[2792]: W0313 00:34:22.027504 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.027533 kubelet[2792]: E0313 00:34:22.027519 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.028270 kubelet[2792]: E0313 00:34:22.028171 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.028270 kubelet[2792]: W0313 00:34:22.028213 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.028270 kubelet[2792]: E0313 00:34:22.028230 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.028704 kubelet[2792]: E0313 00:34:22.028572 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.028704 kubelet[2792]: W0313 00:34:22.028607 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.028704 kubelet[2792]: E0313 00:34:22.028620 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.029244 kubelet[2792]: E0313 00:34:22.029010 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.029244 kubelet[2792]: W0313 00:34:22.029022 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.029244 kubelet[2792]: E0313 00:34:22.029035 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.029450 kubelet[2792]: E0313 00:34:22.029428 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.029450 kubelet[2792]: W0313 00:34:22.029446 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.029567 kubelet[2792]: E0313 00:34:22.029460 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.029821 kubelet[2792]: E0313 00:34:22.029785 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.029821 kubelet[2792]: W0313 00:34:22.029806 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.029821 kubelet[2792]: E0313 00:34:22.029818 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.030330 kubelet[2792]: E0313 00:34:22.030296 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.030330 kubelet[2792]: W0313 00:34:22.030315 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.030330 kubelet[2792]: E0313 00:34:22.030329 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.030791 kubelet[2792]: E0313 00:34:22.030730 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.030791 kubelet[2792]: W0313 00:34:22.030766 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.030984 kubelet[2792]: E0313 00:34:22.030825 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.049458 kubelet[2792]: E0313 00:34:22.049359 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.049458 kubelet[2792]: W0313 00:34:22.049385 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.049458 kubelet[2792]: E0313 00:34:22.049405 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.050301 kubelet[2792]: E0313 00:34:22.049655 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.050301 kubelet[2792]: W0313 00:34:22.049662 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.050301 kubelet[2792]: E0313 00:34:22.049669 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.050301 kubelet[2792]: E0313 00:34:22.049849 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.050301 kubelet[2792]: W0313 00:34:22.049856 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.050301 kubelet[2792]: E0313 00:34:22.049864 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.050762 kubelet[2792]: E0313 00:34:22.050687 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.050762 kubelet[2792]: W0313 00:34:22.050737 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.050762 kubelet[2792]: E0313 00:34:22.050768 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.051225 kubelet[2792]: E0313 00:34:22.051202 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.051264 kubelet[2792]: W0313 00:34:22.051224 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.051264 kubelet[2792]: E0313 00:34:22.051242 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.051600 kubelet[2792]: E0313 00:34:22.051580 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.051630 kubelet[2792]: W0313 00:34:22.051600 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.051630 kubelet[2792]: E0313 00:34:22.051615 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.052171 kubelet[2792]: E0313 00:34:22.052085 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.052171 kubelet[2792]: W0313 00:34:22.052144 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.052171 kubelet[2792]: E0313 00:34:22.052160 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.052642 kubelet[2792]: E0313 00:34:22.052602 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.052642 kubelet[2792]: W0313 00:34:22.052628 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.052786 kubelet[2792]: E0313 00:34:22.052651 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.053224 kubelet[2792]: E0313 00:34:22.053149 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.053224 kubelet[2792]: W0313 00:34:22.053169 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.053224 kubelet[2792]: E0313 00:34:22.053214 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.053996 kubelet[2792]: E0313 00:34:22.053863 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.053996 kubelet[2792]: W0313 00:34:22.053877 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.053996 kubelet[2792]: E0313 00:34:22.053891 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.054259 kubelet[2792]: E0313 00:34:22.054247 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.054339 kubelet[2792]: W0313 00:34:22.054306 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.054339 kubelet[2792]: E0313 00:34:22.054323 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.054570 kubelet[2792]: E0313 00:34:22.054546 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.054570 kubelet[2792]: W0313 00:34:22.054559 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.054711 kubelet[2792]: E0313 00:34:22.054571 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.054901 kubelet[2792]: E0313 00:34:22.054863 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.054901 kubelet[2792]: W0313 00:34:22.054887 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.054979 kubelet[2792]: E0313 00:34:22.054905 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.055333 kubelet[2792]: E0313 00:34:22.055302 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.055333 kubelet[2792]: W0313 00:34:22.055324 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.055408 kubelet[2792]: E0313 00:34:22.055340 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.055713 kubelet[2792]: E0313 00:34:22.055657 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.055713 kubelet[2792]: W0313 00:34:22.055671 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.055713 kubelet[2792]: E0313 00:34:22.055682 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.056207 kubelet[2792]: E0313 00:34:22.056012 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.056207 kubelet[2792]: W0313 00:34:22.056026 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.056207 kubelet[2792]: E0313 00:34:22.056037 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.056388 kubelet[2792]: E0313 00:34:22.056375 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.056465 kubelet[2792]: W0313 00:34:22.056432 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.056465 kubelet[2792]: E0313 00:34:22.056449 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:22.057079 kubelet[2792]: E0313 00:34:22.057047 2792 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:22.057079 kubelet[2792]: W0313 00:34:22.057072 2792 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:22.057206 kubelet[2792]: E0313 00:34:22.057092 2792 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:22.238512 containerd[1620]: time="2026-03-13T00:34:22.237468472Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:22.240040 containerd[1620]: time="2026-03-13T00:34:22.239998169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 13 00:34:22.241547 containerd[1620]: time="2026-03-13T00:34:22.241317342Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:22.244200 containerd[1620]: time="2026-03-13T00:34:22.244140877Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:22.244894 containerd[1620]: time="2026-03-13T00:34:22.244748843Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.705969546s" Mar 13 00:34:22.244894 containerd[1620]: time="2026-03-13T00:34:22.244789013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 13 00:34:22.252143 containerd[1620]: time="2026-03-13T00:34:22.251695086Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 13 00:34:22.264597 containerd[1620]: time="2026-03-13T00:34:22.263914101Z" level=info msg="Container ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:22.279071 containerd[1620]: time="2026-03-13T00:34:22.278989081Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590\"" Mar 13 00:34:22.281462 containerd[1620]: time="2026-03-13T00:34:22.281401898Z" level=info msg="StartContainer for \"ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590\"" Mar 13 00:34:22.283426 containerd[1620]: time="2026-03-13T00:34:22.283371426Z" level=info msg="connecting to shim ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590" address="unix:///run/containerd/s/a98bd5bee0c75ad22eaf3379f60638bccd299ac76445f4bd02fd8ae4af7659c0" protocol=ttrpc version=3 Mar 13 00:34:22.307416 systemd[1]: Started cri-containerd-ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590.scope - libcontainer container 
ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590. Mar 13 00:34:22.387275 containerd[1620]: time="2026-03-13T00:34:22.387220551Z" level=info msg="StartContainer for \"ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590\" returns successfully" Mar 13 00:34:22.405669 systemd[1]: cri-containerd-ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590.scope: Deactivated successfully. Mar 13 00:34:22.410595 containerd[1620]: time="2026-03-13T00:34:22.410539236Z" level=info msg="received container exit event container_id:\"ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590\" id:\"ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590\" pid:3481 exited_at:{seconds:1773362062 nanos:409954170}" Mar 13 00:34:22.438845 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ab83599da6fba58499568bfd5c63ac11b60787978cc293e031a8f85b984e2590-rootfs.mount: Deactivated successfully. Mar 13 00:34:23.017324 containerd[1620]: time="2026-03-13T00:34:23.016510810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 13 00:34:23.914139 kubelet[2792]: E0313 00:34:23.913946 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:25.914584 kubelet[2792]: E0313 00:34:25.914533 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:27.914301 kubelet[2792]: E0313 00:34:27.914258 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:28.956749 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1094879078.mount: Deactivated successfully. Mar 13 00:34:28.990271 containerd[1620]: time="2026-03-13T00:34:28.990191789Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:28.991316 containerd[1620]: time="2026-03-13T00:34:28.991286175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 13 00:34:28.992188 containerd[1620]: time="2026-03-13T00:34:28.992147582Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:28.993916 containerd[1620]: time="2026-03-13T00:34:28.993882847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:28.994463 containerd[1620]: time="2026-03-13T00:34:28.994306575Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 5.977307658s" Mar 13 00:34:28.994463 containerd[1620]: time="2026-03-13T00:34:28.994333505Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" 
Mar 13 00:34:28.999045 containerd[1620]: time="2026-03-13T00:34:28.999009299Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 13 00:34:29.009438 containerd[1620]: time="2026-03-13T00:34:29.009401837Z" level=info msg="Container 7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:29.014425 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2349097295.mount: Deactivated successfully. Mar 13 00:34:29.019972 containerd[1620]: time="2026-03-13T00:34:29.019926883Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f\"" Mar 13 00:34:29.020640 containerd[1620]: time="2026-03-13T00:34:29.020620732Z" level=info msg="StartContainer for \"7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f\"" Mar 13 00:34:29.021815 containerd[1620]: time="2026-03-13T00:34:29.021792078Z" level=info msg="connecting to shim 7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f" address="unix:///run/containerd/s/a98bd5bee0c75ad22eaf3379f60638bccd299ac76445f4bd02fd8ae4af7659c0" protocol=ttrpc version=3 Mar 13 00:34:29.046282 systemd[1]: Started cri-containerd-7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f.scope - libcontainer container 7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f. 
Mar 13 00:34:29.110608 containerd[1620]: time="2026-03-13T00:34:29.110557192Z" level=info msg="StartContainer for \"7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f\" returns successfully" Mar 13 00:34:29.148897 systemd[1]: cri-containerd-7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f.scope: Deactivated successfully. Mar 13 00:34:29.150562 containerd[1620]: time="2026-03-13T00:34:29.150249099Z" level=info msg="received container exit event container_id:\"7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f\" id:\"7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f\" pid:3535 exited_at:{seconds:1773362069 nanos:150026870}" Mar 13 00:34:29.914406 kubelet[2792]: E0313 00:34:29.914316 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391" Mar 13 00:34:29.957431 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7dafc3c8d95b6df2d5a94fbe6d3ae62fa1852f4180c9e065faa5b9f34e875c8f-rootfs.mount: Deactivated successfully. 
Mar 13 00:34:30.048132 containerd[1620]: time="2026-03-13T00:34:30.047366547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 13 00:34:31.914761 kubelet[2792]: E0313 00:34:31.914660 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391"
Mar 13 00:34:33.605772 containerd[1620]: time="2026-03-13T00:34:33.605713131Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:33.606665 containerd[1620]: time="2026-03-13T00:34:33.606547369Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671"
Mar 13 00:34:33.607334 containerd[1620]: time="2026-03-13T00:34:33.607282128Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:33.609333 containerd[1620]: time="2026-03-13T00:34:33.609291104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:33.610456 containerd[1620]: time="2026-03-13T00:34:33.609954392Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 3.562533776s"
Mar 13 00:34:33.610456 containerd[1620]: time="2026-03-13T00:34:33.609984261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\""
Mar 13 00:34:33.614970 containerd[1620]: time="2026-03-13T00:34:33.614922270Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 13 00:34:33.629138 containerd[1620]: time="2026-03-13T00:34:33.627273144Z" level=info msg="Container 2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:34:33.641538 containerd[1620]: time="2026-03-13T00:34:33.641464262Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf\""
Mar 13 00:34:33.643135 containerd[1620]: time="2026-03-13T00:34:33.642293280Z" level=info msg="StartContainer for \"2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf\""
Mar 13 00:34:33.644004 containerd[1620]: time="2026-03-13T00:34:33.643953366Z" level=info msg="connecting to shim 2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf" address="unix:///run/containerd/s/a98bd5bee0c75ad22eaf3379f60638bccd299ac76445f4bd02fd8ae4af7659c0" protocol=ttrpc version=3
Mar 13 00:34:33.677274 systemd[1]: Started cri-containerd-2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf.scope - libcontainer container 2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf.
Mar 13 00:34:33.745976 containerd[1620]: time="2026-03-13T00:34:33.745921201Z" level=info msg="StartContainer for \"2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf\" returns successfully"
Mar 13 00:34:33.914070 kubelet[2792]: E0313 00:34:33.913914 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r7g7s" podUID="be715fcd-c3e2-47e4-b475-20bdc4ec1391"
Mar 13 00:34:34.251883 systemd[1]: cri-containerd-2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf.scope: Deactivated successfully.
Mar 13 00:34:34.252280 systemd[1]: cri-containerd-2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf.scope: Consumed 481ms CPU time, 196.8M memory peak, 2.6M read from disk, 177M written to disk.
Mar 13 00:34:34.254333 containerd[1620]: time="2026-03-13T00:34:34.254303097Z" level=info msg="received container exit event container_id:\"2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf\" id:\"2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf\" pid:3594 exited_at:{seconds:1773362074 nanos:254065638}"
Mar 13 00:34:34.304888 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2efee8c5a7bf25e6acd84be89ec4a014ef552a12a35c2fc0edc96727bb42facf-rootfs.mount: Deactivated successfully.
Mar 13 00:34:34.333130 kubelet[2792]: I0313 00:34:34.331945 2792 kubelet_node_status.go:427] "Fast updating node status as it just became ready"
Mar 13 00:34:34.372489 systemd[1]: Created slice kubepods-burstable-pod988277de_d582_401c_833d_eea6558241b9.slice - libcontainer container kubepods-burstable-pod988277de_d582_401c_833d_eea6558241b9.slice.
Mar 13 00:34:34.387465 systemd[1]: Created slice kubepods-burstable-pod831fd1c0_4b12_4ba4_b935_44636352b176.slice - libcontainer container kubepods-burstable-pod831fd1c0_4b12_4ba4_b935_44636352b176.slice.
Mar 13 00:34:34.399980 systemd[1]: Created slice kubepods-besteffort-podb882fb70_ad4a_4262_8db1_90f940d6b91e.slice - libcontainer container kubepods-besteffort-podb882fb70_ad4a_4262_8db1_90f940d6b91e.slice.
Mar 13 00:34:34.410690 systemd[1]: Created slice kubepods-besteffort-podf89ecdfa_6054_4fb2_b6f9_3c1b8fdaf628.slice - libcontainer container kubepods-besteffort-podf89ecdfa_6054_4fb2_b6f9_3c1b8fdaf628.slice.
Mar 13 00:34:34.421297 systemd[1]: Created slice kubepods-besteffort-podd52b7e65_e88d_43f4_9c3d_b04fe7aef49f.slice - libcontainer container kubepods-besteffort-podd52b7e65_e88d_43f4_9c3d_b04fe7aef49f.slice.
Mar 13 00:34:34.428977 systemd[1]: Created slice kubepods-besteffort-pod37779b7e_6012_4701_98cf_613605a31477.slice - libcontainer container kubepods-besteffort-pod37779b7e_6012_4701_98cf_613605a31477.slice.
Mar 13 00:34:34.436884 systemd[1]: Created slice kubepods-besteffort-pod7dbc3d6d_6370_496b_9a6f_d93cadf0acb6.slice - libcontainer container kubepods-besteffort-pod7dbc3d6d_6370_496b_9a6f_d93cadf0acb6.slice.
Mar 13 00:34:34.443697 kubelet[2792]: I0313 00:34:34.443652 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7dbc3d6d-6370-496b-9a6f-d93cadf0acb6-calico-apiserver-certs\") pod \"calico-apiserver-57869c95c-cw5r5\" (UID: \"7dbc3d6d-6370-496b-9a6f-d93cadf0acb6\") " pod="calico-system/calico-apiserver-57869c95c-cw5r5"
Mar 13 00:34:34.443697 kubelet[2792]: I0313 00:34:34.443686 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbc2\" (UniqueName: \"kubernetes.io/projected/7dbc3d6d-6370-496b-9a6f-d93cadf0acb6-kube-api-access-mcbc2\") pod \"calico-apiserver-57869c95c-cw5r5\" (UID: \"7dbc3d6d-6370-496b-9a6f-d93cadf0acb6\") " pod="calico-system/calico-apiserver-57869c95c-cw5r5"
Mar 13 00:34:34.443697 kubelet[2792]: I0313 00:34:34.443699 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv6xl\" (UniqueName: \"kubernetes.io/projected/831fd1c0-4b12-4ba4-b935-44636352b176-kube-api-access-jv6xl\") pod \"coredns-7d764666f9-rts4t\" (UID: \"831fd1c0-4b12-4ba4-b935-44636352b176\") " pod="kube-system/coredns-7d764666f9-rts4t"
Mar 13 00:34:34.443697 kubelet[2792]: I0313 00:34:34.443714 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d52b7e65-e88d-43f4-9c3d-b04fe7aef49f-config\") pod \"goldmane-9f7667bb8-zl7lf\" (UID: \"d52b7e65-e88d-43f4-9c3d-b04fe7aef49f\") " pod="calico-system/goldmane-9f7667bb8-zl7lf"
Mar 13 00:34:34.443971 kubelet[2792]: I0313 00:34:34.443728 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/37779b7e-6012-4701-98cf-613605a31477-calico-apiserver-certs\") pod \"calico-apiserver-57869c95c-7vd8k\" (UID: \"37779b7e-6012-4701-98cf-613605a31477\") " pod="calico-system/calico-apiserver-57869c95c-7vd8k"
Mar 13 00:34:34.443971 kubelet[2792]: I0313 00:34:34.443742 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628-tigera-ca-bundle\") pod \"calico-kube-controllers-7f89cbb487-v9qld\" (UID: \"f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628\") " pod="calico-system/calico-kube-controllers-7f89cbb487-v9qld"
Mar 13 00:34:34.443971 kubelet[2792]: I0313 00:34:34.443754 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d52b7e65-e88d-43f4-9c3d-b04fe7aef49f-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-zl7lf\" (UID: \"d52b7e65-e88d-43f4-9c3d-b04fe7aef49f\") " pod="calico-system/goldmane-9f7667bb8-zl7lf"
Mar 13 00:34:34.443971 kubelet[2792]: I0313 00:34:34.443766 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5twkp\" (UniqueName: \"kubernetes.io/projected/b882fb70-ad4a-4262-8db1-90f940d6b91e-kube-api-access-5twkp\") pod \"whisker-59b5c4f6bf-ctj6q\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " pod="calico-system/whisker-59b5c4f6bf-ctj6q"
Mar 13 00:34:34.443971 kubelet[2792]: I0313 00:34:34.443779 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crfw7\" (UniqueName: \"kubernetes.io/projected/988277de-d582-401c-833d-eea6558241b9-kube-api-access-crfw7\") pod \"coredns-7d764666f9-nmbnx\" (UID: \"988277de-d582-401c-833d-eea6558241b9\") " pod="kube-system/coredns-7d764666f9-nmbnx"
Mar 13 00:34:34.444251 kubelet[2792]: I0313 00:34:34.443791 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/831fd1c0-4b12-4ba4-b935-44636352b176-config-volume\") pod \"coredns-7d764666f9-rts4t\" (UID: \"831fd1c0-4b12-4ba4-b935-44636352b176\") " pod="kube-system/coredns-7d764666f9-rts4t"
Mar 13 00:34:34.444251 kubelet[2792]: I0313 00:34:34.443803 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsn7w\" (UniqueName: \"kubernetes.io/projected/37779b7e-6012-4701-98cf-613605a31477-kube-api-access-zsn7w\") pod \"calico-apiserver-57869c95c-7vd8k\" (UID: \"37779b7e-6012-4701-98cf-613605a31477\") " pod="calico-system/calico-apiserver-57869c95c-7vd8k"
Mar 13 00:34:34.444251 kubelet[2792]: I0313 00:34:34.443815 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-nginx-config\") pod \"whisker-59b5c4f6bf-ctj6q\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " pod="calico-system/whisker-59b5c4f6bf-ctj6q"
Mar 13 00:34:34.444251 kubelet[2792]: I0313 00:34:34.443827 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-backend-key-pair\") pod \"whisker-59b5c4f6bf-ctj6q\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " pod="calico-system/whisker-59b5c4f6bf-ctj6q"
Mar 13 00:34:34.444251 kubelet[2792]: I0313 00:34:34.443838 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-ca-bundle\") pod \"whisker-59b5c4f6bf-ctj6q\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " pod="calico-system/whisker-59b5c4f6bf-ctj6q"
Mar 13 00:34:34.444440 kubelet[2792]: I0313 00:34:34.443852 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccb8\" (UniqueName: \"kubernetes.io/projected/d52b7e65-e88d-43f4-9c3d-b04fe7aef49f-kube-api-access-bccb8\") pod \"goldmane-9f7667bb8-zl7lf\" (UID: \"d52b7e65-e88d-43f4-9c3d-b04fe7aef49f\") " pod="calico-system/goldmane-9f7667bb8-zl7lf"
Mar 13 00:34:34.444440 kubelet[2792]: I0313 00:34:34.443866 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/988277de-d582-401c-833d-eea6558241b9-config-volume\") pod \"coredns-7d764666f9-nmbnx\" (UID: \"988277de-d582-401c-833d-eea6558241b9\") " pod="kube-system/coredns-7d764666f9-nmbnx"
Mar 13 00:34:34.444440 kubelet[2792]: I0313 00:34:34.443881 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qphk\" (UniqueName: \"kubernetes.io/projected/f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628-kube-api-access-5qphk\") pod \"calico-kube-controllers-7f89cbb487-v9qld\" (UID: \"f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628\") " pod="calico-system/calico-kube-controllers-7f89cbb487-v9qld"
Mar 13 00:34:34.444440 kubelet[2792]: I0313 00:34:34.443893 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/d52b7e65-e88d-43f4-9c3d-b04fe7aef49f-goldmane-key-pair\") pod \"goldmane-9f7667bb8-zl7lf\" (UID: \"d52b7e65-e88d-43f4-9c3d-b04fe7aef49f\") " pod="calico-system/goldmane-9f7667bb8-zl7lf"
Mar 13 00:34:34.684714 containerd[1620]: time="2026-03-13T00:34:34.684664329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-nmbnx,Uid:988277de-d582-401c-833d-eea6558241b9,Namespace:kube-system,Attempt:0,}"
Mar 13 00:34:34.697988 containerd[1620]: time="2026-03-13T00:34:34.697936112Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rts4t,Uid:831fd1c0-4b12-4ba4-b935-44636352b176,Namespace:kube-system,Attempt:0,}"
Mar 13 00:34:34.711916 containerd[1620]: time="2026-03-13T00:34:34.711793564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59b5c4f6bf-ctj6q,Uid:b882fb70-ad4a-4262-8db1-90f940d6b91e,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.724351 containerd[1620]: time="2026-03-13T00:34:34.722863522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f89cbb487-v9qld,Uid:f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.731829 containerd[1620]: time="2026-03-13T00:34:34.731788374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-zl7lf,Uid:d52b7e65-e88d-43f4-9c3d-b04fe7aef49f,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.737781 containerd[1620]: time="2026-03-13T00:34:34.737739662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-7vd8k,Uid:37779b7e-6012-4701-98cf-613605a31477,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.757503 containerd[1620]: time="2026-03-13T00:34:34.757421352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-cw5r5,Uid:7dbc3d6d-6370-496b-9a6f-d93cadf0acb6,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.944857 containerd[1620]: time="2026-03-13T00:34:34.944388355Z" level=error msg="Failed to destroy network for sandbox \"88e9af6a55afa8313e782e6dc46edd4dfbfe9762587fbfe01023ec3705c08169\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.947909 containerd[1620]: time="2026-03-13T00:34:34.947864639Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59b5c4f6bf-ctj6q,Uid:b882fb70-ad4a-4262-8db1-90f940d6b91e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"88e9af6a55afa8313e782e6dc46edd4dfbfe9762587fbfe01023ec3705c08169\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.948273 kubelet[2792]: E0313 00:34:34.948222 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88e9af6a55afa8313e782e6dc46edd4dfbfe9762587fbfe01023ec3705c08169\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.949247 kubelet[2792]: E0313 00:34:34.948844 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88e9af6a55afa8313e782e6dc46edd4dfbfe9762587fbfe01023ec3705c08169\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59b5c4f6bf-ctj6q"
Mar 13 00:34:34.949247 kubelet[2792]: E0313 00:34:34.948874 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88e9af6a55afa8313e782e6dc46edd4dfbfe9762587fbfe01023ec3705c08169\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59b5c4f6bf-ctj6q"
Mar 13 00:34:34.949247 kubelet[2792]: E0313 00:34:34.948935 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-59b5c4f6bf-ctj6q_calico-system(b882fb70-ad4a-4262-8db1-90f940d6b91e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-59b5c4f6bf-ctj6q_calico-system(b882fb70-ad4a-4262-8db1-90f940d6b91e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88e9af6a55afa8313e782e6dc46edd4dfbfe9762587fbfe01023ec3705c08169\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-59b5c4f6bf-ctj6q" podUID="b882fb70-ad4a-4262-8db1-90f940d6b91e"
Mar 13 00:34:34.954204 containerd[1620]: time="2026-03-13T00:34:34.954091356Z" level=error msg="Failed to destroy network for sandbox \"176045091214c0180edbb96d56badfee6e80482aa99aa0d6c31607291aca535a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.956227 containerd[1620]: time="2026-03-13T00:34:34.956172882Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f89cbb487-v9qld,Uid:f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"176045091214c0180edbb96d56badfee6e80482aa99aa0d6c31607291aca535a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.956730 kubelet[2792]: E0313 00:34:34.956386 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"176045091214c0180edbb96d56badfee6e80482aa99aa0d6c31607291aca535a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.956730 kubelet[2792]: E0313 00:34:34.956441 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"176045091214c0180edbb96d56badfee6e80482aa99aa0d6c31607291aca535a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7f89cbb487-v9qld"
Mar 13 00:34:34.956730 kubelet[2792]: E0313 00:34:34.956461 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"176045091214c0180edbb96d56badfee6e80482aa99aa0d6c31607291aca535a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7f89cbb487-v9qld"
Mar 13 00:34:34.956845 kubelet[2792]: E0313 00:34:34.956527 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7f89cbb487-v9qld_calico-system(f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7f89cbb487-v9qld_calico-system(f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"176045091214c0180edbb96d56badfee6e80482aa99aa0d6c31607291aca535a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7f89cbb487-v9qld" podUID="f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628"
Mar 13 00:34:34.964367 containerd[1620]: time="2026-03-13T00:34:34.964315855Z" level=error msg="Failed to destroy network for sandbox \"a5671cff03cc2f20809c5542b75c72e3bc3dcccc61a0df39cf0fb378f1938f9a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.966128 containerd[1620]: time="2026-03-13T00:34:34.965892002Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-zl7lf,Uid:d52b7e65-e88d-43f4-9c3d-b04fe7aef49f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5671cff03cc2f20809c5542b75c72e3bc3dcccc61a0df39cf0fb378f1938f9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.966265 kubelet[2792]: E0313 00:34:34.966085 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5671cff03cc2f20809c5542b75c72e3bc3dcccc61a0df39cf0fb378f1938f9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.966265 kubelet[2792]: E0313 00:34:34.966142 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5671cff03cc2f20809c5542b75c72e3bc3dcccc61a0df39cf0fb378f1938f9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-zl7lf"
Mar 13 00:34:34.966265 kubelet[2792]: E0313 00:34:34.966158 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5671cff03cc2f20809c5542b75c72e3bc3dcccc61a0df39cf0fb378f1938f9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-zl7lf"
Mar 13 00:34:34.966393 kubelet[2792]: E0313 00:34:34.966216 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-zl7lf_calico-system(d52b7e65-e88d-43f4-9c3d-b04fe7aef49f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-zl7lf_calico-system(d52b7e65-e88d-43f4-9c3d-b04fe7aef49f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a5671cff03cc2f20809c5542b75c72e3bc3dcccc61a0df39cf0fb378f1938f9a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-zl7lf" podUID="d52b7e65-e88d-43f4-9c3d-b04fe7aef49f"
Mar 13 00:34:34.970906 containerd[1620]: time="2026-03-13T00:34:34.970752892Z" level=error msg="Failed to destroy network for sandbox \"13d018ca7d943b58bcea573f7cf0ebcf76c0954b2d1fb68c830f6965e6cd4d67\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.974751 containerd[1620]: time="2026-03-13T00:34:34.974695924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-nmbnx,Uid:988277de-d582-401c-833d-eea6558241b9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"13d018ca7d943b58bcea573f7cf0ebcf76c0954b2d1fb68c830f6965e6cd4d67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.975532 kubelet[2792]: E0313 00:34:34.974925 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13d018ca7d943b58bcea573f7cf0ebcf76c0954b2d1fb68c830f6965e6cd4d67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.975532 kubelet[2792]: E0313 00:34:34.974975 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13d018ca7d943b58bcea573f7cf0ebcf76c0954b2d1fb68c830f6965e6cd4d67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-nmbnx"
Mar 13 00:34:34.975532 kubelet[2792]: E0313 00:34:34.975003 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13d018ca7d943b58bcea573f7cf0ebcf76c0954b2d1fb68c830f6965e6cd4d67\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-nmbnx"
Mar 13 00:34:34.975677 kubelet[2792]: E0313 00:34:34.975047 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-nmbnx_kube-system(988277de-d582-401c-833d-eea6558241b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-nmbnx_kube-system(988277de-d582-401c-833d-eea6558241b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13d018ca7d943b58bcea573f7cf0ebcf76c0954b2d1fb68c830f6965e6cd4d67\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-nmbnx" podUID="988277de-d582-401c-833d-eea6558241b9"
Mar 13 00:34:34.982385 containerd[1620]: time="2026-03-13T00:34:34.982334038Z" level=error msg="Failed to destroy network for sandbox \"9436e5e7b48142d4fde5ce706f8fe778c61a55cebda521582257209b8bf85143\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.985156 containerd[1620]: time="2026-03-13T00:34:34.985094053Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-cw5r5,Uid:7dbc3d6d-6370-496b-9a6f-d93cadf0acb6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9436e5e7b48142d4fde5ce706f8fe778c61a55cebda521582257209b8bf85143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.985404 kubelet[2792]: E0313 00:34:34.985370 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9436e5e7b48142d4fde5ce706f8fe778c61a55cebda521582257209b8bf85143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.985460 kubelet[2792]: E0313 00:34:34.985422 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9436e5e7b48142d4fde5ce706f8fe778c61a55cebda521582257209b8bf85143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-57869c95c-cw5r5"
Mar 13 00:34:34.985460 kubelet[2792]: E0313 00:34:34.985439 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9436e5e7b48142d4fde5ce706f8fe778c61a55cebda521582257209b8bf85143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-57869c95c-cw5r5"
Mar 13 00:34:34.985519 kubelet[2792]: E0313 00:34:34.985481 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57869c95c-cw5r5_calico-system(7dbc3d6d-6370-496b-9a6f-d93cadf0acb6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57869c95c-cw5r5_calico-system(7dbc3d6d-6370-496b-9a6f-d93cadf0acb6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9436e5e7b48142d4fde5ce706f8fe778c61a55cebda521582257209b8bf85143\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-57869c95c-cw5r5" podUID="7dbc3d6d-6370-496b-9a6f-d93cadf0acb6"
Mar 13 00:34:34.987779 containerd[1620]: time="2026-03-13T00:34:34.987737938Z" level=error msg="Failed to destroy network for sandbox \"c292d1c3eef12281110aaf413b572bc9e2ede3edefcbd9ab3c11cf11663205eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.988949 containerd[1620]: time="2026-03-13T00:34:34.988925115Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-7vd8k,Uid:37779b7e-6012-4701-98cf-613605a31477,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c292d1c3eef12281110aaf413b572bc9e2ede3edefcbd9ab3c11cf11663205eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.989253 containerd[1620]: time="2026-03-13T00:34:34.989114335Z" level=error msg="Failed to destroy network for sandbox \"aa727e55117763b215bff061bf4a86dded6d3bd1460d72166fba886c0333e3c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.989611 kubelet[2792]: E0313 00:34:34.989546 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c292d1c3eef12281110aaf413b572bc9e2ede3edefcbd9ab3c11cf11663205eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.989668 kubelet[2792]: E0313 00:34:34.989616 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c292d1c3eef12281110aaf413b572bc9e2ede3edefcbd9ab3c11cf11663205eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-57869c95c-7vd8k"
Mar 13 00:34:34.989763 kubelet[2792]: E0313 00:34:34.989639 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c292d1c3eef12281110aaf413b572bc9e2ede3edefcbd9ab3c11cf11663205eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-57869c95c-7vd8k"
Mar 13 00:34:34.989810 kubelet[2792]: E0313 00:34:34.989758 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57869c95c-7vd8k_calico-system(37779b7e-6012-4701-98cf-613605a31477)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57869c95c-7vd8k_calico-system(37779b7e-6012-4701-98cf-613605a31477)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c292d1c3eef12281110aaf413b572bc9e2ede3edefcbd9ab3c11cf11663205eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-57869c95c-7vd8k" podUID="37779b7e-6012-4701-98cf-613605a31477"
Mar 13 00:34:34.990550 containerd[1620]: time="2026-03-13T00:34:34.990508012Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rts4t,Uid:831fd1c0-4b12-4ba4-b935-44636352b176,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa727e55117763b215bff061bf4a86dded6d3bd1460d72166fba886c0333e3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.990862 kubelet[2792]: E0313 00:34:34.990827 2792 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error:
code = Unknown desc = failed to setup network for sandbox \"aa727e55117763b215bff061bf4a86dded6d3bd1460d72166fba886c0333e3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:34:34.990971 kubelet[2792]: E0313 00:34:34.990945 2792 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa727e55117763b215bff061bf4a86dded6d3bd1460d72166fba886c0333e3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-rts4t" Mar 13 00:34:34.991053 kubelet[2792]: E0313 00:34:34.991027 2792 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa727e55117763b215bff061bf4a86dded6d3bd1460d72166fba886c0333e3c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-rts4t" Mar 13 00:34:34.991215 kubelet[2792]: E0313 00:34:34.991191 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-rts4t_kube-system(831fd1c0-4b12-4ba4-b935-44636352b176)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-rts4t_kube-system(831fd1c0-4b12-4ba4-b935-44636352b176)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa727e55117763b215bff061bf4a86dded6d3bd1460d72166fba886c0333e3c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-7d764666f9-rts4t" podUID="831fd1c0-4b12-4ba4-b935-44636352b176" Mar 13 00:34:35.098650 containerd[1620]: time="2026-03-13T00:34:35.098604291Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 13 00:34:35.106888 containerd[1620]: time="2026-03-13T00:34:35.106837246Z" level=info msg="Container 6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:35.115017 containerd[1620]: time="2026-03-13T00:34:35.114973221Z" level=info msg="CreateContainer within sandbox \"8f4c3dbb5f984a8d35104e6791b582e6ffeab489eb2706e13dfeec1663950510\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d\"" Mar 13 00:34:35.115933 containerd[1620]: time="2026-03-13T00:34:35.115784060Z" level=info msg="StartContainer for \"6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d\"" Mar 13 00:34:35.117829 containerd[1620]: time="2026-03-13T00:34:35.117794206Z" level=info msg="connecting to shim 6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d" address="unix:///run/containerd/s/a98bd5bee0c75ad22eaf3379f60638bccd299ac76445f4bd02fd8ae4af7659c0" protocol=ttrpc version=3 Mar 13 00:34:35.138319 systemd[1]: Started cri-containerd-6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d.scope - libcontainer container 6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d. 
Mar 13 00:34:35.231736 containerd[1620]: time="2026-03-13T00:34:35.231592777Z" level=info msg="StartContainer for \"6dd6a5be6b38e4f516585c4a3649beeb2a07bfb13383d7a3c4ce92065479745d\" returns successfully" Mar 13 00:34:35.452820 kubelet[2792]: I0313 00:34:35.452761 2792 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/b882fb70-ad4a-4262-8db1-90f940d6b91e-kube-api-access-5twkp\" (UniqueName: \"kubernetes.io/projected/b882fb70-ad4a-4262-8db1-90f940d6b91e-kube-api-access-5twkp\") pod \"b882fb70-ad4a-4262-8db1-90f940d6b91e\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " Mar 13 00:34:35.452820 kubelet[2792]: I0313 00:34:35.452815 2792 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-nginx-config\" (UniqueName: \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-nginx-config\") pod \"b882fb70-ad4a-4262-8db1-90f940d6b91e\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " Mar 13 00:34:35.453010 kubelet[2792]: I0313 00:34:35.452840 2792 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-ca-bundle\") pod \"b882fb70-ad4a-4262-8db1-90f940d6b91e\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " Mar 13 00:34:35.453010 kubelet[2792]: I0313 00:34:35.452898 2792 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-backend-key-pair\") pod \"b882fb70-ad4a-4262-8db1-90f940d6b91e\" (UID: \"b882fb70-ad4a-4262-8db1-90f940d6b91e\") " Mar 13 00:34:35.453681 kubelet[2792]: I0313 00:34:35.453545 2792 operation_generator.go:779] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-nginx-config" pod "b882fb70-ad4a-4262-8db1-90f940d6b91e" (UID: "b882fb70-ad4a-4262-8db1-90f940d6b91e"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:34:35.456359 kubelet[2792]: I0313 00:34:35.456208 2792 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-ca-bundle" pod "b882fb70-ad4a-4262-8db1-90f940d6b91e" (UID: "b882fb70-ad4a-4262-8db1-90f940d6b91e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:34:35.461440 kubelet[2792]: I0313 00:34:35.461330 2792 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b882fb70-ad4a-4262-8db1-90f940d6b91e-kube-api-access-5twkp" pod "b882fb70-ad4a-4262-8db1-90f940d6b91e" (UID: "b882fb70-ad4a-4262-8db1-90f940d6b91e"). InnerVolumeSpecName "kube-api-access-5twkp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 13 00:34:35.461690 kubelet[2792]: I0313 00:34:35.461450 2792 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-backend-key-pair" pod "b882fb70-ad4a-4262-8db1-90f940d6b91e" (UID: "b882fb70-ad4a-4262-8db1-90f940d6b91e"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 13 00:34:35.553705 kubelet[2792]: I0313 00:34:35.553536 2792 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5twkp\" (UniqueName: \"kubernetes.io/projected/b882fb70-ad4a-4262-8db1-90f940d6b91e-kube-api-access-5twkp\") on node \"ci-4459-2-4-n-7393fd8643\" DevicePath \"\"" Mar 13 00:34:35.553705 kubelet[2792]: I0313 00:34:35.553577 2792 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-nginx-config\") on node \"ci-4459-2-4-n-7393fd8643\" DevicePath \"\"" Mar 13 00:34:35.553705 kubelet[2792]: I0313 00:34:35.553589 2792 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-ca-bundle\") on node \"ci-4459-2-4-n-7393fd8643\" DevicePath \"\"" Mar 13 00:34:35.553705 kubelet[2792]: I0313 00:34:35.553601 2792 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b882fb70-ad4a-4262-8db1-90f940d6b91e-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-7393fd8643\" DevicePath \"\"" Mar 13 00:34:35.631349 systemd[1]: run-netns-cni\x2df60d2044\x2d8eb1\x2d0b2c\x2dc563\x2d362f2c06280a.mount: Deactivated successfully. Mar 13 00:34:35.631678 systemd[1]: run-netns-cni\x2d7262dd20\x2d6322\x2d26d2\x2d9630\x2d4f625ab7c364.mount: Deactivated successfully. Mar 13 00:34:35.631839 systemd[1]: run-netns-cni\x2defbd7a92\x2d70eb\x2d1f33\x2db0f7\x2dd6a735c7dcbf.mount: Deactivated successfully. Mar 13 00:34:35.631986 systemd[1]: run-netns-cni\x2d54278888\x2dfda3\x2de5e5\x2d33da\x2dbf1bde8a99ba.mount: Deactivated successfully. Mar 13 00:34:35.632163 systemd[1]: var-lib-kubelet-pods-b882fb70\x2dad4a\x2d4262\x2d8db1\x2d90f940d6b91e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5twkp.mount: Deactivated successfully. 
Mar 13 00:34:35.632347 systemd[1]: var-lib-kubelet-pods-b882fb70\x2dad4a\x2d4262\x2d8db1\x2d90f940d6b91e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 13 00:34:35.920231 systemd[1]: Created slice kubepods-besteffort-podbe715fcd_c3e2_47e4_b475_20bdc4ec1391.slice - libcontainer container kubepods-besteffort-podbe715fcd_c3e2_47e4_b475_20bdc4ec1391.slice. Mar 13 00:34:35.924914 containerd[1620]: time="2026-03-13T00:34:35.924863372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-r7g7s,Uid:be715fcd-c3e2-47e4-b475-20bdc4ec1391,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:36.056914 systemd-networkd[1483]: cali77490e6d45e: Link UP Mar 13 00:34:36.058682 systemd-networkd[1483]: cali77490e6d45e: Gained carrier Mar 13 00:34:36.079986 containerd[1620]: 2026-03-13 00:34:35.954 [ERROR][3874] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 13 00:34:36.079986 containerd[1620]: 2026-03-13 00:34:35.973 [INFO][3874] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0 csi-node-driver- calico-system be715fcd-c3e2-47e4-b475-20bdc4ec1391 690 0 2026-03-13 00:34:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 csi-node-driver-r7g7s eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali77490e6d45e [] [] }} ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" 
WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-" Mar 13 00:34:36.079986 containerd[1620]: 2026-03-13 00:34:35.973 [INFO][3874] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.079986 containerd[1620]: 2026-03-13 00:34:36.003 [INFO][3885] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" HandleID="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Workload="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.010 [INFO][3885] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" HandleID="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Workload="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd5e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"csi-node-driver-r7g7s", "timestamp":"2026-03-13 00:34:36.003719956 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00036f4a0)} Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.011 [INFO][3885] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.011 [INFO][3885] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.011 [INFO][3885] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.014 [INFO][3885] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.018 [INFO][3885] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.023 [INFO][3885] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.024 [INFO][3885] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080305 containerd[1620]: 2026-03-13 00:34:36.026 [INFO][3885] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.026 [INFO][3885] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.028 [INFO][3885] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.032 [INFO][3885] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.039 [INFO][3885] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.86.193/26] block=192.168.86.192/26 handle="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.039 [INFO][3885] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.193/26] handle="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.039 [INFO][3885] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:36.080559 containerd[1620]: 2026-03-13 00:34:36.039 [INFO][3885] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.193/26] IPv6=[] ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" HandleID="k8s-pod-network.7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Workload="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.080726 containerd[1620]: 2026-03-13 00:34:36.044 [INFO][3874] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"be715fcd-c3e2-47e4-b475-20bdc4ec1391", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"csi-node-driver-r7g7s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.86.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali77490e6d45e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:36.080792 containerd[1620]: 2026-03-13 00:34:36.044 [INFO][3874] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.193/32] ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.080792 containerd[1620]: 2026-03-13 00:34:36.044 [INFO][3874] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77490e6d45e ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.080792 containerd[1620]: 2026-03-13 00:34:36.056 [INFO][3874] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.080866 
containerd[1620]: 2026-03-13 00:34:36.057 [INFO][3874] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"be715fcd-c3e2-47e4-b475-20bdc4ec1391", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e", Pod:"csi-node-driver-r7g7s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.86.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali77490e6d45e", MAC:"b6:10:1e:02:c4:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:36.083792 containerd[1620]: 
2026-03-13 00:34:36.070 [INFO][3874] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" Namespace="calico-system" Pod="csi-node-driver-r7g7s" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-csi--node--driver--r7g7s-eth0" Mar 13 00:34:36.101486 systemd[1]: Removed slice kubepods-besteffort-podb882fb70_ad4a_4262_8db1_90f940d6b91e.slice - libcontainer container kubepods-besteffort-podb882fb70_ad4a_4262_8db1_90f940d6b91e.slice. Mar 13 00:34:36.117806 kubelet[2792]: I0313 00:34:36.117227 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-x2kz7" podStartSLOduration=2.35604913 podStartE2EDuration="19.117208917s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:18.312992659 +0000 UTC m=+19.552467952" lastFinishedPulling="2026-03-13 00:34:35.074152416 +0000 UTC m=+36.313627739" observedRunningTime="2026-03-13 00:34:36.115482679 +0000 UTC m=+37.354957972" watchObservedRunningTime="2026-03-13 00:34:36.117208917 +0000 UTC m=+37.356684210" Mar 13 00:34:36.153559 containerd[1620]: time="2026-03-13T00:34:36.153321067Z" level=info msg="connecting to shim 7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e" address="unix:///run/containerd/s/646e5b5b5104e8f08f01398cb23eb32c117f5db8f275e83a833fb699dc8c3b7c" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:36.191727 systemd[1]: Started cri-containerd-7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e.scope - libcontainer container 7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e. Mar 13 00:34:36.210330 systemd[1]: Created slice kubepods-besteffort-podac895386_b9be_47d2_bc3c_8c003b1eb1f3.slice - libcontainer container kubepods-besteffort-podac895386_b9be_47d2_bc3c_8c003b1eb1f3.slice. 
Mar 13 00:34:36.249416 containerd[1620]: time="2026-03-13T00:34:36.249353256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-r7g7s,Uid:be715fcd-c3e2-47e4-b475-20bdc4ec1391,Namespace:calico-system,Attempt:0,} returns sandbox id \"7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e\"" Mar 13 00:34:36.251619 containerd[1620]: time="2026-03-13T00:34:36.251571292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 13 00:34:36.260733 kubelet[2792]: I0313 00:34:36.260690 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ac895386-b9be-47d2-bc3c-8c003b1eb1f3-nginx-config\") pod \"whisker-5747684f86-wpv7c\" (UID: \"ac895386-b9be-47d2-bc3c-8c003b1eb1f3\") " pod="calico-system/whisker-5747684f86-wpv7c" Mar 13 00:34:36.261066 kubelet[2792]: I0313 00:34:36.260948 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ac895386-b9be-47d2-bc3c-8c003b1eb1f3-whisker-backend-key-pair\") pod \"whisker-5747684f86-wpv7c\" (UID: \"ac895386-b9be-47d2-bc3c-8c003b1eb1f3\") " pod="calico-system/whisker-5747684f86-wpv7c" Mar 13 00:34:36.261066 kubelet[2792]: I0313 00:34:36.260977 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac895386-b9be-47d2-bc3c-8c003b1eb1f3-whisker-ca-bundle\") pod \"whisker-5747684f86-wpv7c\" (UID: \"ac895386-b9be-47d2-bc3c-8c003b1eb1f3\") " pod="calico-system/whisker-5747684f86-wpv7c" Mar 13 00:34:36.261066 kubelet[2792]: I0313 00:34:36.260998 2792 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99vg\" (UniqueName: \"kubernetes.io/projected/ac895386-b9be-47d2-bc3c-8c003b1eb1f3-kube-api-access-k99vg\") pod 
\"whisker-5747684f86-wpv7c\" (UID: \"ac895386-b9be-47d2-bc3c-8c003b1eb1f3\") " pod="calico-system/whisker-5747684f86-wpv7c" Mar 13 00:34:36.517485 containerd[1620]: time="2026-03-13T00:34:36.517361007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5747684f86-wpv7c,Uid:ac895386-b9be-47d2-bc3c-8c003b1eb1f3,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:36.671817 systemd-networkd[1483]: caliaedd7618b3e: Link UP Mar 13 00:34:36.672149 systemd-networkd[1483]: caliaedd7618b3e: Gained carrier Mar 13 00:34:36.696183 containerd[1620]: 2026-03-13 00:34:36.549 [ERROR][3954] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 13 00:34:36.696183 containerd[1620]: 2026-03-13 00:34:36.565 [INFO][3954] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0 whisker-5747684f86- calico-system ac895386-b9be-47d2-bc3c-8c003b1eb1f3 884 0 2026-03-13 00:34:36 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5747684f86 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 whisker-5747684f86-wpv7c eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliaedd7618b3e [] [] }} ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-" Mar 13 00:34:36.696183 containerd[1620]: 2026-03-13 00:34:36.565 [INFO][3954] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" 
WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.696183 containerd[1620]: 2026-03-13 00:34:36.604 [INFO][3967] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" HandleID="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Workload="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.615 [INFO][3967] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" HandleID="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Workload="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277890), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"whisker-5747684f86-wpv7c", "timestamp":"2026-03-13 00:34:36.604743862 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001146e0)} Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.616 [INFO][3967] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.616 [INFO][3967] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.616 [INFO][3967] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.619 [INFO][3967] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.627 [INFO][3967] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.636 [INFO][3967] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.638 [INFO][3967] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696460 containerd[1620]: 2026-03-13 00:34:36.641 [INFO][3967] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.642 [INFO][3967] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.644 [INFO][3967] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01 Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.649 [INFO][3967] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.656 [INFO][3967] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.86.194/26] block=192.168.86.192/26 handle="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.656 [INFO][3967] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.194/26] handle="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.656 [INFO][3967] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:36.696691 containerd[1620]: 2026-03-13 00:34:36.656 [INFO][3967] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.194/26] IPv6=[] ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" HandleID="k8s-pod-network.2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Workload="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.696867 containerd[1620]: 2026-03-13 00:34:36.662 [INFO][3954] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0", GenerateName:"whisker-5747684f86-", Namespace:"calico-system", SelfLink:"", UID:"ac895386-b9be-47d2-bc3c-8c003b1eb1f3", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5747684f86", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"whisker-5747684f86-wpv7c", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.86.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaedd7618b3e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:36.696867 containerd[1620]: 2026-03-13 00:34:36.662 [INFO][3954] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.194/32] ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.696959 containerd[1620]: 2026-03-13 00:34:36.662 [INFO][3954] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaedd7618b3e ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.696959 containerd[1620]: 2026-03-13 00:34:36.673 [INFO][3954] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.697012 containerd[1620]: 2026-03-13 00:34:36.676 [INFO][3954] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0", GenerateName:"whisker-5747684f86-", Namespace:"calico-system", SelfLink:"", UID:"ac895386-b9be-47d2-bc3c-8c003b1eb1f3", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5747684f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01", Pod:"whisker-5747684f86-wpv7c", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.86.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaedd7618b3e", MAC:"7a:a9:9e:e2:e9:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:36.697077 containerd[1620]: 2026-03-13 00:34:36.689 [INFO][3954] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" Namespace="calico-system" Pod="whisker-5747684f86-wpv7c" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-whisker--5747684f86--wpv7c-eth0" Mar 13 00:34:36.761359 containerd[1620]: time="2026-03-13T00:34:36.761305349Z" level=info msg="connecting to shim 2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01" address="unix:///run/containerd/s/1bd3f64d040061ec419a31ba17861678ff2757f501b0ebae1d7797687ed58d51" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:36.828668 systemd[1]: Started cri-containerd-2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01.scope - libcontainer container 2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01. Mar 13 00:34:36.919992 kubelet[2792]: I0313 00:34:36.919954 2792 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="b882fb70-ad4a-4262-8db1-90f940d6b91e" path="/var/lib/kubelet/pods/b882fb70-ad4a-4262-8db1-90f940d6b91e/volumes" Mar 13 00:34:36.921387 containerd[1620]: time="2026-03-13T00:34:36.921343232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5747684f86-wpv7c,Uid:ac895386-b9be-47d2-bc3c-8c003b1eb1f3,Namespace:calico-system,Attempt:0,} returns sandbox id \"2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01\"" Mar 13 00:34:37.095953 kubelet[2792]: I0313 00:34:37.095913 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:37.438461 systemd-networkd[1483]: cali77490e6d45e: Gained IPv6LL Mar 13 00:34:38.196683 containerd[1620]: time="2026-03-13T00:34:38.196619935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:38.197727 containerd[1620]: time="2026-03-13T00:34:38.197568554Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 13 00:34:38.198507 
containerd[1620]: time="2026-03-13T00:34:38.198478023Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:38.200166 containerd[1620]: time="2026-03-13T00:34:38.200139630Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:38.200684 containerd[1620]: time="2026-03-13T00:34:38.200664190Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.949060448s" Mar 13 00:34:38.200746 containerd[1620]: time="2026-03-13T00:34:38.200735420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 13 00:34:38.202154 containerd[1620]: time="2026-03-13T00:34:38.201767358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 13 00:34:38.205314 containerd[1620]: time="2026-03-13T00:34:38.205275973Z" level=info msg="CreateContainer within sandbox \"7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 13 00:34:38.217610 containerd[1620]: time="2026-03-13T00:34:38.216438228Z" level=info msg="Container e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:38.221563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4168721333.mount: Deactivated successfully. 
Mar 13 00:34:38.233937 containerd[1620]: time="2026-03-13T00:34:38.233888024Z" level=info msg="CreateContainer within sandbox \"7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373\"" Mar 13 00:34:38.234587 containerd[1620]: time="2026-03-13T00:34:38.234527124Z" level=info msg="StartContainer for \"e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373\"" Mar 13 00:34:38.236507 containerd[1620]: time="2026-03-13T00:34:38.236440670Z" level=info msg="connecting to shim e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373" address="unix:///run/containerd/s/646e5b5b5104e8f08f01398cb23eb32c117f5db8f275e83a833fb699dc8c3b7c" protocol=ttrpc version=3 Mar 13 00:34:38.261315 systemd[1]: Started cri-containerd-e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373.scope - libcontainer container e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373. 
Mar 13 00:34:38.369902 containerd[1620]: time="2026-03-13T00:34:38.369803578Z" level=info msg="StartContainer for \"e1f6773bd7cca2b7f02ae6fde72b7561ec66a7f70616f4770a5fb5f0c16f4373\" returns successfully" Mar 13 00:34:38.654306 systemd-networkd[1483]: caliaedd7618b3e: Gained IPv6LL Mar 13 00:34:40.444861 containerd[1620]: time="2026-03-13T00:34:40.444768643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:40.446146 containerd[1620]: time="2026-03-13T00:34:40.445958812Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 13 00:34:40.446953 containerd[1620]: time="2026-03-13T00:34:40.446924441Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:40.449223 containerd[1620]: time="2026-03-13T00:34:40.449187899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:40.449979 containerd[1620]: time="2026-03-13T00:34:40.449952497Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 2.248160489s" Mar 13 00:34:40.450064 containerd[1620]: time="2026-03-13T00:34:40.450049727Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 13 00:34:40.451287 containerd[1620]: 
time="2026-03-13T00:34:40.451237906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 13 00:34:40.455561 containerd[1620]: time="2026-03-13T00:34:40.455515652Z" level=info msg="CreateContainer within sandbox \"2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 13 00:34:40.465026 containerd[1620]: time="2026-03-13T00:34:40.464973791Z" level=info msg="Container e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:40.474765 containerd[1620]: time="2026-03-13T00:34:40.474708950Z" level=info msg="CreateContainer within sandbox \"2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b\"" Mar 13 00:34:40.475710 containerd[1620]: time="2026-03-13T00:34:40.475673330Z" level=info msg="StartContainer for \"e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b\"" Mar 13 00:34:40.477081 containerd[1620]: time="2026-03-13T00:34:40.477036748Z" level=info msg="connecting to shim e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b" address="unix:///run/containerd/s/1bd3f64d040061ec419a31ba17861678ff2757f501b0ebae1d7797687ed58d51" protocol=ttrpc version=3 Mar 13 00:34:40.503440 systemd[1]: Started cri-containerd-e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b.scope - libcontainer container e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b. 
Mar 13 00:34:40.565129 containerd[1620]: time="2026-03-13T00:34:40.564797721Z" level=info msg="StartContainer for \"e9d8b0633ebcf74ff35b678bd697da7cd3f044819687a04aa0418ece9fdf631b\" returns successfully" Mar 13 00:34:42.087472 kubelet[2792]: I0313 00:34:42.086459 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:42.851429 kubelet[2792]: I0313 00:34:42.850918 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:42.876134 containerd[1620]: time="2026-03-13T00:34:42.874601550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:42.891861 containerd[1620]: time="2026-03-13T00:34:42.888123178Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 13 00:34:42.892212 containerd[1620]: time="2026-03-13T00:34:42.892014406Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:42.898127 containerd[1620]: time="2026-03-13T00:34:42.898055520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:42.899139 containerd[1620]: time="2026-03-13T00:34:42.899071880Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 2.447671663s" Mar 13 00:34:42.899231 
containerd[1620]: time="2026-03-13T00:34:42.899144520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 13 00:34:42.920750 containerd[1620]: time="2026-03-13T00:34:42.920680561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 13 00:34:42.929801 containerd[1620]: time="2026-03-13T00:34:42.929687413Z" level=info msg="CreateContainer within sandbox \"7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 13 00:34:42.964628 containerd[1620]: time="2026-03-13T00:34:42.964568912Z" level=info msg="Container 50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:42.968095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount126213266.mount: Deactivated successfully. 
Mar 13 00:34:42.981518 containerd[1620]: time="2026-03-13T00:34:42.981346068Z" level=info msg="CreateContainer within sandbox \"7f7e264d82f2386cee91f83a61b08621868a9198d286f2bef1e911ff7997444e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536\"" Mar 13 00:34:42.982737 containerd[1620]: time="2026-03-13T00:34:42.982457727Z" level=info msg="StartContainer for \"50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536\"" Mar 13 00:34:42.985174 containerd[1620]: time="2026-03-13T00:34:42.984969525Z" level=info msg="connecting to shim 50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536" address="unix:///run/containerd/s/646e5b5b5104e8f08f01398cb23eb32c117f5db8f275e83a833fb699dc8c3b7c" protocol=ttrpc version=3 Mar 13 00:34:43.022732 systemd[1]: Started cri-containerd-50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536.scope - libcontainer container 50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536. 
Mar 13 00:34:43.126432 containerd[1620]: time="2026-03-13T00:34:43.126279636Z" level=info msg="StartContainer for \"50e83d7b47d58938a4dc2da37d514a125caa391741b9a2856981d3d861628536\" returns successfully" Mar 13 00:34:43.284213 systemd-networkd[1483]: vxlan.calico: Link UP Mar 13 00:34:43.284227 systemd-networkd[1483]: vxlan.calico: Gained carrier Mar 13 00:34:43.994228 kubelet[2792]: I0313 00:34:43.994138 2792 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 13 00:34:43.994228 kubelet[2792]: I0313 00:34:43.994171 2792 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 13 00:34:44.147348 kubelet[2792]: I0313 00:34:44.146806 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-r7g7s" podStartSLOduration=20.48327666 podStartE2EDuration="27.146789223s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:36.251264043 +0000 UTC m=+37.490739336" lastFinishedPulling="2026-03-13 00:34:42.914776606 +0000 UTC m=+44.154251899" observedRunningTime="2026-03-13 00:34:44.146442084 +0000 UTC m=+45.385917377" watchObservedRunningTime="2026-03-13 00:34:44.146789223 +0000 UTC m=+45.386264516" Mar 13 00:34:44.734504 systemd-networkd[1483]: vxlan.calico: Gained IPv6LL Mar 13 00:34:44.788366 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3230247190.mount: Deactivated successfully. 
Mar 13 00:34:44.811431 containerd[1620]: time="2026-03-13T00:34:44.811282244Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:44.812470 containerd[1620]: time="2026-03-13T00:34:44.812417243Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 13 00:34:44.813343 containerd[1620]: time="2026-03-13T00:34:44.813259512Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:44.815176 containerd[1620]: time="2026-03-13T00:34:44.815152271Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:44.816046 containerd[1620]: time="2026-03-13T00:34:44.815514990Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 1.894772129s" Mar 13 00:34:44.816046 containerd[1620]: time="2026-03-13T00:34:44.815544500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 13 00:34:44.820430 containerd[1620]: time="2026-03-13T00:34:44.820388257Z" level=info msg="CreateContainer within sandbox \"2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 13 00:34:44.829899 
containerd[1620]: time="2026-03-13T00:34:44.829235762Z" level=info msg="Container f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:44.840534 containerd[1620]: time="2026-03-13T00:34:44.840481354Z" level=info msg="CreateContainer within sandbox \"2d46d4b422419c05953a162837934cc18a098ce4bbd538ecc4a59ae047438b01\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2\"" Mar 13 00:34:44.841543 containerd[1620]: time="2026-03-13T00:34:44.841478294Z" level=info msg="StartContainer for \"f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2\"" Mar 13 00:34:44.842686 containerd[1620]: time="2026-03-13T00:34:44.842653683Z" level=info msg="connecting to shim f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2" address="unix:///run/containerd/s/1bd3f64d040061ec419a31ba17861678ff2757f501b0ebae1d7797687ed58d51" protocol=ttrpc version=3 Mar 13 00:34:44.865473 systemd[1]: Started cri-containerd-f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2.scope - libcontainer container f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2. 
Mar 13 00:34:44.920666 containerd[1620]: time="2026-03-13T00:34:44.920602471Z" level=info msg="StartContainer for \"f99473e05f7927f9a4e84e9a6b623648d9e5c244a4f3de58b6c8f9fbda5ae3b2\" returns successfully" Mar 13 00:34:46.919994 containerd[1620]: time="2026-03-13T00:34:46.919931237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-zl7lf,Uid:d52b7e65-e88d-43f4-9c3d-b04fe7aef49f,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:47.046451 systemd-networkd[1483]: cali9e66dcb1464: Link UP Mar 13 00:34:47.047039 systemd-networkd[1483]: cali9e66dcb1464: Gained carrier Mar 13 00:34:47.067344 containerd[1620]: 2026-03-13 00:34:46.973 [INFO][4548] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0 goldmane-9f7667bb8- calico-system d52b7e65-e88d-43f4-9c3d-b04fe7aef49f 827 0 2026-03-13 00:34:17 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 goldmane-9f7667bb8-zl7lf eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali9e66dcb1464 [] [] }} ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-" Mar 13 00:34:47.067344 containerd[1620]: 2026-03-13 00:34:46.973 [INFO][4548] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.067344 containerd[1620]: 2026-03-13 00:34:47.004 [INFO][4560] ipam/ipam_plugin.go 235: 
Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" HandleID="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Workload="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.016 [INFO][4560] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" HandleID="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Workload="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002774e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"goldmane-9f7667bb8-zl7lf", "timestamp":"2026-03-13 00:34:47.004653957 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001142c0)} Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.016 [INFO][4560] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.017 [INFO][4560] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.017 [INFO][4560] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.019 [INFO][4560] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.023 [INFO][4560] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.027 [INFO][4560] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.028 [INFO][4560] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067538 containerd[1620]: 2026-03-13 00:34:47.030 [INFO][4560] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.030 [INFO][4560] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.032 [INFO][4560] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038 Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.035 [INFO][4560] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.040 [INFO][4560] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.86.195/26] block=192.168.86.192/26 handle="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.040 [INFO][4560] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.195/26] handle="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.040 [INFO][4560] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:47.067708 containerd[1620]: 2026-03-13 00:34:47.040 [INFO][4560] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.195/26] IPv6=[] ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" HandleID="k8s-pod-network.38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Workload="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.067819 containerd[1620]: 2026-03-13 00:34:47.042 [INFO][4548] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"d52b7e65-e88d-43f4-9c3d-b04fe7aef49f", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"goldmane-9f7667bb8-zl7lf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.86.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9e66dcb1464", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:47.067819 containerd[1620]: 2026-03-13 00:34:47.043 [INFO][4548] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.195/32] ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.067889 containerd[1620]: 2026-03-13 00:34:47.043 [INFO][4548] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e66dcb1464 ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.067889 containerd[1620]: 2026-03-13 00:34:47.046 [INFO][4548] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.067949 containerd[1620]: 2026-03-13 00:34:47.048 [INFO][4548] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"d52b7e65-e88d-43f4-9c3d-b04fe7aef49f", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038", Pod:"goldmane-9f7667bb8-zl7lf", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.86.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9e66dcb1464", MAC:"ee:b9:3d:da:5e:ac", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:47.068000 containerd[1620]: 2026-03-13 00:34:47.060 [INFO][4548] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" Namespace="calico-system" Pod="goldmane-9f7667bb8-zl7lf" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-goldmane--9f7667bb8--zl7lf-eth0" Mar 13 00:34:47.077791 kubelet[2792]: I0313 00:34:47.077539 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-5747684f86-wpv7c" podStartSLOduration=3.185972065 podStartE2EDuration="11.077524368s" podCreationTimestamp="2026-03-13 00:34:36 +0000 UTC" firstStartedPulling="2026-03-13 00:34:36.924726507 +0000 UTC m=+38.164201800" lastFinishedPulling="2026-03-13 00:34:44.81627881 +0000 UTC m=+46.055754103" observedRunningTime="2026-03-13 00:34:45.152004292 +0000 UTC m=+46.391479605" watchObservedRunningTime="2026-03-13 00:34:47.077524368 +0000 UTC m=+48.316999651" Mar 13 00:34:47.089380 containerd[1620]: time="2026-03-13T00:34:47.089333533Z" level=info msg="connecting to shim 38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038" address="unix:///run/containerd/s/99daa62cf2a7cf00c02ac97a7c9729e01c52708d3ec53a2d16ab4262641d4423" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:47.119610 systemd[1]: Started cri-containerd-38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038.scope - libcontainer container 38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038. 
Mar 13 00:34:47.181735 containerd[1620]: time="2026-03-13T00:34:47.181638746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-zl7lf,Uid:d52b7e65-e88d-43f4-9c3d-b04fe7aef49f,Namespace:calico-system,Attempt:0,} returns sandbox id \"38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038\"" Mar 13 00:34:47.184332 containerd[1620]: time="2026-03-13T00:34:47.184288455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 13 00:34:47.917670 containerd[1620]: time="2026-03-13T00:34:47.917539043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-nmbnx,Uid:988277de-d582-401c-833d-eea6558241b9,Namespace:kube-system,Attempt:0,}" Mar 13 00:34:48.072070 systemd-networkd[1483]: cali4c34978564e: Link UP Mar 13 00:34:48.074144 systemd-networkd[1483]: cali4c34978564e: Gained carrier Mar 13 00:34:48.097861 containerd[1620]: 2026-03-13 00:34:47.973 [INFO][4636] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0 coredns-7d764666f9- kube-system 988277de-d582-401c-833d-eea6558241b9 817 0 2026-03-13 00:34:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 coredns-7d764666f9-nmbnx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4c34978564e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-" Mar 13 00:34:48.097861 containerd[1620]: 2026-03-13 00:34:47.974 [INFO][4636] cni-plugin/k8s.go 74: Extracted identifiers 
for CmdAddK8s ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.097861 containerd[1620]: 2026-03-13 00:34:48.009 [INFO][4648] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" HandleID="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Workload="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.017 [INFO][4648] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" HandleID="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Workload="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277290), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"coredns-7d764666f9-nmbnx", "timestamp":"2026-03-13 00:34:48.009576457 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00026f080)} Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.017 [INFO][4648] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.017 [INFO][4648] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.017 [INFO][4648] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.020 [INFO][4648] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.031 [INFO][4648] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.038 [INFO][4648] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.040 [INFO][4648] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099364 containerd[1620]: 2026-03-13 00:34:48.045 [INFO][4648] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.045 [INFO][4648] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.048 [INFO][4648] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017 Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.055 [INFO][4648] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.062 [INFO][4648] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.86.196/26] block=192.168.86.192/26 handle="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.063 [INFO][4648] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.196/26] handle="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.063 [INFO][4648] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:48.099621 containerd[1620]: 2026-03-13 00:34:48.063 [INFO][4648] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.196/26] IPv6=[] ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" HandleID="k8s-pod-network.a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Workload="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.099867 containerd[1620]: 2026-03-13 00:34:48.066 [INFO][4636] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"988277de-d582-401c-833d-eea6558241b9", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"coredns-7d764666f9-nmbnx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4c34978564e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:48.099867 containerd[1620]: 2026-03-13 00:34:48.066 [INFO][4636] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.196/32] ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.099867 containerd[1620]: 2026-03-13 00:34:48.066 [INFO][4636] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4c34978564e 
ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.099867 containerd[1620]: 2026-03-13 00:34:48.072 [INFO][4636] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.099867 containerd[1620]: 2026-03-13 00:34:48.075 [INFO][4636] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"988277de-d582-401c-833d-eea6558241b9", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", 
ContainerID:"a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017", Pod:"coredns-7d764666f9-nmbnx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4c34978564e", MAC:"c6:0e:c8:b5:49:69", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:48.100219 containerd[1620]: 2026-03-13 00:34:48.090 [INFO][4636] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" Namespace="kube-system" Pod="coredns-7d764666f9-nmbnx" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--nmbnx-eth0" Mar 13 00:34:48.141398 containerd[1620]: time="2026-03-13T00:34:48.141316105Z" level=info msg="connecting to shim a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017" address="unix:///run/containerd/s/dc0fee1f577b37472f5dfc105abf1cde0cea4d5ec4f1ea1c6cb79f9bf309c301" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:48.191771 systemd[1]: Started 
cri-containerd-a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017.scope - libcontainer container a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017. Mar 13 00:34:48.277043 containerd[1620]: time="2026-03-13T00:34:48.276969751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-nmbnx,Uid:988277de-d582-401c-833d-eea6558241b9,Namespace:kube-system,Attempt:0,} returns sandbox id \"a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017\"" Mar 13 00:34:48.283470 containerd[1620]: time="2026-03-13T00:34:48.283236489Z" level=info msg="CreateContainer within sandbox \"a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:34:48.299293 containerd[1620]: time="2026-03-13T00:34:48.299247013Z" level=info msg="Container d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:48.305248 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1674055390.mount: Deactivated successfully. 
Mar 13 00:34:48.309404 containerd[1620]: time="2026-03-13T00:34:48.309316481Z" level=info msg="CreateContainer within sandbox \"a96e561dc324aa599639da552d402ccf7a1b516cb5ba3c091c5056a4e4f04017\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a\"" Mar 13 00:34:48.311490 containerd[1620]: time="2026-03-13T00:34:48.310701110Z" level=info msg="StartContainer for \"d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a\"" Mar 13 00:34:48.312183 containerd[1620]: time="2026-03-13T00:34:48.312078409Z" level=info msg="connecting to shim d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a" address="unix:///run/containerd/s/dc0fee1f577b37472f5dfc105abf1cde0cea4d5ec4f1ea1c6cb79f9bf309c301" protocol=ttrpc version=3 Mar 13 00:34:48.345412 systemd[1]: Started cri-containerd-d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a.scope - libcontainer container d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a. 
Mar 13 00:34:48.383176 containerd[1620]: time="2026-03-13T00:34:48.383133956Z" level=info msg="StartContainer for \"d7017dfd2d738b44d4fb75f295499e94c49fa8feb18096cd5097fc1c0970684a\" returns successfully" Mar 13 00:34:48.702541 systemd-networkd[1483]: cali9e66dcb1464: Gained IPv6LL Mar 13 00:34:48.947176 containerd[1620]: time="2026-03-13T00:34:48.947061185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f89cbb487-v9qld,Uid:f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:48.948616 containerd[1620]: time="2026-03-13T00:34:48.948428185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rts4t,Uid:831fd1c0-4b12-4ba4-b935-44636352b176,Namespace:kube-system,Attempt:0,}" Mar 13 00:34:49.197012 systemd-networkd[1483]: cali9ba6ce1034e: Link UP Mar 13 00:34:49.218407 systemd-networkd[1483]: cali9ba6ce1034e: Gained carrier Mar 13 00:34:49.258457 kubelet[2792]: I0313 00:34:49.258398 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-nmbnx" podStartSLOduration=44.258212003 podStartE2EDuration="44.258212003s" podCreationTimestamp="2026-03-13 00:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:49.235024549 +0000 UTC m=+50.474499842" watchObservedRunningTime="2026-03-13 00:34:49.258212003 +0000 UTC m=+50.497687296" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.044 [INFO][4781] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0 coredns-7d764666f9- kube-system 831fd1c0-4b12-4ba4-b935-44636352b176 826 0 2026-03-13 00:34:05 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 coredns-7d764666f9-rts4t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9ba6ce1034e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.047 [INFO][4781] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.094 [INFO][4797] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" HandleID="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Workload="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.111 [INFO][4797] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" HandleID="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Workload="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f7c90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"coredns-7d764666f9-rts4t", "timestamp":"2026-03-13 00:34:49.094736924 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000271760)} Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.111 [INFO][4797] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.111 [INFO][4797] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.111 [INFO][4797] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.116 [INFO][4797] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.126 [INFO][4797] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.135 [INFO][4797] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.139 [INFO][4797] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.143 [INFO][4797] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.143 [INFO][4797] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.146 [INFO][4797] ipam/ipam.go 1806: Creating new 
handle: k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335 Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.155 [INFO][4797] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.168 [INFO][4797] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.86.197/26] block=192.168.86.192/26 handle="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.169 [INFO][4797] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.197/26] handle="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.170 [INFO][4797] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 13 00:34:49.276793 containerd[1620]: 2026-03-13 00:34:49.171 [INFO][4797] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.197/26] IPv6=[] ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" HandleID="k8s-pod-network.6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Workload="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 00:34:49.281744 containerd[1620]: 2026-03-13 00:34:49.178 [INFO][4781] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"831fd1c0-4b12-4ba4-b935-44636352b176", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"coredns-7d764666f9-rts4t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali9ba6ce1034e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.281744 containerd[1620]: 2026-03-13 00:34:49.179 [INFO][4781] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.197/32] ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 00:34:49.281744 containerd[1620]: 2026-03-13 00:34:49.179 [INFO][4781] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9ba6ce1034e ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 00:34:49.281744 containerd[1620]: 2026-03-13 00:34:49.219 [INFO][4781] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 
00:34:49.281744 containerd[1620]: 2026-03-13 00:34:49.223 [INFO][4781] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"831fd1c0-4b12-4ba4-b935-44636352b176", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335", Pod:"coredns-7d764666f9-rts4t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.86.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9ba6ce1034e", MAC:"7a:25:73:a2:c5:1e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, 
Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.284649 containerd[1620]: 2026-03-13 00:34:49.262 [INFO][4781] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" Namespace="kube-system" Pod="coredns-7d764666f9-rts4t" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-coredns--7d764666f9--rts4t-eth0" Mar 13 00:34:49.350666 containerd[1620]: time="2026-03-13T00:34:49.348809321Z" level=info msg="connecting to shim 6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335" address="unix:///run/containerd/s/5ba4407c2fbf06a052db98d41771c1bf508f908e5182ced38a3d920b619ef94e" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:49.451145 systemd[1]: Started cri-containerd-6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335.scope - libcontainer container 6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335. 
Mar 13 00:34:49.458376 systemd-networkd[1483]: calic3c793bd544: Link UP Mar 13 00:34:49.460807 systemd-networkd[1483]: calic3c793bd544: Gained carrier Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.069 [INFO][4772] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0 calico-kube-controllers-7f89cbb487- calico-system f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628 821 0 2026-03-13 00:34:17 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7f89cbb487 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 calico-kube-controllers-7f89cbb487-v9qld eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic3c793bd544 [] [] }} ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.069 [INFO][4772] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.188 [INFO][4803] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" HandleID="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" 
Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.239 [INFO][4803] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" HandleID="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000386370), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"calico-kube-controllers-7f89cbb487-v9qld", "timestamp":"2026-03-13 00:34:49.18880417 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00053d8c0)} Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.239 [INFO][4803] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.240 [INFO][4803] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.240 [INFO][4803] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.264 [INFO][4803] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.276 [INFO][4803] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.309 [INFO][4803] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.342 [INFO][4803] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.359 [INFO][4803] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.359 [INFO][4803] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.364 [INFO][4803] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600 Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.379 [INFO][4803] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.406 [INFO][4803] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.86.198/26] block=192.168.86.192/26 handle="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.406 [INFO][4803] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.198/26] handle="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.406 [INFO][4803] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:49.498869 containerd[1620]: 2026-03-13 00:34:49.406 [INFO][4803] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.198/26] IPv6=[] ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" HandleID="k8s-pod-network.244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.499985 containerd[1620]: 2026-03-13 00:34:49.426 [INFO][4772] cni-plugin/k8s.go 418: Populated endpoint ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0", GenerateName:"calico-kube-controllers-7f89cbb487-", Namespace:"calico-system", SelfLink:"", UID:"f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f89cbb487", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"calico-kube-controllers-7f89cbb487-v9qld", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.86.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3c793bd544", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.499985 containerd[1620]: 2026-03-13 00:34:49.426 [INFO][4772] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.198/32] ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.499985 containerd[1620]: 2026-03-13 00:34:49.426 [INFO][4772] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3c793bd544 ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.499985 containerd[1620]: 2026-03-13 00:34:49.463 [INFO][4772] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.499985 containerd[1620]: 2026-03-13 00:34:49.465 [INFO][4772] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0", GenerateName:"calico-kube-controllers-7f89cbb487-", Namespace:"calico-system", SelfLink:"", UID:"f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f89cbb487", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600", Pod:"calico-kube-controllers-7f89cbb487-v9qld", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.86.198/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3c793bd544", MAC:"fa:0c:5d:84:b7:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.499985 containerd[1620]: 2026-03-13 00:34:49.492 [INFO][4772] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" Namespace="calico-system" Pod="calico-kube-controllers-7f89cbb487-v9qld" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--kube--controllers--7f89cbb487--v9qld-eth0" Mar 13 00:34:49.543245 containerd[1620]: time="2026-03-13T00:34:49.542438753Z" level=info msg="connecting to shim 244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600" address="unix:///run/containerd/s/6629ea8eef67b58e38668df73ec0462b7ef1f8603602ca6105ee8a9d668eb626" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:49.592128 containerd[1620]: time="2026-03-13T00:34:49.592068480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rts4t,Uid:831fd1c0-4b12-4ba4-b935-44636352b176,Namespace:kube-system,Attempt:0,} returns sandbox id \"6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335\"" Mar 13 00:34:49.606561 containerd[1620]: time="2026-03-13T00:34:49.605279056Z" level=info msg="CreateContainer within sandbox \"6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:34:49.619762 containerd[1620]: time="2026-03-13T00:34:49.619204263Z" level=info msg="Container 8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:49.621272 systemd[1]: Started cri-containerd-244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600.scope - libcontainer container 
244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600. Mar 13 00:34:49.646810 containerd[1620]: time="2026-03-13T00:34:49.646767497Z" level=info msg="CreateContainer within sandbox \"6570dd9b3077f80dd4c2667759f311a47342670b29f9f0f729a8d5e27f5b9335\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8\"" Mar 13 00:34:49.649962 containerd[1620]: time="2026-03-13T00:34:49.649923316Z" level=info msg="StartContainer for \"8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8\"" Mar 13 00:34:49.653438 containerd[1620]: time="2026-03-13T00:34:49.653393584Z" level=info msg="connecting to shim 8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8" address="unix:///run/containerd/s/5ba4407c2fbf06a052db98d41771c1bf508f908e5182ced38a3d920b619ef94e" protocol=ttrpc version=3 Mar 13 00:34:49.686682 systemd[1]: Started cri-containerd-8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8.scope - libcontainer container 8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8. 
Mar 13 00:34:49.762133 containerd[1620]: time="2026-03-13T00:34:49.761732818Z" level=info msg="StartContainer for \"8c724b793bb13d752769f7cd5d5db1ec6e45a66308856c5cb3501e717133d3f8\" returns successfully" Mar 13 00:34:49.790393 systemd-networkd[1483]: cali4c34978564e: Gained IPv6LL Mar 13 00:34:49.793355 containerd[1620]: time="2026-03-13T00:34:49.793246430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f89cbb487-v9qld,Uid:f89ecdfa-6054-4fb2-b6f9-3c1b8fdaf628,Namespace:calico-system,Attempt:0,} returns sandbox id \"244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600\"" Mar 13 00:34:49.919647 containerd[1620]: time="2026-03-13T00:34:49.919359018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-cw5r5,Uid:7dbc3d6d-6370-496b-9a6f-d93cadf0acb6,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:49.919803 containerd[1620]: time="2026-03-13T00:34:49.919774788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-7vd8k,Uid:37779b7e-6012-4701-98cf-613605a31477,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:50.125923 systemd-networkd[1483]: cali825e0ebbe52: Link UP Mar 13 00:34:50.126761 systemd-networkd[1483]: cali825e0ebbe52: Gained carrier Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:49.992 [INFO][4984] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0 calico-apiserver-57869c95c- calico-system 7dbc3d6d-6370-496b-9a6f-d93cadf0acb6 825 0 2026-03-13 00:34:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57869c95c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 calico-apiserver-57869c95c-cw5r5 eth0 calico-apiserver [] [] 
[kns.calico-system ksa.calico-system.calico-apiserver] cali825e0ebbe52 [] [] }} ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:49.992 [INFO][4984] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.042 [INFO][5010] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" HandleID="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.053 [INFO][5010] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" HandleID="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fddd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"calico-apiserver-57869c95c-cw5r5", "timestamp":"2026-03-13 00:34:50.042428791 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", 
Namespace:(*v1.Namespace)(0xc000379b80)} Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.053 [INFO][5010] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.053 [INFO][5010] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.054 [INFO][5010] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.058 [INFO][5010] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.066 [INFO][5010] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.073 [INFO][5010] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.076 [INFO][5010] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.083 [INFO][5010] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.086 [INFO][5010] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.092 [INFO][5010] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.100 
[INFO][5010] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.113 [INFO][5010] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.86.199/26] block=192.168.86.192/26 handle="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.115 [INFO][5010] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.199/26] handle="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.115 [INFO][5010] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:50.180265 containerd[1620]: 2026-03-13 00:34:50.115 [INFO][5010] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.199/26] IPv6=[] ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" HandleID="k8s-pod-network.943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 00:34:50.180793 containerd[1620]: 2026-03-13 00:34:50.120 [INFO][4984] cni-plugin/k8s.go 418: Populated endpoint ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0", GenerateName:"calico-apiserver-57869c95c-", Namespace:"calico-system", SelfLink:"", 
UID:"7dbc3d6d-6370-496b-9a6f-d93cadf0acb6", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57869c95c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"calico-apiserver-57869c95c-cw5r5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali825e0ebbe52", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:50.180793 containerd[1620]: 2026-03-13 00:34:50.121 [INFO][4984] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.199/32] ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 00:34:50.180793 containerd[1620]: 2026-03-13 00:34:50.121 [INFO][4984] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali825e0ebbe52 ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 
00:34:50.180793 containerd[1620]: 2026-03-13 00:34:50.125 [INFO][4984] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 00:34:50.180793 containerd[1620]: 2026-03-13 00:34:50.126 [INFO][4984] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0", GenerateName:"calico-apiserver-57869c95c-", Namespace:"calico-system", SelfLink:"", UID:"7dbc3d6d-6370-496b-9a6f-d93cadf0acb6", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57869c95c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab", Pod:"calico-apiserver-57869c95c-cw5r5", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali825e0ebbe52", MAC:"42:8f:0d:5c:56:a2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:50.180793 containerd[1620]: 2026-03-13 00:34:50.147 [INFO][4984] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" Namespace="calico-system" Pod="calico-apiserver-57869c95c-cw5r5" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--cw5r5-eth0" Mar 13 00:34:50.244895 kubelet[2792]: I0313 00:34:50.244792 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-rts4t" podStartSLOduration=45.244774364 podStartE2EDuration="45.244774364s" podCreationTimestamp="2026-03-13 00:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:50.215516389 +0000 UTC m=+51.454991772" watchObservedRunningTime="2026-03-13 00:34:50.244774364 +0000 UTC m=+51.484249657" Mar 13 00:34:50.262482 containerd[1620]: time="2026-03-13T00:34:50.261944520Z" level=info msg="connecting to shim 943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab" address="unix:///run/containerd/s/8cf39b693cf5ed181953530ccb6c6b75fa043748ee4d5e121e797120808d6f53" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:50.292861 systemd-networkd[1483]: cali50b6ba949b1: Link UP Mar 13 00:34:50.298663 systemd-networkd[1483]: cali50b6ba949b1: Gained carrier Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.032 [INFO][4985] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0 calico-apiserver-57869c95c- calico-system 37779b7e-6012-4701-98cf-613605a31477 828 0 2026-03-13 00:34:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57869c95c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-7393fd8643 calico-apiserver-57869c95c-7vd8k eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali50b6ba949b1 [] [] }} ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.032 [INFO][4985] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.102 [INFO][5018] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" HandleID="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.114 [INFO][5018] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" HandleID="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" 
Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef880), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-7393fd8643", "pod":"calico-apiserver-57869c95c-7vd8k", "timestamp":"2026-03-13 00:34:50.102286129 +0000 UTC"}, Hostname:"ci-4459-2-4-n-7393fd8643", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000400dc0)} Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.115 [INFO][5018] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.115 [INFO][5018] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.116 [INFO][5018] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-7393fd8643' Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.161 [INFO][5018] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.183 [INFO][5018] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.193 [INFO][5018] ipam/ipam.go 526: Trying affinity for 192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.201 [INFO][5018] ipam/ipam.go 160: Attempting to load block cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.207 [INFO][5018] ipam/ipam.go 237: Affinity is confirmed and block has been loaded 
cidr=192.168.86.192/26 host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.207 [INFO][5018] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.86.192/26 handle="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.216 [INFO][5018] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49 Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.229 [INFO][5018] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.86.192/26 handle="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.257 [INFO][5018] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.86.200/26] block=192.168.86.192/26 handle="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.257 [INFO][5018] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.86.200/26] handle="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" host="ci-4459-2-4-n-7393fd8643" Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.257 [INFO][5018] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 13 00:34:50.340967 containerd[1620]: 2026-03-13 00:34:50.257 [INFO][5018] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.86.200/26] IPv6=[] ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" HandleID="k8s-pod-network.bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Workload="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.341769 containerd[1620]: 2026-03-13 00:34:50.272 [INFO][4985] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0", GenerateName:"calico-apiserver-57869c95c-", Namespace:"calico-system", SelfLink:"", UID:"37779b7e-6012-4701-98cf-613605a31477", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57869c95c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"", Pod:"calico-apiserver-57869c95c-7vd8k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.86.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali50b6ba949b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:50.341769 containerd[1620]: 2026-03-13 00:34:50.274 [INFO][4985] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.86.200/32] ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.341769 containerd[1620]: 2026-03-13 00:34:50.274 [INFO][4985] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali50b6ba949b1 ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.341769 containerd[1620]: 2026-03-13 00:34:50.299 [INFO][4985] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.341769 containerd[1620]: 2026-03-13 00:34:50.300 [INFO][4985] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0", GenerateName:"calico-apiserver-57869c95c-", Namespace:"calico-system", SelfLink:"", UID:"37779b7e-6012-4701-98cf-613605a31477", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57869c95c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-7393fd8643", ContainerID:"bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49", Pod:"calico-apiserver-57869c95c-7vd8k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.86.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali50b6ba949b1", MAC:"32:d6:fa:6b:01:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:50.341769 containerd[1620]: 2026-03-13 00:34:50.320 [INFO][4985] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" Namespace="calico-system" Pod="calico-apiserver-57869c95c-7vd8k" WorkloadEndpoint="ci--4459--2--4--n--7393fd8643-k8s-calico--apiserver--57869c95c--7vd8k-eth0" Mar 13 00:34:50.343378 systemd[1]: Started 
cri-containerd-943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab.scope - libcontainer container 943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab. Mar 13 00:34:50.421890 containerd[1620]: time="2026-03-13T00:34:50.421806352Z" level=info msg="connecting to shim bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49" address="unix:///run/containerd/s/1a94f77405be22b1942dafe106edbbef4c57509ef507abfbc105093bcc7d070d" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:50.484418 systemd[1]: Started cri-containerd-bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49.scope - libcontainer container bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49. Mar 13 00:34:50.510572 containerd[1620]: time="2026-03-13T00:34:50.510532545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-cw5r5,Uid:7dbc3d6d-6370-496b-9a6f-d93cadf0acb6,Namespace:calico-system,Attempt:0,} returns sandbox id \"943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab\"" Mar 13 00:34:50.600744 containerd[1620]: time="2026-03-13T00:34:50.600698059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57869c95c-7vd8k,Uid:37779b7e-6012-4701-98cf-613605a31477,Namespace:calico-system,Attempt:0,} returns sandbox id \"bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49\"" Mar 13 00:34:50.933301 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount725613369.mount: Deactivated successfully. 
Mar 13 00:34:51.070622 systemd-networkd[1483]: cali9ba6ce1034e: Gained IPv6LL Mar 13 00:34:51.220621 containerd[1620]: time="2026-03-13T00:34:51.220429150Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:51.221959 containerd[1620]: time="2026-03-13T00:34:51.221799000Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 13 00:34:51.223700 containerd[1620]: time="2026-03-13T00:34:51.223622680Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:51.226844 containerd[1620]: time="2026-03-13T00:34:51.226760389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:51.227901 containerd[1620]: time="2026-03-13T00:34:51.227649940Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 4.043316954s" Mar 13 00:34:51.227901 containerd[1620]: time="2026-03-13T00:34:51.227713900Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 13 00:34:51.229677 containerd[1620]: time="2026-03-13T00:34:51.229624140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 13 00:34:51.235136 containerd[1620]: time="2026-03-13T00:34:51.234451569Z" level=info 
msg="CreateContainer within sandbox \"38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 13 00:34:51.246181 containerd[1620]: time="2026-03-13T00:34:51.244308618Z" level=info msg="Container 1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:51.263188 containerd[1620]: time="2026-03-13T00:34:51.263094905Z" level=info msg="CreateContainer within sandbox \"38b8689fabc5389a706009ddf4e3de456299a6ad1b60431d52f6c79b6e1ce038\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65\"" Mar 13 00:34:51.266522 containerd[1620]: time="2026-03-13T00:34:51.266460395Z" level=info msg="StartContainer for \"1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65\"" Mar 13 00:34:51.268369 containerd[1620]: time="2026-03-13T00:34:51.268240045Z" level=info msg="connecting to shim 1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65" address="unix:///run/containerd/s/99daa62cf2a7cf00c02ac97a7c9729e01c52708d3ec53a2d16ab4262641d4423" protocol=ttrpc version=3 Mar 13 00:34:51.306445 systemd[1]: Started cri-containerd-1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65.scope - libcontainer container 1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65. 
Mar 13 00:34:51.326428 systemd-networkd[1483]: calic3c793bd544: Gained IPv6LL Mar 13 00:34:51.328413 systemd-networkd[1483]: cali825e0ebbe52: Gained IPv6LL Mar 13 00:34:51.382086 containerd[1620]: time="2026-03-13T00:34:51.382034621Z" level=info msg="StartContainer for \"1b849bd1a723aae580c5010a957b8047f9812a534855d95fc794c4e76d728a65\" returns successfully" Mar 13 00:34:51.966403 systemd-networkd[1483]: cali50b6ba949b1: Gained IPv6LL Mar 13 00:34:55.054515 containerd[1620]: time="2026-03-13T00:34:55.054448820Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:55.056095 containerd[1620]: time="2026-03-13T00:34:55.055917300Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 13 00:34:55.057133 containerd[1620]: time="2026-03-13T00:34:55.057093000Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:55.060220 containerd[1620]: time="2026-03-13T00:34:55.060162291Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:55.061147 containerd[1620]: time="2026-03-13T00:34:55.061124090Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 3.8314292s" Mar 13 00:34:55.061344 containerd[1620]: time="2026-03-13T00:34:55.061237651Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 13 00:34:55.062627 containerd[1620]: time="2026-03-13T00:34:55.062599761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:34:55.084457 containerd[1620]: time="2026-03-13T00:34:55.084372143Z" level=info msg="CreateContainer within sandbox \"244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 13 00:34:55.092885 containerd[1620]: time="2026-03-13T00:34:55.092830994Z" level=info msg="Container 2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:55.105858 containerd[1620]: time="2026-03-13T00:34:55.105799655Z" level=info msg="CreateContainer within sandbox \"244a9dead2537679c018baea1cf453c4580ee1f31a2315e4cc3b0bf3dc46f600\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7\"" Mar 13 00:34:55.106536 containerd[1620]: time="2026-03-13T00:34:55.106504105Z" level=info msg="StartContainer for \"2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7\"" Mar 13 00:34:55.108013 containerd[1620]: time="2026-03-13T00:34:55.107987775Z" level=info msg="connecting to shim 2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7" address="unix:///run/containerd/s/6629ea8eef67b58e38668df73ec0462b7ef1f8603602ca6105ee8a9d668eb626" protocol=ttrpc version=3 Mar 13 00:34:55.139321 systemd[1]: Started cri-containerd-2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7.scope - libcontainer container 2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7. 
Mar 13 00:34:55.197581 containerd[1620]: time="2026-03-13T00:34:55.197488664Z" level=info msg="StartContainer for \"2792d0c61cc109789f5df52f2b76c7cb581f574ad068e46f8a632cad8e49c8b7\" returns successfully" Mar 13 00:34:55.232755 kubelet[2792]: I0313 00:34:55.232628 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-zl7lf" podStartSLOduration=34.187344213 podStartE2EDuration="38.232608397s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:47.183945395 +0000 UTC m=+48.423420678" lastFinishedPulling="2026-03-13 00:34:51.229209569 +0000 UTC m=+52.468684862" observedRunningTime="2026-03-13 00:34:52.215089684 +0000 UTC m=+53.454565027" watchObservedRunningTime="2026-03-13 00:34:55.232608397 +0000 UTC m=+56.472083690" Mar 13 00:34:56.287300 kubelet[2792]: I0313 00:34:56.287076 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7f89cbb487-v9qld" podStartSLOduration=34.021385074 podStartE2EDuration="39.287062235s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:49.796554159 +0000 UTC m=+51.036029462" lastFinishedPulling="2026-03-13 00:34:55.06223133 +0000 UTC m=+56.301706623" observedRunningTime="2026-03-13 00:34:55.236471218 +0000 UTC m=+56.475946521" watchObservedRunningTime="2026-03-13 00:34:56.287062235 +0000 UTC m=+57.526537528" Mar 13 00:34:57.309842 containerd[1620]: time="2026-03-13T00:34:57.309792625Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.311061 containerd[1620]: time="2026-03-13T00:34:57.311036005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 13 00:34:57.312208 containerd[1620]: time="2026-03-13T00:34:57.312165886Z" level=info msg="ImageCreate event 
name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.314299 containerd[1620]: time="2026-03-13T00:34:57.314258375Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.314816 containerd[1620]: time="2026-03-13T00:34:57.314732866Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 2.251981085s" Mar 13 00:34:57.314816 containerd[1620]: time="2026-03-13T00:34:57.314763596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:34:57.316507 containerd[1620]: time="2026-03-13T00:34:57.316480356Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:34:57.318887 containerd[1620]: time="2026-03-13T00:34:57.318832546Z" level=info msg="CreateContainer within sandbox \"943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:34:57.329607 containerd[1620]: time="2026-03-13T00:34:57.328686568Z" level=info msg="Container e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:57.340327 containerd[1620]: time="2026-03-13T00:34:57.340282401Z" level=info msg="CreateContainer within sandbox \"943023fbf7e6a096ff12c7241f498d1454e0ae2e8df2b8e5e7ee655ce8645cab\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856\"" Mar 13 00:34:57.345236 containerd[1620]: time="2026-03-13T00:34:57.345191981Z" level=info msg="StartContainer for \"e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856\"" Mar 13 00:34:57.347463 containerd[1620]: time="2026-03-13T00:34:57.347429402Z" level=info msg="connecting to shim e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856" address="unix:///run/containerd/s/8cf39b693cf5ed181953530ccb6c6b75fa043748ee4d5e121e797120808d6f53" protocol=ttrpc version=3 Mar 13 00:34:57.375305 systemd[1]: Started cri-containerd-e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856.scope - libcontainer container e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856. Mar 13 00:34:57.426713 containerd[1620]: time="2026-03-13T00:34:57.426631697Z" level=info msg="StartContainer for \"e2866d3c5820f39edcf03bc13a1bb4911e0f890d755ad176e33a9545b98fc856\" returns successfully" Mar 13 00:34:57.801143 containerd[1620]: time="2026-03-13T00:34:57.800903967Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.802929 containerd[1620]: time="2026-03-13T00:34:57.802876857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 13 00:34:57.807324 containerd[1620]: time="2026-03-13T00:34:57.807195558Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 490.682472ms" Mar 13 00:34:57.807324 containerd[1620]: 
time="2026-03-13T00:34:57.807269188Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:34:57.815652 containerd[1620]: time="2026-03-13T00:34:57.815056490Z" level=info msg="CreateContainer within sandbox \"bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:34:57.831932 containerd[1620]: time="2026-03-13T00:34:57.831873133Z" level=info msg="Container 191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:57.847990 containerd[1620]: time="2026-03-13T00:34:57.847919646Z" level=info msg="CreateContainer within sandbox \"bdf3d578d60fbf5957c53c54d4f394acbc35946f40afbda4bebeedfdf30d2d49\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d\"" Mar 13 00:34:57.849904 containerd[1620]: time="2026-03-13T00:34:57.849852996Z" level=info msg="StartContainer for \"191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d\"" Mar 13 00:34:57.851682 containerd[1620]: time="2026-03-13T00:34:57.851604936Z" level=info msg="connecting to shim 191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d" address="unix:///run/containerd/s/1a94f77405be22b1942dafe106edbbef4c57509ef507abfbc105093bcc7d070d" protocol=ttrpc version=3 Mar 13 00:34:57.876365 systemd[1]: Started cri-containerd-191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d.scope - libcontainer container 191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d. 
Mar 13 00:34:57.954160 containerd[1620]: time="2026-03-13T00:34:57.954007355Z" level=info msg="StartContainer for \"191cf13f94a086a1d459b70c2aeffe00e67ef0c51a7e7e8988aa01789ef3831d\" returns successfully" Mar 13 00:34:58.250284 kubelet[2792]: I0313 00:34:58.250227 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-57869c95c-7vd8k" podStartSLOduration=35.044914992 podStartE2EDuration="42.250215351s" podCreationTimestamp="2026-03-13 00:34:16 +0000 UTC" firstStartedPulling="2026-03-13 00:34:50.603444409 +0000 UTC m=+51.842919702" lastFinishedPulling="2026-03-13 00:34:57.808744768 +0000 UTC m=+59.048220061" observedRunningTime="2026-03-13 00:34:58.250164781 +0000 UTC m=+59.489640074" watchObservedRunningTime="2026-03-13 00:34:58.250215351 +0000 UTC m=+59.489690644" Mar 13 00:34:58.250783 kubelet[2792]: I0313 00:34:58.250299 2792 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-57869c95c-cw5r5" podStartSLOduration=35.456611958 podStartE2EDuration="42.250296281s" podCreationTimestamp="2026-03-13 00:34:16 +0000 UTC" firstStartedPulling="2026-03-13 00:34:50.521929243 +0000 UTC m=+51.761404536" lastFinishedPulling="2026-03-13 00:34:57.315613566 +0000 UTC m=+58.555088859" observedRunningTime="2026-03-13 00:34:58.237714088 +0000 UTC m=+59.477189381" watchObservedRunningTime="2026-03-13 00:34:58.250296281 +0000 UTC m=+59.489771574" Mar 13 00:34:59.231967 kubelet[2792]: I0313 00:34:59.231912 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:59.232461 kubelet[2792]: I0313 00:34:59.232433 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:35:01.754931 kubelet[2792]: I0313 00:35:01.754674 2792 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:35:29.305619 kubelet[2792]: I0313 00:35:29.305287 2792 prober_manager.go:356] "Failed to trigger a manual run" 
probe="Readiness" Mar 13 00:35:32.240857 systemd[1]: Started sshd@7-157.180.95.181:22-4.153.228.146:42106.service - OpenSSH per-connection server daemon (4.153.228.146:42106). Mar 13 00:35:32.890836 sshd[5598]: Accepted publickey for core from 4.153.228.146 port 42106 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:32.896329 sshd-session[5598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:32.911610 systemd-logind[1591]: New session 8 of user core. Mar 13 00:35:32.919632 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 13 00:35:33.343219 sshd[5601]: Connection closed by 4.153.228.146 port 42106 Mar 13 00:35:33.346286 sshd-session[5598]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:33.350832 systemd[1]: sshd@7-157.180.95.181:22-4.153.228.146:42106.service: Deactivated successfully. Mar 13 00:35:33.354026 systemd[1]: session-8.scope: Deactivated successfully. Mar 13 00:35:33.358315 systemd-logind[1591]: Session 8 logged out. Waiting for processes to exit. Mar 13 00:35:33.360704 systemd-logind[1591]: Removed session 8. Mar 13 00:35:38.476486 systemd[1]: Started sshd@8-157.180.95.181:22-4.153.228.146:42110.service - OpenSSH per-connection server daemon (4.153.228.146:42110). Mar 13 00:35:39.134170 sshd[5616]: Accepted publickey for core from 4.153.228.146 port 42110 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:39.135826 sshd-session[5616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:39.142567 systemd-logind[1591]: New session 9 of user core. Mar 13 00:35:39.147369 systemd[1]: Started session-9.scope - Session 9 of User core. 
Mar 13 00:35:39.588237 sshd[5619]: Connection closed by 4.153.228.146 port 42110 Mar 13 00:35:39.589898 sshd-session[5616]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:39.598623 systemd[1]: sshd@8-157.180.95.181:22-4.153.228.146:42110.service: Deactivated successfully. Mar 13 00:35:39.603889 systemd[1]: session-9.scope: Deactivated successfully. Mar 13 00:35:39.611299 systemd-logind[1591]: Session 9 logged out. Waiting for processes to exit. Mar 13 00:35:39.615022 systemd-logind[1591]: Removed session 9. Mar 13 00:35:44.723793 systemd[1]: Started sshd@9-157.180.95.181:22-4.153.228.146:43932.service - OpenSSH per-connection server daemon (4.153.228.146:43932). Mar 13 00:35:45.398205 sshd[5676]: Accepted publickey for core from 4.153.228.146 port 43932 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:45.400671 sshd-session[5676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:45.411472 systemd-logind[1591]: New session 10 of user core. Mar 13 00:35:45.418705 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 13 00:35:45.848274 sshd[5680]: Connection closed by 4.153.228.146 port 43932 Mar 13 00:35:45.849784 sshd-session[5676]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:45.855287 systemd[1]: sshd@9-157.180.95.181:22-4.153.228.146:43932.service: Deactivated successfully. Mar 13 00:35:45.859291 systemd[1]: session-10.scope: Deactivated successfully. Mar 13 00:35:45.863712 systemd-logind[1591]: Session 10 logged out. Waiting for processes to exit. Mar 13 00:35:45.866090 systemd-logind[1591]: Removed session 10. Mar 13 00:35:50.986624 systemd[1]: Started sshd@10-157.180.95.181:22-4.153.228.146:46744.service - OpenSSH per-connection server daemon (4.153.228.146:46744). 
Mar 13 00:35:51.639353 sshd[5711]: Accepted publickey for core from 4.153.228.146 port 46744 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:51.641403 sshd-session[5711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:51.646493 systemd-logind[1591]: New session 11 of user core. Mar 13 00:35:51.652333 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 13 00:35:52.090339 sshd[5714]: Connection closed by 4.153.228.146 port 46744 Mar 13 00:35:52.091449 sshd-session[5711]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:52.097528 systemd[1]: sshd@10-157.180.95.181:22-4.153.228.146:46744.service: Deactivated successfully. Mar 13 00:35:52.101924 systemd[1]: session-11.scope: Deactivated successfully. Mar 13 00:35:52.105026 systemd-logind[1591]: Session 11 logged out. Waiting for processes to exit. Mar 13 00:35:52.108241 systemd-logind[1591]: Removed session 11. Mar 13 00:35:52.233145 systemd[1]: Started sshd@11-157.180.95.181:22-4.153.228.146:46758.service - OpenSSH per-connection server daemon (4.153.228.146:46758). Mar 13 00:35:52.909687 sshd[5727]: Accepted publickey for core from 4.153.228.146 port 46758 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:52.911972 sshd-session[5727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:52.919589 systemd-logind[1591]: New session 12 of user core. Mar 13 00:35:52.927411 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 13 00:35:53.394839 sshd[5730]: Connection closed by 4.153.228.146 port 46758 Mar 13 00:35:53.396379 sshd-session[5727]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:53.401920 systemd-logind[1591]: Session 12 logged out. Waiting for processes to exit. Mar 13 00:35:53.402410 systemd[1]: sshd@11-157.180.95.181:22-4.153.228.146:46758.service: Deactivated successfully. 
Mar 13 00:35:53.406138 systemd[1]: session-12.scope: Deactivated successfully.
Mar 13 00:35:53.408776 systemd-logind[1591]: Removed session 12.
Mar 13 00:35:53.527353 systemd[1]: Started sshd@12-157.180.95.181:22-4.153.228.146:46766.service - OpenSSH per-connection server daemon (4.153.228.146:46766).
Mar 13 00:35:54.176196 sshd[5740]: Accepted publickey for core from 4.153.228.146 port 46766 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:35:54.179406 sshd-session[5740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:35:54.191260 systemd-logind[1591]: New session 13 of user core.
Mar 13 00:35:54.198508 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 13 00:35:54.629138 sshd[5743]: Connection closed by 4.153.228.146 port 46766
Mar 13 00:35:54.628387 sshd-session[5740]: pam_unix(sshd:session): session closed for user core
Mar 13 00:35:54.633126 systemd[1]: sshd@12-157.180.95.181:22-4.153.228.146:46766.service: Deactivated successfully.
Mar 13 00:35:54.635784 systemd[1]: session-13.scope: Deactivated successfully.
Mar 13 00:35:54.637879 systemd-logind[1591]: Session 13 logged out. Waiting for processes to exit.
Mar 13 00:35:54.639266 systemd-logind[1591]: Removed session 13.
Mar 13 00:35:59.765328 systemd[1]: Started sshd@13-157.180.95.181:22-4.153.228.146:42730.service - OpenSSH per-connection server daemon (4.153.228.146:42730).
Mar 13 00:36:00.422562 sshd[5803]: Accepted publickey for core from 4.153.228.146 port 42730 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:00.425348 sshd-session[5803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:00.432430 systemd-logind[1591]: New session 14 of user core.
Mar 13 00:36:00.443482 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 13 00:36:00.857961 sshd[5806]: Connection closed by 4.153.228.146 port 42730
Mar 13 00:36:00.860577 sshd-session[5803]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:00.867897 systemd[1]: sshd@13-157.180.95.181:22-4.153.228.146:42730.service: Deactivated successfully.
Mar 13 00:36:00.871966 systemd[1]: session-14.scope: Deactivated successfully.
Mar 13 00:36:00.874588 systemd-logind[1591]: Session 14 logged out. Waiting for processes to exit.
Mar 13 00:36:00.877497 systemd-logind[1591]: Removed session 14.
Mar 13 00:36:00.993808 systemd[1]: Started sshd@14-157.180.95.181:22-4.153.228.146:42738.service - OpenSSH per-connection server daemon (4.153.228.146:42738).
Mar 13 00:36:01.655176 sshd[5818]: Accepted publickey for core from 4.153.228.146 port 42738 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:01.658516 sshd-session[5818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:01.667267 systemd-logind[1591]: New session 15 of user core.
Mar 13 00:36:01.675415 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 13 00:36:02.319947 sshd[5821]: Connection closed by 4.153.228.146 port 42738
Mar 13 00:36:02.321501 sshd-session[5818]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:02.327707 systemd-logind[1591]: Session 15 logged out. Waiting for processes to exit.
Mar 13 00:36:02.327915 systemd[1]: sshd@14-157.180.95.181:22-4.153.228.146:42738.service: Deactivated successfully.
Mar 13 00:36:02.330606 systemd[1]: session-15.scope: Deactivated successfully.
Mar 13 00:36:02.333741 systemd-logind[1591]: Removed session 15.
Mar 13 00:36:02.453628 systemd[1]: Started sshd@15-157.180.95.181:22-4.153.228.146:42746.service - OpenSSH per-connection server daemon (4.153.228.146:42746).
Mar 13 00:36:03.119485 sshd[5831]: Accepted publickey for core from 4.153.228.146 port 42746 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:03.121722 sshd-session[5831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:03.128883 systemd-logind[1591]: New session 16 of user core.
Mar 13 00:36:03.135364 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 13 00:36:04.069690 sshd[5834]: Connection closed by 4.153.228.146 port 42746
Mar 13 00:36:04.071229 sshd-session[5831]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:04.077252 systemd-logind[1591]: Session 16 logged out. Waiting for processes to exit.
Mar 13 00:36:04.078066 systemd[1]: sshd@15-157.180.95.181:22-4.153.228.146:42746.service: Deactivated successfully.
Mar 13 00:36:04.080699 systemd[1]: session-16.scope: Deactivated successfully.
Mar 13 00:36:04.082912 systemd-logind[1591]: Removed session 16.
Mar 13 00:36:04.200713 systemd[1]: Started sshd@16-157.180.95.181:22-4.153.228.146:42752.service - OpenSSH per-connection server daemon (4.153.228.146:42752).
Mar 13 00:36:04.849377 sshd[5867]: Accepted publickey for core from 4.153.228.146 port 42752 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:04.851347 sshd-session[5867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:04.858132 systemd-logind[1591]: New session 17 of user core.
Mar 13 00:36:04.862432 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 13 00:36:05.406087 sshd[5870]: Connection closed by 4.153.228.146 port 42752
Mar 13 00:36:05.406897 sshd-session[5867]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:05.413320 systemd[1]: sshd@16-157.180.95.181:22-4.153.228.146:42752.service: Deactivated successfully.
Mar 13 00:36:05.416336 systemd[1]: session-17.scope: Deactivated successfully.
Mar 13 00:36:05.420874 systemd-logind[1591]: Session 17 logged out. Waiting for processes to exit.
Mar 13 00:36:05.422519 systemd-logind[1591]: Removed session 17.
Mar 13 00:36:05.537874 systemd[1]: Started sshd@17-157.180.95.181:22-4.153.228.146:42768.service - OpenSSH per-connection server daemon (4.153.228.146:42768).
Mar 13 00:36:06.189631 sshd[5882]: Accepted publickey for core from 4.153.228.146 port 42768 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:06.191821 sshd-session[5882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:06.197428 systemd-logind[1591]: New session 18 of user core.
Mar 13 00:36:06.207400 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 13 00:36:06.625973 sshd[5887]: Connection closed by 4.153.228.146 port 42768
Mar 13 00:36:06.626556 sshd-session[5882]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:06.634003 systemd-logind[1591]: Session 18 logged out. Waiting for processes to exit.
Mar 13 00:36:06.634857 systemd[1]: sshd@17-157.180.95.181:22-4.153.228.146:42768.service: Deactivated successfully.
Mar 13 00:36:06.638397 systemd[1]: session-18.scope: Deactivated successfully.
Mar 13 00:36:06.643053 systemd-logind[1591]: Removed session 18.
Mar 13 00:36:11.770782 systemd[1]: Started sshd@18-157.180.95.181:22-4.153.228.146:59894.service - OpenSSH per-connection server daemon (4.153.228.146:59894).
Mar 13 00:36:12.440452 sshd[5929]: Accepted publickey for core from 4.153.228.146 port 59894 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:12.442854 sshd-session[5929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:12.450418 systemd-logind[1591]: New session 19 of user core.
Mar 13 00:36:12.457371 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 13 00:36:12.890383 sshd[5932]: Connection closed by 4.153.228.146 port 59894
Mar 13 00:36:12.891421 sshd-session[5929]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:12.899591 systemd[1]: sshd@18-157.180.95.181:22-4.153.228.146:59894.service: Deactivated successfully.
Mar 13 00:36:12.902186 systemd[1]: session-19.scope: Deactivated successfully.
Mar 13 00:36:12.903420 systemd-logind[1591]: Session 19 logged out. Waiting for processes to exit.
Mar 13 00:36:12.905531 systemd-logind[1591]: Removed session 19.
Mar 13 00:36:18.031804 systemd[1]: Started sshd@19-157.180.95.181:22-4.153.228.146:59898.service - OpenSSH per-connection server daemon (4.153.228.146:59898).
Mar 13 00:36:18.704685 sshd[5984]: Accepted publickey for core from 4.153.228.146 port 59898 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:18.706301 sshd-session[5984]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:18.711260 systemd-logind[1591]: New session 20 of user core.
Mar 13 00:36:18.717451 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 13 00:36:19.147960 sshd[5987]: Connection closed by 4.153.228.146 port 59898
Mar 13 00:36:19.149949 sshd-session[5984]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:19.154951 systemd[1]: sshd@19-157.180.95.181:22-4.153.228.146:59898.service: Deactivated successfully.
Mar 13 00:36:19.158330 systemd-logind[1591]: Session 20 logged out. Waiting for processes to exit.
Mar 13 00:36:19.159516 systemd[1]: session-20.scope: Deactivated successfully.
Mar 13 00:36:19.162430 systemd-logind[1591]: Removed session 20.
Mar 13 00:36:35.544674 systemd[1]: cri-containerd-6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8.scope: Deactivated successfully.
Mar 13 00:36:35.545085 systemd[1]: cri-containerd-6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8.scope: Consumed 12.327s CPU time, 128.6M memory peak, 748K read from disk.
Mar 13 00:36:35.549626 containerd[1620]: time="2026-03-13T00:36:35.549556659Z" level=info msg="received container exit event container_id:\"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\" id:\"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\" pid:3118 exit_status:1 exited_at:{seconds:1773362195 nanos:547864996}"
Mar 13 00:36:35.582437 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8-rootfs.mount: Deactivated successfully.
Mar 13 00:36:35.969923 kubelet[2792]: E0313 00:36:35.969875 2792 controller.go:251] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:42418->10.0.0.2:2379: read: connection timed out"
Mar 13 00:36:36.097618 systemd[1]: cri-containerd-f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd.scope: Deactivated successfully.
Mar 13 00:36:36.098661 systemd[1]: cri-containerd-f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd.scope: Consumed 2.893s CPU time, 61.5M memory peak, 64K read from disk.
Mar 13 00:36:36.109356 containerd[1620]: time="2026-03-13T00:36:36.109275047Z" level=info msg="received container exit event container_id:\"f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd\" id:\"f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd\" pid:2620 exit_status:1 exited_at:{seconds:1773362196 nanos:108780760}"
Mar 13 00:36:36.145933 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd-rootfs.mount: Deactivated successfully.
Mar 13 00:36:36.492569 kubelet[2792]: I0313 00:36:36.492498 2792 scope.go:122] "RemoveContainer" containerID="f659ef2d455179e956f36b02d5c1c7e67eb201aa9ca7f56cb90f78eab9d93ebd"
Mar 13 00:36:36.495832 kubelet[2792]: I0313 00:36:36.495795 2792 scope.go:122] "RemoveContainer" containerID="6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8"
Mar 13 00:36:36.496754 containerd[1620]: time="2026-03-13T00:36:36.496640338Z" level=info msg="CreateContainer within sandbox \"0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 13 00:36:36.498137 containerd[1620]: time="2026-03-13T00:36:36.498064842Z" level=info msg="CreateContainer within sandbox \"cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 13 00:36:36.515847 containerd[1620]: time="2026-03-13T00:36:36.514282714Z" level=info msg="Container 45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:36:36.518563 containerd[1620]: time="2026-03-13T00:36:36.516281745Z" level=info msg="Container ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:36:36.526141 containerd[1620]: time="2026-03-13T00:36:36.526093344Z" level=info msg="CreateContainer within sandbox \"0e9968d0f5e68aa830fb5350c73dc172e9df58ae3d2404995c075d519e2496dd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a\""
Mar 13 00:36:36.526915 containerd[1620]: time="2026-03-13T00:36:36.526847171Z" level=info msg="CreateContainer within sandbox \"cbc06db47363952d44679304961a729edd4ca41f68e91357ed4d827d071f470e\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c\""
Mar 13 00:36:36.528863 containerd[1620]: time="2026-03-13T00:36:36.528830124Z" level=info msg="StartContainer for \"ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c\""
Mar 13 00:36:36.530447 containerd[1620]: time="2026-03-13T00:36:36.530398147Z" level=info msg="connecting to shim ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c" address="unix:///run/containerd/s/afc51b8c363341cab57fc7c55c7b33b71b46e462a6da4f29d5d26a8a5830b7f1" protocol=ttrpc version=3
Mar 13 00:36:36.530853 containerd[1620]: time="2026-03-13T00:36:36.530757735Z" level=info msg="StartContainer for \"45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a\""
Mar 13 00:36:36.532818 containerd[1620]: time="2026-03-13T00:36:36.532775997Z" level=info msg="connecting to shim 45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a" address="unix:///run/containerd/s/7a18c8e172315a4fd36af06976cfc2be4081a82c6bea9e605b3180beede49f52" protocol=ttrpc version=3
Mar 13 00:36:36.563321 systemd[1]: Started cri-containerd-ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c.scope - libcontainer container ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c.
Mar 13 00:36:36.566362 systemd[1]: Started cri-containerd-45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a.scope - libcontainer container 45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a.
Mar 13 00:36:36.613569 containerd[1620]: time="2026-03-13T00:36:36.613501619Z" level=info msg="StartContainer for \"ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c\" returns successfully"
Mar 13 00:36:36.639213 containerd[1620]: time="2026-03-13T00:36:36.639140452Z" level=info msg="StartContainer for \"45ac46eb94327804a527bde1ce3e36822296ff9cc26f1e35a2f5733cfdb3ee5a\" returns successfully"
Mar 13 00:36:40.065348 kubelet[2792]: E0313 00:36:40.063251 2792 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:42000->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-7393fd8643.189c3f840ac0c7ea kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-7393fd8643,UID:6e7dd5f523e82d58e4391e484d4a43f5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-7393fd8643,},FirstTimestamp:2026-03-13 00:36:29.62577201 +0000 UTC m=+150.865247303,LastTimestamp:2026-03-13 00:36:29.62577201 +0000 UTC m=+150.865247303,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-7393fd8643,}"
Mar 13 00:36:41.322163 systemd[1]: cri-containerd-88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e.scope: Deactivated successfully.
Mar 13 00:36:41.323971 systemd[1]: cri-containerd-88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e.scope: Consumed 1.304s CPU time, 21.1M memory peak, 64K read from disk.
Mar 13 00:36:41.326078 containerd[1620]: time="2026-03-13T00:36:41.326015037Z" level=info msg="received container exit event container_id:\"88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e\" id:\"88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e\" pid:2651 exit_status:1 exited_at:{seconds:1773362201 nanos:325334119}"
Mar 13 00:36:41.362576 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e-rootfs.mount: Deactivated successfully.
Mar 13 00:36:41.519636 kubelet[2792]: I0313 00:36:41.519584 2792 scope.go:122] "RemoveContainer" containerID="88ec2912720f177b0f30cde776237bd8a409f50716142900b067729edfeacd3e"
Mar 13 00:36:41.522957 containerd[1620]: time="2026-03-13T00:36:41.522839104Z" level=info msg="CreateContainer within sandbox \"6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 13 00:36:41.534443 containerd[1620]: time="2026-03-13T00:36:41.533339764Z" level=info msg="Container 9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:36:41.544426 containerd[1620]: time="2026-03-13T00:36:41.544352171Z" level=info msg="CreateContainer within sandbox \"6ff7590e5c496e9f7fad61bfbe1fae1001231010324d04c3928d5ad146a00405\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8\""
Mar 13 00:36:41.546193 containerd[1620]: time="2026-03-13T00:36:41.545681176Z" level=info msg="StartContainer for \"9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8\""
Mar 13 00:36:41.547918 containerd[1620]: time="2026-03-13T00:36:41.547858858Z" level=info msg="connecting to shim 9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8" address="unix:///run/containerd/s/1eb4258441f488811d405c6f3ffac03d853ee99c4e915ef71e1ac2903a1f2685" protocol=ttrpc version=3
Mar 13 00:36:41.576339 systemd[1]: Started cri-containerd-9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8.scope - libcontainer container 9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8.
Mar 13 00:36:41.642398 containerd[1620]: time="2026-03-13T00:36:41.642312132Z" level=info msg="StartContainer for \"9fc5a5d53a14f205dc271b6f31e3050a50bb3fc17b7d40de2878f1e23135d5f8\" returns successfully"
Mar 13 00:36:45.611414 systemd[1]: cri-containerd-ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c.scope: Deactivated successfully.
Mar 13 00:36:45.616357 containerd[1620]: time="2026-03-13T00:36:45.616299603Z" level=info msg="received container exit event container_id:\"ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c\" id:\"ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c\" pid:6115 exit_status:1 exited_at:{seconds:1773362205 nanos:615825185}"
Mar 13 00:36:45.673028 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c-rootfs.mount: Deactivated successfully.
Mar 13 00:36:45.970816 kubelet[2792]: E0313 00:36:45.970630 2792 controller.go:251] "Failed to update lease" err="Put \"https://157.180.95.181:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-7393fd8643?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 00:36:46.537159 kubelet[2792]: I0313 00:36:46.537041 2792 scope.go:122] "RemoveContainer" containerID="6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8"
Mar 13 00:36:46.537547 kubelet[2792]: I0313 00:36:46.537507 2792 scope.go:122] "RemoveContainer" containerID="ff3ca3edec50a2543ca9599d17d3cb02b428a0887eb51afead76dd24c694a28c"
Mar 13 00:36:46.537915 kubelet[2792]: E0313 00:36:46.537778 2792 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6cf4cccc57-679h9_tigera-operator(0938926d-a751-4f1e-9ecb-13b724848847)\"" pod="tigera-operator/tigera-operator-6cf4cccc57-679h9" podUID="0938926d-a751-4f1e-9ecb-13b724848847"
Mar 13 00:36:46.540804 containerd[1620]: time="2026-03-13T00:36:46.540422245Z" level=info msg="RemoveContainer for \"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\""
Mar 13 00:36:46.549800 containerd[1620]: time="2026-03-13T00:36:46.549715072Z" level=info msg="RemoveContainer for \"6cc418d0e56ff32a5ac1bdd3cfacd844ba924d92d3b3d53b618ec8d9c33259d8\" returns successfully"
Mar 13 00:36:55.971643 kubelet[2792]: E0313 00:36:55.971276 2792 controller.go:251] "Failed to update lease" err="Put \"https://157.180.95.181:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-7393fd8643?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"