Mar 7 01:06:25.010135 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Mar 6 22:58:19 -00 2026
Mar 7 01:06:25.010153 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=531e046a631dbba7b4aae1b7955ffa961f5ce7d570e89a624d767cf739ab70b5
Mar 7 01:06:25.010162 kernel: BIOS-provided physical RAM map:
Mar 7 01:06:25.010166 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 7 01:06:25.010171 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 7 01:06:25.010175 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 7 01:06:25.010180 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 7 01:06:25.010185 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007f9ecfff] reserved
Mar 7 01:06:25.010189 kernel: BIOS-e820: [mem 0x000000007f9ed000-0x000000007faecfff] type 20
Mar 7 01:06:25.010206 kernel: BIOS-e820: [mem 0x000000007faed000-0x000000007fb6cfff] reserved
Mar 7 01:06:25.010211 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 7 01:06:25.010218 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 7 01:06:25.010222 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 7 01:06:25.010226 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 7 01:06:25.010232 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 7 01:06:25.010236 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 7 01:06:25.010243 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 7 01:06:25.010248 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 7 01:06:25.010252 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 7 01:06:25.010257 kernel: NX (Execute Disable) protection: active
Mar 7 01:06:25.010261 kernel: APIC: Static calls initialized
Mar 7 01:06:25.010266 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 7 01:06:25.010271 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e84f198
Mar 7 01:06:25.010275 kernel: efi: Remove mem137: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 7 01:06:25.010280 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 7 01:06:25.010285 kernel: SMBIOS 3.0.0 present.
Mar 7 01:06:25.010289 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 7 01:06:25.010294 kernel: Hypervisor detected: KVM
Mar 7 01:06:25.010301 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 7 01:06:25.010306 kernel: kvm-clock: using sched offset of 12963821632 cycles
Mar 7 01:06:25.010310 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 7 01:06:25.010315 kernel: tsc: Detected 2396.400 MHz processor
Mar 7 01:06:25.010320 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 7 01:06:25.010325 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 7 01:06:25.010330 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 7 01:06:25.010334 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 7 01:06:25.010339 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 7 01:06:25.010346 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 7 01:06:25.010351 kernel: Using GB pages for direct mapping
Mar 7 01:06:25.010356 kernel: Secure boot disabled
Mar 7 01:06:25.010363 kernel: ACPI: Early table checksum verification disabled
Mar 7 01:06:25.010368 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 7 01:06:25.010373 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 7 01:06:25.010378 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:06:25.010386 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:06:25.010391 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 7 01:06:25.010395 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:06:25.010403 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:06:25.010411 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:06:25.010419 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:06:25.010426 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 7 01:06:25.010436 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 7 01:06:25.010444 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 7 01:06:25.010450 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 7 01:06:25.010455 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 7 01:06:25.010460 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 7 01:06:25.010465 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 7 01:06:25.010470 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 7 01:06:25.010475 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 7 01:06:25.010480 kernel: No NUMA configuration found
Mar 7 01:06:25.010488 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 7 01:06:25.010493 kernel: NODE_DATA(0) allocated [mem 0x179ffa000-0x179ffffff]
Mar 7 01:06:25.010498 kernel: Zone ranges:
Mar 7 01:06:25.010503 kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Mar 7 01:06:25.010508 kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Mar 7 01:06:25.010513 kernel:   Normal   [mem 0x0000000100000000-0x0000000179ffffff]
Mar 7 01:06:25.010518 kernel: Movable zone start for each node
Mar 7 01:06:25.010523 kernel: Early memory node ranges
Mar 7 01:06:25.010528 kernel:   node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 7 01:06:25.010533 kernel:   node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 7 01:06:25.010540 kernel:   node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 7 01:06:25.010545 kernel:   node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 7 01:06:25.010550 kernel:   node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 7 01:06:25.010555 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 7 01:06:25.010560 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 7 01:06:25.010565 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 7 01:06:25.010570 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 7 01:06:25.010575 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 7 01:06:25.010582 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 7 01:06:25.010593 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 7 01:06:25.010600 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 7 01:06:25.010607 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 7 01:06:25.010615 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 7 01:06:25.010623 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 7 01:06:25.010628 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 7 01:06:25.010633 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 7 01:06:25.010638 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 7 01:06:25.010643 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 7 01:06:25.010651 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 7 01:06:25.010655 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 7 01:06:25.010660 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 7 01:06:25.010665 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 7 01:06:25.010670 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 7 01:06:25.010675 kernel: Booting paravirtualized kernel on KVM
Mar 7 01:06:25.010680 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 7 01:06:25.010685 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 7 01:06:25.010700 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u1048576
Mar 7 01:06:25.010708 kernel: pcpu-alloc: s196328 r8192 d28952 u1048576 alloc=1*2097152
Mar 7 01:06:25.010713 kernel: pcpu-alloc: [0] 0 1
Mar 7 01:06:25.010718 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 7 01:06:25.010724 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=531e046a631dbba7b4aae1b7955ffa961f5ce7d570e89a624d767cf739ab70b5
Mar 7 01:06:25.010729 kernel: random: crng init done
Mar 7 01:06:25.010734 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 01:06:25.010739 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 01:06:25.010744 kernel: Fallback order for Node 0: 0
Mar 7 01:06:25.010752 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1004632
Mar 7 01:06:25.010757 kernel: Policy zone: Normal
Mar 7 01:06:25.010762 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 01:06:25.010766 kernel: software IO TLB: area num 2.
Mar 7 01:06:25.010772 kernel: Memory: 3819392K/4091168K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 271572K reserved, 0K cma-reserved)
Mar 7 01:06:25.010777 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 01:06:25.010782 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 7 01:06:25.010786 kernel: ftrace: allocated 149 pages with 4 groups
Mar 7 01:06:25.010791 kernel: Dynamic Preempt: voluntary
Mar 7 01:06:25.010799 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 01:06:25.010804 kernel: rcu: RCU event tracing is enabled.
Mar 7 01:06:25.010810 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 01:06:25.010815 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 01:06:25.010827 kernel: Rude variant of Tasks RCU enabled.
Mar 7 01:06:25.010835 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 01:06:25.010840 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 01:06:25.010845 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 01:06:25.010851 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 7 01:06:25.010856 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 01:06:25.010861 kernel: Console: colour dummy device 80x25
Mar 7 01:06:25.010867 kernel: printk: console [tty0] enabled
Mar 7 01:06:25.010875 kernel: printk: console [ttyS0] enabled
Mar 7 01:06:25.010880 kernel: ACPI: Core revision 20230628
Mar 7 01:06:25.010885 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 7 01:06:25.010891 kernel: APIC: Switch to symmetric I/O mode setup
Mar 7 01:06:25.010896 kernel: x2apic enabled
Mar 7 01:06:25.010901 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 7 01:06:25.010909 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 7 01:06:25.010914 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 7 01:06:25.010919 kernel: Calibrating delay loop (skipped) preset value.. 4792.80 BogoMIPS (lpj=2396400)
Mar 7 01:06:25.010924 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 7 01:06:25.010929 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 7 01:06:25.010935 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 7 01:06:25.010940 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 7 01:06:25.010945 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 7 01:06:25.010953 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 7 01:06:25.010958 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 7 01:06:25.010963 kernel: active return thunk: srso_alias_return_thunk
Mar 7 01:06:25.010968 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 7 01:06:25.010973 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 7 01:06:25.010979 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 7 01:06:25.010984 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 7 01:06:25.010989 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 7 01:06:25.010994 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 7 01:06:25.011003 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 7 01:06:25.011008 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 7 01:06:25.011013 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 7 01:06:25.011018 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 7 01:06:25.011026 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 7 01:06:25.011035 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 7 01:06:25.011043 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 7 01:06:25.011051 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 7 01:06:25.011057 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 7 01:06:25.011065 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 7 01:06:25.011070 kernel: Freeing SMP alternatives memory: 32K
Mar 7 01:06:25.011075 kernel: pid_max: default: 32768 minimum: 301
Mar 7 01:06:25.011080 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 7 01:06:25.011086 kernel: landlock: Up and running.
Mar 7 01:06:25.011091 kernel: SELinux: Initializing.
Mar 7 01:06:25.011096 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:06:25.011101 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:06:25.011106 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 7 01:06:25.011114 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:06:25.011119 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:06:25.011125 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:06:25.011130 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 7 01:06:25.011138 kernel: ... version:                0
Mar 7 01:06:25.011146 kernel: ... bit width:              48
Mar 7 01:06:25.011154 kernel: ... generic registers:      6
Mar 7 01:06:25.011162 kernel: ... value mask:             0000ffffffffffff
Mar 7 01:06:25.011171 kernel: ... max period:             00007fffffffffff
Mar 7 01:06:25.011182 kernel: ... fixed-purpose events:   0
Mar 7 01:06:25.011190 kernel: ... event mask:             000000000000003f
Mar 7 01:06:25.014139 kernel: signal: max sigframe size: 3376
Mar 7 01:06:25.014155 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 01:06:25.014161 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 01:06:25.014167 kernel: smp: Bringing up secondary CPUs ...
Mar 7 01:06:25.014172 kernel: smpboot: x86: Booting SMP configuration:
Mar 7 01:06:25.014178 kernel: .... node #0, CPUs: #1
Mar 7 01:06:25.014183 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 01:06:25.014207 kernel: smpboot: Max logical packages: 1
Mar 7 01:06:25.014215 kernel: smpboot: Total of 2 processors activated (9585.60 BogoMIPS)
Mar 7 01:06:25.014224 kernel: devtmpfs: initialized
Mar 7 01:06:25.014231 kernel: x86/mm: Memory block size: 128MB
Mar 7 01:06:25.014237 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 7 01:06:25.014242 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 01:06:25.014247 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 01:06:25.014253 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 01:06:25.014262 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 01:06:25.014274 kernel: audit: initializing netlink subsys (disabled)
Mar 7 01:06:25.014282 kernel: audit: type=2000 audit(1772845583.196:1): state=initialized audit_enabled=0 res=1
Mar 7 01:06:25.014289 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 01:06:25.014295 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 7 01:06:25.014300 kernel: cpuidle: using governor menu
Mar 7 01:06:25.014305 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 01:06:25.014311 kernel: dca service started, version 1.12.1
Mar 7 01:06:25.014316 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Mar 7 01:06:25.014321 kernel: PCI: Using configuration type 1 for base access
Mar 7 01:06:25.014329 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 7 01:06:25.014335 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 01:06:25.014340 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 01:06:25.014345 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 01:06:25.014350 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 01:06:25.014355 kernel: ACPI: Added _OSI(Module Device)
Mar 7 01:06:25.014360 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 01:06:25.014366 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 01:06:25.014371 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 01:06:25.014378 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 7 01:06:25.014384 kernel: ACPI: Interpreter enabled
Mar 7 01:06:25.014389 kernel: ACPI: PM: (supports S0 S5)
Mar 7 01:06:25.014394 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 7 01:06:25.014399 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 7 01:06:25.014404 kernel: PCI: Using E820 reservations for host bridge windows
Mar 7 01:06:25.014409 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 7 01:06:25.014414 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 7 01:06:25.014570 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 7 01:06:25.014682 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 7 01:06:25.014792 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 7 01:06:25.014798 kernel: PCI host bridge to bus 0000:00
Mar 7 01:06:25.014998 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 7 01:06:25.015473 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 7 01:06:25.015611 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 7 01:06:25.015716 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 7 01:06:25.015804 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 7 01:06:25.015892 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 7 01:06:25.015980 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 7 01:06:25.016093 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 7 01:06:25.018166 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Mar 7 01:06:25.018343 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80000000-0x807fffff pref]
Mar 7 01:06:25.018451 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 7 01:06:25.018561 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8138a000-0x8138afff]
Mar 7 01:06:25.018659 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xffff0000-0xffffffff pref]
Mar 7 01:06:25.018766 kernel: pci 0000:00:01.0: BAR 0: assigned to efifb
Mar 7 01:06:25.018864 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 7 01:06:25.018968 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.019068 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x81389000-0x81389fff]
Mar 7 01:06:25.019173 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.019293 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x81388000-0x81388fff]
Mar 7 01:06:25.019401 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.019500 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x81387000-0x81387fff]
Mar 7 01:06:25.019603 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.019712 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x81386000-0x81386fff]
Mar 7 01:06:25.019816 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.019913 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x81385000-0x81385fff]
Mar 7 01:06:25.020015 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.020111 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x81384000-0x81384fff]
Mar 7 01:06:25.021130 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.021258 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x81383000-0x81383fff]
Mar 7 01:06:25.021370 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.021466 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x81382000-0x81382fff]
Mar 7 01:06:25.021568 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Mar 7 01:06:25.021676 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x81381000-0x81381fff]
Mar 7 01:06:25.021806 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 7 01:06:25.021905 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 7 01:06:25.022012 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 7 01:06:25.022111 kernel: pci 0000:00:1f.2: reg 0x20: [io 0x6040-0x605f]
Mar 7 01:06:25.024309 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0x81380000-0x81380fff]
Mar 7 01:06:25.024432 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 7 01:06:25.024531 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x6000-0x603f]
Mar 7 01:06:25.024639 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Mar 7 01:06:25.024756 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x81200000-0x81200fff]
Mar 7 01:06:25.024857 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 7 01:06:25.024959 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 7 01:06:25.025058 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 7 01:06:25.025154 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 7 01:06:25.025294 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 7 01:06:25.025405 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 7 01:06:25.025529 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x81100000-0x81103fff 64bit]
Mar 7 01:06:25.025632 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 7 01:06:25.025751 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 7 01:06:25.025858 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Mar 7 01:06:25.025959 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x81000000-0x81000fff]
Mar 7 01:06:25.026058 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 7 01:06:25.026154 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 7 01:06:25.029547 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 7 01:06:25.029662 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 7 01:06:25.029783 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Mar 7 01:06:25.029885 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 7 01:06:25.029981 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 7 01:06:25.030076 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 7 01:06:25.030186 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 7 01:06:25.030304 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x80f00000-0x80f00fff]
Mar 7 01:06:25.030405 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 7 01:06:25.030501 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 7 01:06:25.030596 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 7 01:06:25.030697 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 7 01:06:25.030802 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Mar 7 01:06:25.030901 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x80e00000-0x80e00fff]
Mar 7 01:06:25.031004 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 7 01:06:25.031098 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 7 01:06:25.031218 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 7 01:06:25.031316 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 7 01:06:25.031322 kernel: acpiphp: Slot [0] registered
Mar 7 01:06:25.031428 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Mar 7 01:06:25.031527 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x80c00000-0x80c00fff]
Mar 7 01:06:25.031626 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 7 01:06:25.031737 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 7 01:06:25.031831 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 7 01:06:25.031925 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 7 01:06:25.032019 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 7 01:06:25.032025 kernel: acpiphp: Slot [0-2] registered
Mar 7 01:06:25.032119 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 7 01:06:25.032231 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 7 01:06:25.032326 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 7 01:06:25.032335 kernel: acpiphp: Slot [0-3] registered
Mar 7 01:06:25.032432 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 7 01:06:25.032527 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 7 01:06:25.032621 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 7 01:06:25.032627 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 7 01:06:25.032633 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 7 01:06:25.032638 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 7 01:06:25.032643 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 7 01:06:25.032651 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 7 01:06:25.032657 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 7 01:06:25.032662 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 7 01:06:25.032667 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 7 01:06:25.032672 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 7 01:06:25.032678 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 7 01:06:25.032683 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 7 01:06:25.032688 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 7 01:06:25.032702 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 7 01:06:25.032710 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 7 01:06:25.032715 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 7 01:06:25.032720 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 7 01:06:25.032725 kernel: iommu: Default domain type: Translated
Mar 7 01:06:25.032731 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 7 01:06:25.032736 kernel: efivars: Registered efivars operations
Mar 7 01:06:25.032741 kernel: PCI: Using ACPI for IRQ routing
Mar 7 01:06:25.032747 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 7 01:06:25.032752 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 7 01:06:25.032759 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 7 01:06:25.032765 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 7 01:06:25.032770 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 7 01:06:25.032866 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 7 01:06:25.032961 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 7 01:06:25.033054 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 7 01:06:25.033060 kernel: vgaarb: loaded
Mar 7 01:06:25.033066 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 7 01:06:25.033071 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 7 01:06:25.033079 kernel: clocksource: Switched to clocksource kvm-clock
Mar 7 01:06:25.033084 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 01:06:25.033090 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 01:06:25.033095 kernel: pnp: PnP ACPI init
Mar 7 01:06:25.034817 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Mar 7 01:06:25.034831 kernel: pnp: PnP ACPI: found 5 devices
Mar 7 01:06:25.034837 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 7 01:06:25.034843 kernel: NET: Registered PF_INET protocol family
Mar 7 01:06:25.034865 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 01:06:25.034873 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 01:06:25.034879 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 01:06:25.034884 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 01:06:25.034889 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 01:06:25.034895 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 01:06:25.034900 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:06:25.034906 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:06:25.034911 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 01:06:25.034919 kernel: NET: Registered PF_XDP protocol family
Mar 7 01:06:25.035037 kernel: pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff80000-0xffffffff pref]: no compatible bridge window
Mar 7 01:06:25.035143 kernel: pci 0000:07:00.0: can't claim BAR 6 [mem 0xfff80000-0xffffffff pref]: no compatible bridge window
Mar 7 01:06:25.035255 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 7 01:06:25.035353 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 7 01:06:25.035449 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 7 01:06:25.035545 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Mar 7 01:06:25.035645 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Mar 7 01:06:25.035753 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Mar 7 01:06:25.035859 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x81280000-0x812fffff pref]
Mar 7 01:06:25.035956 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 7 01:06:25.036056 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 7 01:06:25.036152 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 7 01:06:25.036838 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 7 01:06:25.036942 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 7 01:06:25.037039 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 7 01:06:25.037135 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 7 01:06:25.037244 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 7 01:06:25.037339 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 7 01:06:25.037434 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 7 01:06:25.037534 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 7 01:06:25.037636 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 7 01:06:25.037754 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 7 01:06:25.037879 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 7 01:06:25.037993 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 7 01:06:25.038092 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 7 01:06:25.038235 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x80c80000-0x80cfffff pref]
Mar 7 01:06:25.038336 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 7 01:06:25.038436 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Mar 7 01:06:25.038531 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 7 01:06:25.038628 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 7 01:06:25.038731 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 7 01:06:25.038844 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Mar 7 01:06:25.038964 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 7 01:06:25.039063 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 7 01:06:25.039163 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 7 01:06:25.039274 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Mar 7 01:06:25.039372 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 7 01:06:25.039467 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 7 01:06:25.039557 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 7 01:06:25.039645 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 7 01:06:25.039745 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 7 01:06:25.039834 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window]
Mar 7 01:06:25.039956 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Mar 7 01:06:25.040043 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window]
Mar 7 01:06:25.040146 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff]
Mar 7 01:06:25.040252 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 7 01:06:25.040351 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff]
Mar 7 01:06:25.040453 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff]
Mar 7 01:06:25.040545 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 7 01:06:25.040646 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 7 01:06:25.040773 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff]
Mar 7 01:06:25.040867 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 7 01:06:25.040966 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff]
Mar 7 01:06:25.041062 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 7 01:06:25.041159 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Mar 7 01:06:25.041271 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff]
Mar 7 01:06:25.041363 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 7 01:06:25.041461 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Mar 7 01:06:25.041554 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff]
Mar 7 01:06:25.041667 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 7 01:06:25.041793 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Mar 7 01:06:25.041887 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff]
Mar 7 01:06:25.041979 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 7 01:06:25.041987 kernel: ACPI: \_SB_.GSIG:
Enabled at IRQ 22 Mar 7 01:06:25.041993 kernel: PCI: CLS 0 bytes, default 64 Mar 7 01:06:25.041998 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 7 01:06:25.042004 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 7 01:06:25.042010 kernel: Initialise system trusted keyrings Mar 7 01:06:25.042018 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 7 01:06:25.042023 kernel: Key type asymmetric registered Mar 7 01:06:25.042029 kernel: Asymmetric key parser 'x509' registered Mar 7 01:06:25.042034 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 7 01:06:25.042040 kernel: io scheduler mq-deadline registered Mar 7 01:06:25.042046 kernel: io scheduler kyber registered Mar 7 01:06:25.042051 kernel: io scheduler bfq registered Mar 7 01:06:25.042151 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 7 01:06:25.045182 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 7 01:06:25.045307 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 7 01:06:25.045405 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 7 01:06:25.045500 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 7 01:06:25.045596 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 7 01:06:25.045718 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 7 01:06:25.045815 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 7 01:06:25.045910 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 7 01:06:25.046004 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 7 01:06:25.046102 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 7 01:06:25.046236 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 7 01:06:25.046334 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 7 01:06:25.046429 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 7 01:06:25.046524 kernel: pcieport 0000:00:02.7: 
PME: Signaling with IRQ 31 Mar 7 01:06:25.046618 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 7 01:06:25.046625 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 7 01:06:25.046727 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 7 01:06:25.046826 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 7 01:06:25.046833 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 7 01:06:25.046839 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 7 01:06:25.046844 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 7 01:06:25.046850 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 7 01:06:25.046856 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 7 01:06:25.046861 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 7 01:06:25.046867 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 7 01:06:25.046876 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 7 01:06:25.046981 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 7 01:06:25.047073 kernel: rtc_cmos 00:03: registered as rtc0 Mar 7 01:06:25.047163 kernel: rtc_cmos 00:03: setting system clock to 2026-03-07T01:06:24 UTC (1772845584) Mar 7 01:06:25.048141 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 7 01:06:25.048153 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 7 01:06:25.048160 kernel: efifb: probing for efifb Mar 7 01:06:25.048165 kernel: efifb: framebuffer at 0x80000000, using 4032k, total 4032k Mar 7 01:06:25.048171 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 7 01:06:25.048180 kernel: efifb: scrolling: redraw Mar 7 01:06:25.048186 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 7 01:06:25.048211 kernel: Console: switching to colour frame buffer device 160x50 Mar 7 01:06:25.048232 kernel: fb0: EFI VGA frame buffer device Mar 7 01:06:25.048249 kernel: 
pstore: Using crash dump compression: deflate Mar 7 01:06:25.048273 kernel: pstore: Registered efi_pstore as persistent store backend Mar 7 01:06:25.048294 kernel: NET: Registered PF_INET6 protocol family Mar 7 01:06:25.048316 kernel: Segment Routing with IPv6 Mar 7 01:06:25.048337 kernel: In-situ OAM (IOAM) with IPv6 Mar 7 01:06:25.048365 kernel: NET: Registered PF_PACKET protocol family Mar 7 01:06:25.048386 kernel: Key type dns_resolver registered Mar 7 01:06:25.048402 kernel: IPI shorthand broadcast: enabled Mar 7 01:06:25.048421 kernel: sched_clock: Marking stable (1362011498, 215343323)->(1634137014, -56782193) Mar 7 01:06:25.048441 kernel: registered taskstats version 1 Mar 7 01:06:25.048457 kernel: Loading compiled-in X.509 certificates Mar 7 01:06:25.048477 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: da286e6f6c247ee6f65a875c513de7da57782e90' Mar 7 01:06:25.048497 kernel: Key type .fscrypt registered Mar 7 01:06:25.048514 kernel: Key type fscrypt-provisioning registered Mar 7 01:06:25.048542 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 7 01:06:25.048558 kernel: ima: Allocated hash algorithm: sha1
Mar 7 01:06:25.048580 kernel: ima: No architecture policies found
Mar 7 01:06:25.048595 kernel: clk: Disabling unused clocks
Mar 7 01:06:25.048614 kernel: Freeing unused kernel image (initmem) memory: 42892K
Mar 7 01:06:25.048634 kernel: Write protecting the kernel read-only data: 36864k
Mar 7 01:06:25.048650 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K
Mar 7 01:06:25.048670 kernel: Run /init as init process
Mar 7 01:06:25.048685 kernel: with arguments:
Mar 7 01:06:25.048721 kernel: /init
Mar 7 01:06:25.048737 kernel: with environment:
Mar 7 01:06:25.048757 kernel: HOME=/
Mar 7 01:06:25.048773 kernel: TERM=linux
Mar 7 01:06:25.048784 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 01:06:25.048791 systemd[1]: Detected virtualization kvm.
Mar 7 01:06:25.048797 systemd[1]: Detected architecture x86-64.
Mar 7 01:06:25.048806 systemd[1]: Running in initrd.
Mar 7 01:06:25.048811 systemd[1]: No hostname configured, using default hostname.
Mar 7 01:06:25.048817 systemd[1]: Hostname set to .
Mar 7 01:06:25.048823 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 01:06:25.048829 systemd[1]: Queued start job for default target initrd.target.
Mar 7 01:06:25.048835 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:06:25.048840 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:06:25.048847 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 7 01:06:25.048855 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 01:06:25.048861 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 7 01:06:25.048867 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 7 01:06:25.048874 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 7 01:06:25.048880 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 7 01:06:25.048886 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:06:25.048892 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:06:25.048900 systemd[1]: Reached target paths.target - Path Units.
Mar 7 01:06:25.048908 systemd[1]: Reached target slices.target - Slice Units.
Mar 7 01:06:25.048913 systemd[1]: Reached target swap.target - Swaps.
Mar 7 01:06:25.048919 systemd[1]: Reached target timers.target - Timer Units.
Mar 7 01:06:25.048925 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:06:25.048931 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:06:25.048937 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 7 01:06:25.048942 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 7 01:06:25.048951 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:06:25.048957 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:06:25.048963 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:06:25.048968 systemd[1]: Reached target sockets.target - Socket Units.
Mar 7 01:06:25.048974 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 7 01:06:25.048980 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 7 01:06:25.048986 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 7 01:06:25.048991 systemd[1]: Starting systemd-fsck-usr.service...
Mar 7 01:06:25.048997 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 7 01:06:25.049005 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 7 01:06:25.049011 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:06:25.049017 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 7 01:06:25.049039 systemd-journald[189]: Collecting audit messages is disabled.
Mar 7 01:06:25.049055 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:06:25.049061 systemd[1]: Finished systemd-fsck-usr.service.
Mar 7 01:06:25.049067 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 7 01:06:25.049074 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:06:25.049083 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 7 01:06:25.049089 systemd-journald[189]: Journal started
Mar 7 01:06:25.049102 systemd-journald[189]: Runtime Journal (/run/log/journal/3b9bc323bb6d4d8e8bdb8041b859ffd3) is 8.0M, max 76.3M, 68.3M free.
Mar 7 01:06:25.015580 systemd-modules-load[190]: Inserted module 'overlay'
Mar 7 01:06:25.056226 kernel: Bridge firewalling registered
Mar 7 01:06:25.059133 systemd-modules-load[190]: Inserted module 'br_netfilter'
Mar 7 01:06:25.061967 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:06:25.061983 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 7 01:06:25.062753 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:06:25.063861 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 7 01:06:25.069382 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 7 01:06:25.071323 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 7 01:06:25.077054 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 7 01:06:25.079445 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:06:25.082503 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 7 01:06:25.091179 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:06:25.092869 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:06:25.099346 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 7 01:06:25.100493 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:06:25.103745 dracut-cmdline[216]: dracut-dracut-053
Mar 7 01:06:25.108978 dracut-cmdline[216]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=531e046a631dbba7b4aae1b7955ffa961f5ce7d570e89a624d767cf739ab70b5
Mar 7 01:06:25.124967 systemd-resolved[226]: Positive Trust Anchors:
Mar 7 01:06:25.124978 systemd-resolved[226]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 7 01:06:25.125000 systemd-resolved[226]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 7 01:06:25.129437 systemd-resolved[226]: Defaulting to hostname 'linux'.
Mar 7 01:06:25.130363 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 7 01:06:25.130947 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:06:25.183233 kernel: SCSI subsystem initialized
Mar 7 01:06:25.191226 kernel: Loading iSCSI transport class v2.0-870.
Mar 7 01:06:25.200232 kernel: iscsi: registered transport (tcp)
Mar 7 01:06:25.217853 kernel: iscsi: registered transport (qla4xxx)
Mar 7 01:06:25.217894 kernel: QLogic iSCSI HBA Driver
Mar 7 01:06:25.258447 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:06:25.263323 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 7 01:06:25.285478 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 7 01:06:25.285512 kernel: device-mapper: uevent: version 1.0.3
Mar 7 01:06:25.289360 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 7 01:06:25.327220 kernel: raid6: avx512x4 gen() 49168 MB/s
Mar 7 01:06:25.345216 kernel: raid6: avx512x2 gen() 52409 MB/s
Mar 7 01:06:25.363255 kernel: raid6: avx512x1 gen() 46945 MB/s
Mar 7 01:06:25.381236 kernel: raid6: avx2x4 gen() 50551 MB/s
Mar 7 01:06:25.399244 kernel: raid6: avx2x2 gen() 55751 MB/s
Mar 7 01:06:25.418355 kernel: raid6: avx2x1 gen() 43116 MB/s
Mar 7 01:06:25.418425 kernel: raid6: using algorithm avx2x2 gen() 55751 MB/s
Mar 7 01:06:25.438341 kernel: raid6: .... xor() 35333 MB/s, rmw enabled
Mar 7 01:06:25.438408 kernel: raid6: using avx512x2 recovery algorithm
Mar 7 01:06:25.455248 kernel: xor: automatically using best checksumming function avx
Mar 7 01:06:25.565829 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 7 01:06:25.581084 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:06:25.586478 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:06:25.597190 systemd-udevd[407]: Using default interface naming scheme 'v255'.
Mar 7 01:06:25.601037 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:06:25.611316 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 7 01:06:25.629466 dracut-pre-trigger[415]: rd.md=0: removing MD RAID activation
Mar 7 01:06:25.673875 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 01:06:25.680370 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 7 01:06:25.753486 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:06:25.763786 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 7 01:06:25.794235 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:06:25.795495 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:06:25.795985 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:06:25.796583 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 7 01:06:25.803326 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 7 01:06:25.817469 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:06:25.837224 kernel: scsi host0: Virtio SCSI HBA
Mar 7 01:06:25.853224 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Mar 7 01:06:25.864216 kernel: cryptd: max_cpu_qlen set to 1000
Mar 7 01:06:25.867348 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:06:25.867446 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:06:25.867907 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:06:25.868220 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:06:25.869339 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:06:25.870879 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:06:25.874210 kernel: libata version 3.00 loaded.
Mar 7 01:06:25.879418 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:06:25.883347 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:06:25.883469 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:06:25.885380 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 7 01:06:25.906607 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:06:25.909143 kernel: ahci 0000:00:1f.2: version 3.0
Mar 7 01:06:25.909334 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 7 01:06:25.912207 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Mar 7 01:06:25.913461 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 7 01:06:25.918398 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 7 01:06:25.919568 kernel: scsi host1: ahci
Mar 7 01:06:25.921359 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 7 01:06:25.921379 kernel: scsi host2: ahci
Mar 7 01:06:25.928233 kernel: scsi host3: ahci
Mar 7 01:06:25.932043 kernel: AES CTR mode by8 optimization enabled
Mar 7 01:06:25.935780 kernel: scsi host4: ahci
Mar 7 01:06:25.935956 kernel: scsi host5: ahci
Mar 7 01:06:25.938242 kernel: scsi host6: ahci
Mar 7 01:06:25.938392 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 48
Mar 7 01:06:25.942579 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 48
Mar 7 01:06:25.946741 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 48
Mar 7 01:06:25.946759 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 48
Mar 7 01:06:25.950983 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 48
Mar 7 01:06:25.951008 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 48
Mar 7 01:06:25.954875 kernel: sd 0:0:0:0: Power-on or device reset occurred
Mar 7 01:06:25.958215 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB)
Mar 7 01:06:25.960335 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 7 01:06:25.960490 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Mar 7 01:06:25.965353 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Mar 7 01:06:25.965535 kernel: ACPI: bus type USB registered
Mar 7 01:06:25.968088 kernel: usbcore: registered new interface driver usbfs
Mar 7 01:06:25.973225 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 7 01:06:25.973273 kernel: usbcore: registered new interface driver hub
Mar 7 01:06:25.973283 kernel: GPT:17805311 != 160006143
Mar 7 01:06:25.973291 kernel: usbcore: registered new device driver usb
Mar 7 01:06:25.973299 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 7 01:06:25.979461 kernel: GPT:17805311 != 160006143
Mar 7 01:06:25.979490 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 7 01:06:25.979500 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:06:25.983637 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:06:25.984477 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 7 01:06:26.271250 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 7 01:06:26.271343 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 7 01:06:26.277327 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 7 01:06:26.277375 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 7 01:06:26.282386 kernel: ata1.00: applying bridge limits
Mar 7 01:06:26.290772 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 7 01:06:26.296071 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 7 01:06:26.296125 kernel: ata1.00: configured for UDMA/100
Mar 7 01:06:26.304803 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 7 01:06:26.310257 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 7 01:06:26.371304 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 7 01:06:26.371580 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Mar 7 01:06:26.371739 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 7 01:06:26.386279 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 7 01:06:26.386467 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 7 01:06:26.386477 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 7 01:06:26.390645 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Mar 7 01:06:26.390842 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Mar 7 01:06:26.398220 kernel: hub 1-0:1.0: USB hub found
Mar 7 01:06:26.398381 kernel: hub 1-0:1.0: 4 ports detected
Mar 7 01:06:26.398500 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 7 01:06:26.402147 kernel: hub 2-0:1.0: USB hub found
Mar 7 01:06:26.402318 kernel: hub 2-0:1.0: 4 ports detected
Mar 7 01:06:26.409214 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (458)
Mar 7 01:06:26.410900 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Mar 7 01:06:26.413286 kernel: BTRFS: device fsid 3bed8db9-42ad-4483-9cc8-1ad17a6cd948 devid 1 transid 34 /dev/sda3 scanned by (udev-worker) (455)
Mar 7 01:06:26.413300 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 7 01:06:26.419282 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Mar 7 01:06:26.426761 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Mar 7 01:06:26.427438 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Mar 7 01:06:26.431253 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 7 01:06:26.436301 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 7 01:06:26.442834 disk-uuid[588]: Primary Header is updated.
Mar 7 01:06:26.442834 disk-uuid[588]: Secondary Entries is updated.
Mar 7 01:06:26.442834 disk-uuid[588]: Secondary Header is updated.
Mar 7 01:06:26.449242 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:06:26.455216 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:06:26.460217 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:06:26.636231 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 7 01:06:26.784257 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 7 01:06:26.794783 kernel: usbcore: registered new interface driver usbhid
Mar 7 01:06:26.794840 kernel: usbhid: USB HID core driver
Mar 7 01:06:26.813855 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2
Mar 7 01:06:26.813907 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 7 01:06:27.468286 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 7 01:06:27.469868 disk-uuid[589]: The operation has completed successfully.
Mar 7 01:06:27.542529 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 7 01:06:27.542619 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 7 01:06:27.551316 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 7 01:06:27.553916 sh[610]: Success
Mar 7 01:06:27.565222 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
Mar 7 01:06:27.597409 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 7 01:06:27.613148 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 7 01:06:27.614053 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 7 01:06:27.628459 kernel: BTRFS info (device dm-0): first mount of filesystem 3bed8db9-42ad-4483-9cc8-1ad17a6cd948
Mar 7 01:06:27.628483 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 7 01:06:27.632989 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 7 01:06:27.632999 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 7 01:06:27.635266 kernel: BTRFS info (device dm-0): using free space tree
Mar 7 01:06:27.645210 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 7 01:06:27.646914 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 7 01:06:27.647730 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 7 01:06:27.652296 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 7 01:06:27.653944 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 7 01:06:27.669536 kernel: BTRFS info (device sda6): first mount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:06:27.669562 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 7 01:06:27.669571 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:06:27.677809 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 01:06:27.677832 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:06:27.688478 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 7 01:06:27.690313 kernel: BTRFS info (device sda6): last unmount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:06:27.702390 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 7 01:06:27.705971 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 7 01:06:27.745061 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:06:27.755126 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 7 01:06:27.773651 ignition[739]: Ignition 2.19.0
Mar 7 01:06:27.773664 ignition[739]: Stage: fetch-offline
Mar 7 01:06:27.775379 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:06:27.773708 ignition[739]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:27.776127 systemd-networkd[791]: lo: Link UP
Mar 7 01:06:27.773717 ignition[739]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:27.776131 systemd-networkd[791]: lo: Gained carrier
Mar 7 01:06:27.773797 ignition[739]: parsed url from cmdline: ""
Mar 7 01:06:27.773801 ignition[739]: no config URL provided
Mar 7 01:06:27.773806 ignition[739]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:06:27.778406 systemd-networkd[791]: Enumeration completed
Mar 7 01:06:27.773813 ignition[739]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:06:27.778467 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 7 01:06:27.773818 ignition[739]: failed to fetch config: resource requires networking
Mar 7 01:06:27.778871 systemd-networkd[791]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:06:27.773975 ignition[739]: Ignition finished successfully
Mar 7 01:06:27.778876 systemd-networkd[791]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 01:06:27.779649 systemd-networkd[791]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:06:27.779653 systemd-networkd[791]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 7 01:06:27.780246 systemd-networkd[791]: eth0: Link UP
Mar 7 01:06:27.780250 systemd-networkd[791]: eth0: Gained carrier
Mar 7 01:06:27.780257 systemd-networkd[791]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:06:27.780436 systemd[1]: Reached target network.target - Network.
Mar 7 01:06:27.784380 systemd-networkd[791]: eth1: Link UP
Mar 7 01:06:27.784384 systemd-networkd[791]: eth1: Gained carrier
Mar 7 01:06:27.784390 systemd-networkd[791]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:06:27.788307 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 01:06:27.798235 ignition[798]: Ignition 2.19.0
Mar 7 01:06:27.798243 ignition[798]: Stage: fetch
Mar 7 01:06:27.798350 ignition[798]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:27.798359 ignition[798]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:27.798413 ignition[798]: parsed url from cmdline: ""
Mar 7 01:06:27.798417 ignition[798]: no config URL provided
Mar 7 01:06:27.798421 ignition[798]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:06:27.798429 ignition[798]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:06:27.798444 ignition[798]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 7 01:06:27.798589 ignition[798]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 7 01:06:27.827230 systemd-networkd[791]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 7 01:06:27.850227 systemd-networkd[791]: eth0: DHCPv4 address 204.168.152.184/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 7 01:06:27.999713 ignition[798]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 7 01:06:28.005167 ignition[798]: GET result: OK
Mar 7 01:06:28.005298 ignition[798]: parsing config with SHA512: 52cbe08d55b85303cc3e1fe93998cc3f1242d8a1ee7503f361499e760f64330cedc309c904e87990d89989b04a26dfd47b28107c6d45d8ba86d71ee041af91fa
Mar 7 01:06:28.010931 unknown[798]: fetched base config from "system"
Mar 7 01:06:28.010949 unknown[798]: fetched base config from "system"
Mar 7 01:06:28.010961 unknown[798]: fetched user config from "hetzner"
Mar 7 01:06:28.011404 ignition[798]: fetch: fetch complete
Mar 7 01:06:28.011415 ignition[798]: fetch: fetch passed
Mar 7 01:06:28.011487 ignition[798]: Ignition finished successfully
Mar 7 01:06:28.015503 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 01:06:28.024436 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 01:06:28.058406 ignition[806]: Ignition 2.19.0
Mar 7 01:06:28.058429 ignition[806]: Stage: kargs
Mar 7 01:06:28.058781 ignition[806]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:28.058803 ignition[806]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:28.061411 ignition[806]: kargs: kargs passed
Mar 7 01:06:28.061497 ignition[806]: Ignition finished successfully
Mar 7 01:06:28.065660 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 01:06:28.073433 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 01:06:28.104229 ignition[812]: Ignition 2.19.0
Mar 7 01:06:28.104249 ignition[812]: Stage: disks
Mar 7 01:06:28.104465 ignition[812]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:28.104482 ignition[812]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:28.105477 ignition[812]: disks: disks passed
Mar 7 01:06:28.105537 ignition[812]: Ignition finished successfully
Mar 7 01:06:28.108495 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 01:06:28.110738 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 01:06:28.112258 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 01:06:28.112903 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 01:06:28.114294 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 01:06:28.115593 systemd[1]: Reached target basic.target - Basic System.
Mar 7 01:06:28.121373 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 01:06:28.155718 systemd-fsck[820]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 7 01:06:28.160813 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 01:06:28.167527 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 01:06:28.255213 kernel: EXT4-fs (sda9): mounted filesystem aab0506b-de72-4dd2-9393-24d7958f49a5 r/w with ordered data mode. Quota mode: none.
Mar 7 01:06:28.256064 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 01:06:28.257937 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 01:06:28.264334 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:06:28.267318 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 01:06:28.271209 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 7 01:06:28.271910 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 01:06:28.272570 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:06:28.279221 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (828)
Mar 7 01:06:28.286751 kernel: BTRFS info (device sda6): first mount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:06:28.286777 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 7 01:06:28.286787 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:06:28.290056 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 01:06:28.294313 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 01:06:28.302257 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 01:06:28.302301 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:06:28.310271 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:06:28.339219 coreos-metadata[830]: Mar 07 01:06:28.339 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 7 01:06:28.340416 coreos-metadata[830]: Mar 07 01:06:28.340 INFO Fetch successful
Mar 7 01:06:28.341624 coreos-metadata[830]: Mar 07 01:06:28.341 INFO wrote hostname ci-4081-3-6-n-593f1c83d2 to /sysroot/etc/hostname
Mar 7 01:06:28.342913 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:06:28.355615 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 01:06:28.361464 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory
Mar 7 01:06:28.368102 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 01:06:28.372380 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 01:06:28.444329 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 01:06:28.449279 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 01:06:28.451342 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 01:06:28.460217 kernel: BTRFS info (device sda6): last unmount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:06:28.475180 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 01:06:28.477571 ignition[944]: INFO : Ignition 2.19.0
Mar 7 01:06:28.478473 ignition[944]: INFO : Stage: mount
Mar 7 01:06:28.479490 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:28.479490 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:28.479490 ignition[944]: INFO : mount: mount passed
Mar 7 01:06:28.480440 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 01:06:28.480528 ignition[944]: INFO : Ignition finished successfully
Mar 7 01:06:28.485279 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 01:06:28.627910 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 01:06:28.635432 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:06:28.662243 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (957)
Mar 7 01:06:28.671450 kernel: BTRFS info (device sda6): first mount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:06:28.671505 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 7 01:06:28.677310 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:06:28.694474 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 01:06:28.694521 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:06:28.699952 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:06:28.731440 ignition[973]: INFO : Ignition 2.19.0
Mar 7 01:06:28.732413 ignition[973]: INFO : Stage: files
Mar 7 01:06:28.733329 ignition[973]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:28.735211 ignition[973]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:28.735211 ignition[973]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 01:06:28.737054 ignition[973]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 01:06:28.737054 ignition[973]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 01:06:28.740894 ignition[973]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 01:06:28.741617 ignition[973]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 01:06:28.742301 unknown[973]: wrote ssh authorized keys file for user: core
Mar 7 01:06:28.742446 ignition[973]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 01:06:28.745069 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 7 01:06:28.746253 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 7 01:06:29.018018 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 01:06:29.263514 systemd-networkd[791]: eth1: Gained IPv6LL
Mar 7 01:06:29.328616 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:06:29.330656 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:06:29.338313 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:06:29.338313 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 7 01:06:29.338313 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 7 01:06:29.338313 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 7 01:06:29.338313 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-x86-64.raw: attempt #1
Mar 7 01:06:29.712015 systemd-networkd[791]: eth0: Gained IPv6LL
Mar 7 01:06:29.766331 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 01:06:30.100291 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Mar 7 01:06:30.100291 ignition[973]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:06:30.104477 ignition[973]: INFO : files: files passed
Mar 7 01:06:30.104477 ignition[973]: INFO : Ignition finished successfully
Mar 7 01:06:30.104932 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 01:06:30.113768 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 01:06:30.119321 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 01:06:30.122735 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 01:06:30.122887 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 01:06:30.132690 initrd-setup-root-after-ignition[1003]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:06:30.133794 initrd-setup-root-after-ignition[1007]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:06:30.134765 initrd-setup-root-after-ignition[1003]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:06:30.136414 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:06:30.137468 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 01:06:30.142334 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 01:06:30.163467 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 01:06:30.163686 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 01:06:30.165500 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 01:06:30.166751 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 01:06:30.167973 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 01:06:30.172373 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 01:06:30.185501 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:06:30.193379 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 01:06:30.202237 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:06:30.203097 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:06:30.203536 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 01:06:30.203946 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 01:06:30.204022 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:06:30.205216 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 01:06:30.206056 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 01:06:30.207088 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 01:06:30.208106 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:06:30.209125 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 01:06:30.210149 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 01:06:30.211157 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:06:30.212221 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 01:06:30.213244 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 01:06:30.214233 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 01:06:30.215222 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 01:06:30.215369 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:06:30.216677 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:06:30.217708 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:06:30.218684 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 01:06:30.218998 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:06:30.220098 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 01:06:30.220255 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:06:30.221547 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 7 01:06:30.221713 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:06:30.222626 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 7 01:06:30.222770 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 7 01:06:30.223775 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 7 01:06:30.223956 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:06:30.231320 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 7 01:06:30.231831 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 01:06:30.231926 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:06:30.237344 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 7 01:06:30.237710 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 7 01:06:30.237786 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:06:30.238184 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 7 01:06:30.238263 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 01:06:30.244325 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 7 01:06:30.244421 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 01:06:30.245900 ignition[1027]: INFO : Ignition 2.19.0
Mar 7 01:06:30.246933 ignition[1027]: INFO : Stage: umount
Mar 7 01:06:30.246933 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:06:30.246933 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:06:30.249071 ignition[1027]: INFO : umount: umount passed
Mar 7 01:06:30.249071 ignition[1027]: INFO : Ignition finished successfully
Mar 7 01:06:30.250457 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 01:06:30.251241 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 01:06:30.252282 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 01:06:30.252360 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 01:06:30.252746 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 01:06:30.252781 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 01:06:30.253125 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 01:06:30.253157 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 01:06:30.255280 systemd[1]: Stopped target network.target - Network.
Mar 7 01:06:30.255617 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 01:06:30.255669 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:06:30.256001 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 01:06:30.256300 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 01:06:30.261241 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:06:30.261565 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 01:06:30.261880 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 01:06:30.262210 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 01:06:30.262247 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:06:30.262561 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 01:06:30.262591 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:06:30.262897 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 01:06:30.262934 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 01:06:30.263271 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 01:06:30.263305 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 01:06:30.264427 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 01:06:30.265179 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 01:06:30.266663 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 01:06:30.267101 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 01:06:30.267179 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 01:06:30.267850 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 01:06:30.267916 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 01:06:30.268237 systemd-networkd[791]: eth0: DHCPv6 lease lost
Mar 7 01:06:30.271977 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 01:06:30.272077 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 01:06:30.274081 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 01:06:30.274143 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:06:30.274238 systemd-networkd[791]: eth1: DHCPv6 lease lost
Mar 7 01:06:30.275485 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 01:06:30.275591 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 01:06:30.276708 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 01:06:30.276764 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:06:30.282282 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 01:06:30.282586 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 01:06:30.282628 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:06:30.282977 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 01:06:30.283009 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:06:30.283347 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 01:06:30.283379 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:06:30.283772 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:06:30.295437 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 01:06:30.295929 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:06:30.296917 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 01:06:30.297010 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 01:06:30.297866 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 01:06:30.297923 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:06:30.298472 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 01:06:30.298504 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:06:30.299129 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 01:06:30.299167 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:06:30.300090 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 01:06:30.300126 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:06:30.301064 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:06:30.301101 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:06:30.311325 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 01:06:30.311910 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 01:06:30.311954 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:06:30.312321 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:06:30.312355 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:06:30.318356 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 01:06:30.318444 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 01:06:30.318970 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 01:06:30.325309 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 01:06:30.331177 systemd[1]: Switching root.
Mar 7 01:06:30.381026 systemd-journald[189]: Journal stopped
Mar 7 01:06:31.385618 systemd-journald[189]: Received SIGTERM from PID 1 (systemd).
Mar 7 01:06:31.385689 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 01:06:31.385705 kernel: SELinux: policy capability open_perms=1
Mar 7 01:06:31.385714 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 01:06:31.385725 kernel: SELinux: policy capability always_check_network=0
Mar 7 01:06:31.385734 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 01:06:31.385742 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 01:06:31.385750 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 01:06:31.385762 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 01:06:31.385770 kernel: audit: type=1403 audit(1772845590.541:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 01:06:31.385780 systemd[1]: Successfully loaded SELinux policy in 61.976ms.
Mar 7 01:06:31.385795 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 9.348ms.
Mar 7 01:06:31.385809 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 01:06:31.385820 systemd[1]: Detected virtualization kvm.
Mar 7 01:06:31.385829 systemd[1]: Detected architecture x86-64.
Mar 7 01:06:31.385838 systemd[1]: Detected first boot.
Mar 7 01:06:31.385850 systemd[1]: Hostname set to .
Mar 7 01:06:31.385861 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 01:06:31.385871 zram_generator::config[1070]: No configuration found.
Mar 7 01:06:31.385881 systemd[1]: Populated /etc with preset unit settings.
Mar 7 01:06:31.385890 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 01:06:31.385899 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 01:06:31.385907 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 01:06:31.385917 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 01:06:31.385926 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 01:06:31.385934 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 01:06:31.385945 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 01:06:31.385954 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 01:06:31.385963 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 01:06:31.385972 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 01:06:31.385981 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 01:06:31.385990 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:06:31.385999 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:06:31.386008 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 01:06:31.386019 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 01:06:31.386027 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 01:06:31.386036 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 01:06:31.386045 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 7 01:06:31.386053 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:06:31.386063 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 01:06:31.386071 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 7 01:06:31.386083 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 7 01:06:31.386091 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 7 01:06:31.386100 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 7 01:06:31.386109 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 7 01:06:31.386118 systemd[1]: Reached target slices.target - Slice Units. Mar 7 01:06:31.386127 systemd[1]: Reached target swap.target - Swaps. Mar 7 01:06:31.386136 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 7 01:06:31.386145 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 7 01:06:31.386157 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 7 01:06:31.386168 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 7 01:06:31.386177 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 7 01:06:31.386186 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 7 01:06:31.387215 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 7 01:06:31.387228 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 7 01:06:31.387237 systemd[1]: Mounting media.mount - External Media Directory... Mar 7 01:06:31.387246 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:31.387255 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 7 01:06:31.387263 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 7 01:06:31.387277 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Mar 7 01:06:31.387286 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 7 01:06:31.387295 systemd[1]: Reached target machines.target - Containers. Mar 7 01:06:31.387304 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 7 01:06:31.387313 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:06:31.387322 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 7 01:06:31.387331 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 7 01:06:31.387339 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 01:06:31.387351 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 7 01:06:31.387359 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 01:06:31.387368 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 7 01:06:31.387377 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 01:06:31.387386 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 7 01:06:31.387394 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 7 01:06:31.387404 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 7 01:06:31.387413 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 7 01:06:31.387424 systemd[1]: Stopped systemd-fsck-usr.service. Mar 7 01:06:31.387433 kernel: fuse: init (API version 7.39) Mar 7 01:06:31.387442 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 7 01:06:31.387451 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 7 01:06:31.387459 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 7 01:06:31.387468 kernel: loop: module loaded Mar 7 01:06:31.387476 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 7 01:06:31.387502 systemd-journald[1150]: Collecting audit messages is disabled. Mar 7 01:06:31.387524 systemd-journald[1150]: Journal started Mar 7 01:06:31.387544 systemd-journald[1150]: Runtime Journal (/run/log/journal/3b9bc323bb6d4d8e8bdb8041b859ffd3) is 8.0M, max 76.3M, 68.3M free. Mar 7 01:06:31.390228 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 7 01:06:31.081141 systemd[1]: Queued start job for default target multi-user.target. Mar 7 01:06:31.099558 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 7 01:06:31.100468 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 7 01:06:31.397481 kernel: ACPI: bus type drm_connector registered Mar 7 01:06:31.397511 systemd[1]: verity-setup.service: Deactivated successfully. Mar 7 01:06:31.399206 systemd[1]: Stopped verity-setup.service. Mar 7 01:06:31.403251 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:31.406212 systemd[1]: Started systemd-journald.service - Journal Service. Mar 7 01:06:31.407257 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 7 01:06:31.407777 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 7 01:06:31.408292 systemd[1]: Mounted media.mount - External Media Directory. Mar 7 01:06:31.408785 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 7 01:06:31.409310 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 7 01:06:31.409802 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Mar 7 01:06:31.410443 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 7 01:06:31.411083 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 01:06:31.411768 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 7 01:06:31.411951 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 7 01:06:31.412630 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 01:06:31.412817 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 01:06:31.413454 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 01:06:31.413631 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 01:06:31.414413 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 01:06:31.414587 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 01:06:31.415337 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 7 01:06:31.415518 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 7 01:06:31.416136 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 01:06:31.416417 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 01:06:31.417032 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 7 01:06:31.417714 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 7 01:06:31.418549 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 7 01:06:31.428811 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 7 01:06:31.434454 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 7 01:06:31.440256 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Mar 7 01:06:31.441255 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 7 01:06:31.441278 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 7 01:06:31.442961 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Mar 7 01:06:31.447333 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 7 01:06:31.449298 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 7 01:06:31.450667 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:06:31.457463 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 7 01:06:31.460169 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 7 01:06:31.461260 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 01:06:31.468472 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 7 01:06:31.469227 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 7 01:06:31.478217 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 7 01:06:31.488971 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 7 01:06:31.492318 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 7 01:06:31.495176 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 7 01:06:31.496255 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 7 01:06:31.497355 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Mar 7 01:06:31.500320 systemd-journald[1150]: Time spent on flushing to /var/log/journal/3b9bc323bb6d4d8e8bdb8041b859ffd3 is 20.727ms for 1179 entries. Mar 7 01:06:31.500320 systemd-journald[1150]: System Journal (/var/log/journal/3b9bc323bb6d4d8e8bdb8041b859ffd3) is 8.0M, max 584.8M, 576.8M free. Mar 7 01:06:31.552389 systemd-journald[1150]: Received client request to flush runtime journal. Mar 7 01:06:31.552420 kernel: loop0: detected capacity change from 0 to 140768 Mar 7 01:06:31.498990 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 7 01:06:31.505320 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 7 01:06:31.514596 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Mar 7 01:06:31.535167 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 01:06:31.538637 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 7 01:06:31.555253 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 7 01:06:31.566176 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 7 01:06:31.567543 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 7 01:06:31.571620 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Mar 7 01:06:31.583417 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 7 01:06:31.600938 udevadm[1199]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 7 01:06:31.610614 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 7 01:06:31.611895 kernel: loop1: detected capacity change from 0 to 217752 Mar 7 01:06:31.629363 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Mar 7 01:06:31.650318 kernel: loop2: detected capacity change from 0 to 8 Mar 7 01:06:31.657218 systemd-tmpfiles[1209]: ACLs are not supported, ignoring. Mar 7 01:06:31.657233 systemd-tmpfiles[1209]: ACLs are not supported, ignoring. Mar 7 01:06:31.662661 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 7 01:06:31.672236 kernel: loop3: detected capacity change from 0 to 142488 Mar 7 01:06:31.705211 kernel: loop4: detected capacity change from 0 to 140768 Mar 7 01:06:31.724222 kernel: loop5: detected capacity change from 0 to 217752 Mar 7 01:06:31.742221 kernel: loop6: detected capacity change from 0 to 8 Mar 7 01:06:31.745207 kernel: loop7: detected capacity change from 0 to 142488 Mar 7 01:06:31.761670 (sd-merge)[1215]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Mar 7 01:06:31.762238 (sd-merge)[1215]: Merged extensions into '/usr'. Mar 7 01:06:31.767398 systemd[1]: Reloading requested from client PID 1190 ('systemd-sysext') (unit systemd-sysext.service)... Mar 7 01:06:31.767409 systemd[1]: Reloading... Mar 7 01:06:31.844222 zram_generator::config[1241]: No configuration found. Mar 7 01:06:31.870225 ldconfig[1185]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 7 01:06:31.950139 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:06:31.986770 systemd[1]: Reloading finished in 218 ms. Mar 7 01:06:32.017050 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 7 01:06:32.017919 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 7 01:06:32.019402 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 7 01:06:32.028315 systemd[1]: Starting ensure-sysext.service... 
Mar 7 01:06:32.031410 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 7 01:06:32.037743 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 7 01:06:32.047266 systemd[1]: Reloading requested from client PID 1285 ('systemctl') (unit ensure-sysext.service)... Mar 7 01:06:32.047276 systemd[1]: Reloading... Mar 7 01:06:32.052711 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 7 01:06:32.052994 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 7 01:06:32.053795 systemd-tmpfiles[1287]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 7 01:06:32.054003 systemd-tmpfiles[1287]: ACLs are not supported, ignoring. Mar 7 01:06:32.054061 systemd-tmpfiles[1287]: ACLs are not supported, ignoring. Mar 7 01:06:32.058017 systemd-tmpfiles[1287]: Detected autofs mount point /boot during canonicalization of boot. Mar 7 01:06:32.058030 systemd-tmpfiles[1287]: Skipping /boot Mar 7 01:06:32.069187 systemd-tmpfiles[1287]: Detected autofs mount point /boot during canonicalization of boot. Mar 7 01:06:32.069212 systemd-tmpfiles[1287]: Skipping /boot Mar 7 01:06:32.079895 systemd-udevd[1288]: Using default interface naming scheme 'v255'. Mar 7 01:06:32.133940 zram_generator::config[1324]: No configuration found. Mar 7 01:06:32.243282 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Mar 7 01:06:32.270505 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Mar 7 01:06:32.276229 kernel: ACPI: button: Power Button [PWRF] Mar 7 01:06:32.307745 kernel: mousedev: PS/2 mouse device common for all mice Mar 7 01:06:32.311675 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 7 01:06:32.311997 systemd[1]: Reloading finished in 264 ms. Mar 7 01:06:32.314208 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (1325) Mar 7 01:06:32.324410 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 7 01:06:32.325144 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 7 01:06:32.342270 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Mar 7 01:06:32.347547 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Mar 7 01:06:32.348142 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:32.353351 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 7 01:06:32.356775 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 7 01:06:32.357367 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:06:32.360316 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 01:06:32.362303 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Mar 7 01:06:32.365997 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Mar 7 01:06:32.366226 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 7 01:06:32.366365 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Mar 7 01:06:32.366509 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 7 01:06:32.365300 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 01:06:32.369778 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:06:32.372309 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 7 01:06:32.380306 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 7 01:06:32.382322 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 01:06:32.385336 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 7 01:06:32.385666 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:32.401342 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 7 01:06:32.414248 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:32.414411 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:06:32.414560 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:06:32.414649 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Mar 7 01:06:32.426830 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:32.427574 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:06:32.435579 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 7 01:06:32.436225 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:06:32.438305 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:06:32.438829 systemd[1]: Finished ensure-sysext.service. Mar 7 01:06:32.456382 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 7 01:06:32.457584 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 01:06:32.458253 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 01:06:32.458942 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 01:06:32.460226 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 01:06:32.469544 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 01:06:32.469698 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 01:06:32.477000 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 01:06:32.477578 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 01:06:32.480175 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 01:06:32.481411 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Mar 7 01:06:32.488774 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Mar 7 01:06:32.488821 kernel: Console: switching to colour dummy device 80x25 Mar 7 01:06:32.488586 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:06:32.492240 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Mar 7 01:06:32.490836 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 7 01:06:32.495838 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 7 01:06:32.495864 kernel: [drm] features: -context_init Mar 7 01:06:32.495147 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 7 01:06:32.503262 kernel: [drm] number of scanouts: 1 Mar 7 01:06:32.503304 kernel: [drm] number of cap sets: 0 Mar 7 01:06:32.505216 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Mar 7 01:06:32.505324 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 7 01:06:32.525890 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 7 01:06:32.530736 augenrules[1436]: No rules Mar 7 01:06:32.532527 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 7 01:06:32.534943 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 01:06:32.535148 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:06:32.537735 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:06:32.541757 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Mar 7 01:06:32.547456 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 7 01:06:32.548938 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Mar 7 01:06:32.549605 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 7 01:06:32.555209 kernel: EDAC MC: Ver: 3.0.0 Mar 7 01:06:32.568824 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 7 01:06:32.569171 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 7 01:06:32.584778 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 7 01:06:32.584835 kernel: Console: switching to colour frame buffer device 160x50 Mar 7 01:06:32.592214 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 7 01:06:32.599390 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 01:06:32.599566 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:06:32.604310 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:06:32.622513 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 7 01:06:32.634399 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 7 01:06:32.656993 systemd-resolved[1410]: Positive Trust Anchors: Mar 7 01:06:32.657003 systemd-resolved[1410]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d 
Mar 7 01:06:32.657025 systemd-resolved[1410]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 01:06:32.658233 lvm[1464]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 01:06:32.660139 systemd-networkd[1407]: lo: Link UP Mar 7 01:06:32.660683 systemd-networkd[1407]: lo: Gained carrier Mar 7 01:06:32.662963 systemd-resolved[1410]: Using system hostname 'ci-4081-3-6-n-593f1c83d2'. Mar 7 01:06:32.664175 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 01:06:32.665462 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 01:06:32.668820 systemd-networkd[1407]: Enumeration completed Mar 7 01:06:32.669294 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 7 01:06:32.670557 systemd[1]: Reached target network.target - Network. Mar 7 01:06:32.672602 systemd-networkd[1407]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:06:32.673141 systemd-networkd[1407]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 01:06:32.675338 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 7 01:06:32.675858 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. 
Mar 7 01:06:32.675944 systemd[1]: Reached target time-set.target - System Time Set. Mar 7 01:06:32.678542 systemd-networkd[1407]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:06:32.678550 systemd-networkd[1407]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 01:06:32.679170 systemd-networkd[1407]: eth0: Link UP Mar 7 01:06:32.679174 systemd-networkd[1407]: eth0: Gained carrier Mar 7 01:06:32.679183 systemd-networkd[1407]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:06:32.684916 systemd-networkd[1407]: eth1: Link UP Mar 7 01:06:32.684922 systemd-networkd[1407]: eth1: Gained carrier Mar 7 01:06:32.684933 systemd-networkd[1407]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:06:32.695263 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 7 01:06:32.695762 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 7 01:06:32.700319 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 7 01:06:32.701173 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:06:32.701787 systemd[1]: Reached target sysinit.target - System Initialization. Mar 7 01:06:32.702013 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 7 01:06:32.702116 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 7 01:06:32.702838 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 7 01:06:32.704031 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. 
Mar 7 01:06:32.705340 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 7 01:06:32.707538 lvm[1471]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 01:06:32.705409 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 7 01:06:32.705430 systemd[1]: Reached target paths.target - Path Units. Mar 7 01:06:32.705474 systemd[1]: Reached target timers.target - Timer Units. Mar 7 01:06:32.706460 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 7 01:06:32.708759 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 7 01:06:32.716266 systemd-networkd[1407]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Mar 7 01:06:32.716817 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 7 01:06:32.716969 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. Mar 7 01:06:32.717950 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 7 01:06:32.718388 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 01:06:32.718719 systemd[1]: Reached target basic.target - Basic System. Mar 7 01:06:32.719056 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 7 01:06:32.719077 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 01:06:32.731580 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 01:06:32.733359 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 7 01:06:32.737324 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 01:06:32.741266 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... 
Mar 7 01:06:32.744260 systemd-networkd[1407]: eth0: DHCPv4 address 204.168.152.184/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 7 01:06:32.744333 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 01:06:32.745819 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 01:06:32.749302 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 7 01:06:32.751095 jq[1479]: false Mar 7 01:06:32.751126 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. Mar 7 01:06:32.751621 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 01:06:32.754061 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 7 01:06:32.759055 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 01:06:32.764361 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 7 01:06:32.779363 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 7 01:06:32.780984 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 7 01:06:32.781401 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 7 01:06:32.781795 coreos-metadata[1477]: Mar 07 01:06:32.781 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 7 01:06:32.783323 systemd[1]: Starting update-engine.service - Update Engine... Mar 7 01:06:32.787302 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 7 01:06:32.788855 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. 
Mar 7 01:06:32.791317 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 7 01:06:32.791562 coreos-metadata[1477]: Mar 07 01:06:32.791 INFO Fetch successful
Mar 7 01:06:32.791562 coreos-metadata[1477]: Mar 07 01:06:32.791 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 7 01:06:32.791472 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 7 01:06:32.791763 systemd[1]: motdgen.service: Deactivated successfully.
Mar 7 01:06:32.793559 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 7 01:06:32.805853 coreos-metadata[1477]: Mar 07 01:06:32.804 INFO Fetch successful
Mar 7 01:06:32.815425 jq[1498]: true
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found loop4
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found loop5
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found loop6
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found loop7
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found sda
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found sda1
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found sda2
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found sda3
Mar 7 01:06:32.816973 extend-filesystems[1482]: Found usr
Mar 7 01:06:32.851669 extend-filesystems[1482]: Found sda4
Mar 7 01:06:32.851669 extend-filesystems[1482]: Found sda6
Mar 7 01:06:32.851669 extend-filesystems[1482]: Found sda7
Mar 7 01:06:32.851669 extend-filesystems[1482]: Found sda9
Mar 7 01:06:32.851669 extend-filesystems[1482]: Checking size of /dev/sda9
Mar 7 01:06:32.835064 dbus-daemon[1478]: [system] SELinux support is enabled
Mar 7 01:06:32.828501 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 7 01:06:32.828702 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 7 01:06:32.838109 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 7 01:06:32.865381 tar[1500]: linux-amd64/LICENSE
Mar 7 01:06:32.865381 tar[1500]: linux-amd64/helm
Mar 7 01:06:32.842158 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 7 01:06:32.842207 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 7 01:06:32.849017 (ntainerd)[1511]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 7 01:06:32.879257 jq[1509]: true
Mar 7 01:06:32.852337 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 7 01:06:32.852356 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 7 01:06:32.891839 update_engine[1496]: I20260307 01:06:32.885984 1496 main.cc:92] Flatcar Update Engine starting
Mar 7 01:06:32.892047 extend-filesystems[1482]: Resized partition /dev/sda9
Mar 7 01:06:32.898216 extend-filesystems[1527]: resize2fs 1.47.1 (20-May-2024)
Mar 7 01:06:32.910221 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks
Mar 7 01:06:32.915503 systemd[1]: Started update-engine.service - Update Engine.
Mar 7 01:06:32.917783 update_engine[1496]: I20260307 01:06:32.915721 1496 update_check_scheduler.cc:74] Next update check in 9m44s
Mar 7 01:06:32.922755 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 7 01:06:32.936377 systemd-logind[1489]: New seat seat0.
Mar 7 01:06:32.943089 systemd-logind[1489]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 7 01:06:32.943319 systemd-logind[1489]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 7 01:06:32.943579 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 7 01:06:32.960979 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 7 01:06:32.964047 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 7 01:06:33.012245 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (1332)
Mar 7 01:06:33.042102 bash[1546]: Updated "/home/core/.ssh/authorized_keys"
Mar 7 01:06:33.051436 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 7 01:06:33.073369 systemd[1]: Starting sshkeys.service...
Mar 7 01:06:33.102127 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 7 01:06:33.112183 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 7 01:06:33.145559 coreos-metadata[1555]: Mar 07 01:06:33.145 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 7 01:06:33.147071 containerd[1511]: time="2026-03-07T01:06:33.147006214Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Mar 7 01:06:33.147543 coreos-metadata[1555]: Mar 07 01:06:33.147 INFO Fetch successful
Mar 7 01:06:33.153326 unknown[1555]: wrote ssh authorized keys file for user: core
Mar 7 01:06:33.174744 locksmithd[1533]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 7 01:06:33.184967 containerd[1511]: time="2026-03-07T01:06:33.184935613Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.187020 containerd[1511]: time="2026-03-07T01:06:33.186998426Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:06:33.187070 containerd[1511]: time="2026-03-07T01:06:33.187061150Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 7 01:06:33.187103 containerd[1511]: time="2026-03-07T01:06:33.187095261Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 7 01:06:33.189824 containerd[1511]: time="2026-03-07T01:06:33.189809832Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 7 01:06:33.189882 containerd[1511]: time="2026-03-07T01:06:33.189872566Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.189967 containerd[1511]: time="2026-03-07T01:06:33.189955811Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:06:33.190002 containerd[1511]: time="2026-03-07T01:06:33.189994229Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.190256 containerd[1511]: time="2026-03-07T01:06:33.190242291Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:06:33.190305 containerd[1511]: time="2026-03-07T01:06:33.190296482Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.190336 containerd[1511]: time="2026-03-07T01:06:33.190328600Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:06:33.190620 containerd[1511]: time="2026-03-07T01:06:33.190358685Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.190620 containerd[1511]: time="2026-03-07T01:06:33.190433848Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.193865 containerd[1511]: time="2026-03-07T01:06:33.193844653Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:06:33.193979 containerd[1511]: time="2026-03-07T01:06:33.193962950Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:06:33.193997 containerd[1511]: time="2026-03-07T01:06:33.193977352Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 7 01:06:33.194070 containerd[1511]: time="2026-03-07T01:06:33.194056620Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 7 01:06:33.194111 containerd[1511]: time="2026-03-07T01:06:33.194099465Z" level=info msg="metadata content store policy set" policy=shared
Mar 7 01:06:33.209613 update-ssh-keys[1565]: Updated "/home/core/.ssh/authorized_keys"
Mar 7 01:06:33.210288 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 7 01:06:33.213971 systemd[1]: Finished sshkeys.service.
Mar 7 01:06:33.217653 containerd[1511]: time="2026-03-07T01:06:33.217623761Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 7 01:06:33.217755 containerd[1511]: time="2026-03-07T01:06:33.217744442Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 7 01:06:33.217844 containerd[1511]: time="2026-03-07T01:06:33.217835429Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 7 01:06:33.218044 containerd[1511]: time="2026-03-07T01:06:33.217891042Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 7 01:06:33.218044 containerd[1511]: time="2026-03-07T01:06:33.217905444Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 7 01:06:33.218231 containerd[1511]: time="2026-03-07T01:06:33.218220366Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218505454Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218600606Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218612484Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218622870Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218644462Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218654146Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218662639Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.218931 containerd[1511]: time="2026-03-07T01:06:33.218671933Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.219660 containerd[1511]: time="2026-03-07T01:06:33.219646044Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.219751 containerd[1511]: time="2026-03-07T01:06:33.219741447Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.219934186Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.219948407Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.219967786Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.219980185Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.219990140Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220003260Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220015197Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220027155Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220038032Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220065944Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220077110Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220120425Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220133765Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220249 containerd[1511]: time="2026-03-07T01:06:33.220144652Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220437 containerd[1511]: time="2026-03-07T01:06:33.220154947Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.220437 containerd[1511]: time="2026-03-07T01:06:33.220169379Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 7 01:06:33.220437 containerd[1511]: time="2026-03-07T01:06:33.220187326Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220860345Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220875658Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220914576Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220930310Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220940455Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220950290Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220959504Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220968757Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220982809Z" level=info msg="NRI interface is disabled by configuration."
Mar 7 01:06:33.221760 containerd[1511]: time="2026-03-07T01:06:33.220993034Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 7 01:06:33.236352 kernel: EXT4-fs (sda9): resized filesystem to 19393531
Mar 7 01:06:33.236387 containerd[1511]: time="2026-03-07T01:06:33.222103098Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 7 01:06:33.236387 containerd[1511]: time="2026-03-07T01:06:33.222153774Z" level=info msg="Connect containerd service"
Mar 7 01:06:33.236387 containerd[1511]: time="2026-03-07T01:06:33.222189137Z" level=info msg="using legacy CRI server"
Mar 7 01:06:33.236387 containerd[1511]: time="2026-03-07T01:06:33.222219693Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 7 01:06:33.236387 containerd[1511]: time="2026-03-07T01:06:33.222291221Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 7 01:06:33.236887 containerd[1511]: time="2026-03-07T01:06:33.236838475Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 7 01:06:33.237937 containerd[1511]: time="2026-03-07T01:06:33.237651565Z" level=info msg="Start subscribing containerd event"
Mar 7 01:06:33.238000 containerd[1511]: time="2026-03-07T01:06:33.237980738Z" level=info msg="Start recovering state"
Mar 7 01:06:33.238275 containerd[1511]: time="2026-03-07T01:06:33.238264284Z" level=info msg="Start event monitor"
Mar 7 01:06:33.241301 extend-filesystems[1527]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 7 01:06:33.241301 extend-filesystems[1527]: old_desc_blocks = 1, new_desc_blocks = 10
Mar 7 01:06:33.241301 extend-filesystems[1527]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long.
Mar 7 01:06:33.240418 systemd[1]: Started containerd.service - containerd container runtime.
Mar 7 01:06:33.246053 containerd[1511]: time="2026-03-07T01:06:33.238709411Z" level=info msg="Start snapshots syncer"
Mar 7 01:06:33.246053 containerd[1511]: time="2026-03-07T01:06:33.238721199Z" level=info msg="Start cni network conf syncer for default"
Mar 7 01:06:33.246053 containerd[1511]: time="2026-03-07T01:06:33.238727769Z" level=info msg="Start streaming server"
Mar 7 01:06:33.246053 containerd[1511]: time="2026-03-07T01:06:33.238468910Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 7 01:06:33.246053 containerd[1511]: time="2026-03-07T01:06:33.239792736Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 7 01:06:33.246053 containerd[1511]: time="2026-03-07T01:06:33.239831914Z" level=info msg="containerd successfully booted in 0.095496s"
Mar 7 01:06:33.246141 extend-filesystems[1482]: Resized filesystem in /dev/sda9
Mar 7 01:06:33.246141 extend-filesystems[1482]: Found sr0
Mar 7 01:06:33.242893 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 7 01:06:33.243728 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 7 01:06:33.273109 sshd_keygen[1517]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 7 01:06:33.293452 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 7 01:06:33.303884 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 7 01:06:33.312589 systemd[1]: issuegen.service: Deactivated successfully.
Mar 7 01:06:33.312887 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 7 01:06:33.321242 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 7 01:06:33.332565 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 7 01:06:33.340106 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 7 01:06:33.343441 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 7 01:06:33.345623 systemd[1]: Reached target getty.target - Login Prompts.
Mar 7 01:06:33.509415 tar[1500]: linux-amd64/README.md
Mar 7 01:06:33.519898 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 7 01:06:34.191571 systemd-networkd[1407]: eth1: Gained IPv6LL
Mar 7 01:06:34.192583 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection.
Mar 7 01:06:34.198330 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 7 01:06:34.200070 systemd[1]: Reached target network-online.target - Network is Online.
Mar 7 01:06:34.210467 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:06:34.224608 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 7 01:06:34.263026 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 7 01:06:34.703382 systemd-networkd[1407]: eth0: Gained IPv6LL
Mar 7 01:06:34.703850 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection.
Mar 7 01:06:34.923828 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 7 01:06:34.935307 systemd[1]: Started sshd@0-204.168.152.184:22-4.153.228.146:52826.service - OpenSSH per-connection server daemon (4.153.228.146:52826).
Mar 7 01:06:35.158446 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:06:35.159373 (kubelet)[1612]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 01:06:35.161090 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 7 01:06:35.164164 systemd[1]: Startup finished in 1.518s (kernel) + 5.778s (initrd) + 4.684s (userspace) = 11.981s.
Mar 7 01:06:35.602584 kubelet[1612]: E0307 01:06:35.602408 1612 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 01:06:35.605462 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 01:06:35.605644 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 01:06:35.678606 sshd[1605]: Accepted publickey for core from 4.153.228.146 port 52826 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:06:35.680893 sshd[1605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:06:35.696740 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 7 01:06:35.704924 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 7 01:06:35.709302 systemd-logind[1489]: New session 1 of user core.
Mar 7 01:06:35.742631 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 7 01:06:35.749459 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 7 01:06:35.763691 (systemd)[1625]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 7 01:06:35.872312 systemd[1625]: Queued start job for default target default.target.
Mar 7 01:06:35.883222 systemd[1625]: Created slice app.slice - User Application Slice.
Mar 7 01:06:35.883242 systemd[1625]: Reached target paths.target - Paths.
Mar 7 01:06:35.883253 systemd[1625]: Reached target timers.target - Timers.
Mar 7 01:06:35.884628 systemd[1625]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 7 01:06:35.905129 systemd[1625]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 7 01:06:35.905246 systemd[1625]: Reached target sockets.target - Sockets.
Mar 7 01:06:35.905258 systemd[1625]: Reached target basic.target - Basic System.
Mar 7 01:06:35.905292 systemd[1625]: Reached target default.target - Main User Target.
Mar 7 01:06:35.905323 systemd[1625]: Startup finished in 130ms.
Mar 7 01:06:35.905603 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 7 01:06:35.918309 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 7 01:06:36.462632 systemd[1]: Started sshd@1-204.168.152.184:22-4.153.228.146:52842.service - OpenSSH per-connection server daemon (4.153.228.146:52842).
Mar 7 01:06:37.211775 sshd[1636]: Accepted publickey for core from 4.153.228.146 port 52842 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:06:37.214707 sshd[1636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:06:37.223447 systemd-logind[1489]: New session 2 of user core.
Mar 7 01:06:37.233422 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 7 01:06:37.742917 sshd[1636]: pam_unix(sshd:session): session closed for user core
Mar 7 01:06:37.747282 systemd[1]: sshd@1-204.168.152.184:22-4.153.228.146:52842.service: Deactivated successfully.
Mar 7 01:06:37.750096 systemd[1]: session-2.scope: Deactivated successfully.
Mar 7 01:06:37.752013 systemd-logind[1489]: Session 2 logged out. Waiting for processes to exit.
Mar 7 01:06:37.753685 systemd-logind[1489]: Removed session 2.
Mar 7 01:06:37.879630 systemd[1]: Started sshd@2-204.168.152.184:22-4.153.228.146:52846.service - OpenSSH per-connection server daemon (4.153.228.146:52846).
Mar 7 01:06:38.637257 sshd[1643]: Accepted publickey for core from 4.153.228.146 port 52846 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:06:38.638452 sshd[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:06:38.646476 systemd-logind[1489]: New session 3 of user core.
Mar 7 01:06:38.654439 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 7 01:06:39.156850 sshd[1643]: pam_unix(sshd:session): session closed for user core
Mar 7 01:06:39.163301 systemd[1]: sshd@2-204.168.152.184:22-4.153.228.146:52846.service: Deactivated successfully.
Mar 7 01:06:39.166800 systemd[1]: session-3.scope: Deactivated successfully.
Mar 7 01:06:39.167824 systemd-logind[1489]: Session 3 logged out. Waiting for processes to exit.
Mar 7 01:06:39.169868 systemd-logind[1489]: Removed session 3.
Mar 7 01:06:39.298679 systemd[1]: Started sshd@3-204.168.152.184:22-4.153.228.146:45258.service - OpenSSH per-connection server daemon (4.153.228.146:45258).
Mar 7 01:06:40.058658 sshd[1650]: Accepted publickey for core from 4.153.228.146 port 45258 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:06:40.061446 sshd[1650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:06:40.070416 systemd-logind[1489]: New session 4 of user core.
Mar 7 01:06:40.081480 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 7 01:06:40.589924 sshd[1650]: pam_unix(sshd:session): session closed for user core
Mar 7 01:06:40.596050 systemd-logind[1489]: Session 4 logged out. Waiting for processes to exit.
Mar 7 01:06:40.597676 systemd[1]: sshd@3-204.168.152.184:22-4.153.228.146:45258.service: Deactivated successfully.
Mar 7 01:06:40.601316 systemd[1]: session-4.scope: Deactivated successfully.
Mar 7 01:06:40.602742 systemd-logind[1489]: Removed session 4.
Mar 7 01:06:40.727578 systemd[1]: Started sshd@4-204.168.152.184:22-4.153.228.146:45270.service - OpenSSH per-connection server daemon (4.153.228.146:45270).
Mar 7 01:06:41.464602 sshd[1657]: Accepted publickey for core from 4.153.228.146 port 45270 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:06:41.467334 sshd[1657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:06:41.475710 systemd-logind[1489]: New session 5 of user core.
Mar 7 01:06:41.485448 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 7 01:06:41.882830 sudo[1660]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 7 01:06:41.883548 sudo[1660]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 01:06:41.902252 sudo[1660]: pam_unix(sudo:session): session closed for user root
Mar 7 01:06:42.022512 sshd[1657]: pam_unix(sshd:session): session closed for user core
Mar 7 01:06:42.028094 systemd[1]: sshd@4-204.168.152.184:22-4.153.228.146:45270.service: Deactivated successfully.
Mar 7 01:06:42.032104 systemd[1]: session-5.scope: Deactivated successfully.
Mar 7 01:06:42.034730 systemd-logind[1489]: Session 5 logged out. Waiting for processes to exit.
Mar 7 01:06:42.036623 systemd-logind[1489]: Removed session 5.
Mar 7 01:06:42.160565 systemd[1]: Started sshd@5-204.168.152.184:22-4.153.228.146:45278.service - OpenSSH per-connection server daemon (4.153.228.146:45278).
Mar 7 01:06:42.924443 sshd[1665]: Accepted publickey for core from 4.153.228.146 port 45278 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:06:42.927285 sshd[1665]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:06:42.935031 systemd-logind[1489]: New session 6 of user core.
Mar 7 01:06:42.942422 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 7 01:06:43.337187 sudo[1669]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 7 01:06:43.337969 sudo[1669]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:06:43.344714 sudo[1669]: pam_unix(sudo:session): session closed for user root Mar 7 01:06:43.356100 sudo[1668]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 7 01:06:43.356806 sudo[1668]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:06:43.386536 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Mar 7 01:06:43.390442 auditctl[1672]: No rules Mar 7 01:06:43.391285 systemd[1]: audit-rules.service: Deactivated successfully. Mar 7 01:06:43.391664 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Mar 7 01:06:43.400710 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 7 01:06:43.456392 augenrules[1690]: No rules Mar 7 01:06:43.459102 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Mar 7 01:06:43.462296 sudo[1668]: pam_unix(sudo:session): session closed for user root Mar 7 01:06:43.583925 sshd[1665]: pam_unix(sshd:session): session closed for user core Mar 7 01:06:43.590323 systemd[1]: sshd@5-204.168.152.184:22-4.153.228.146:45278.service: Deactivated successfully. Mar 7 01:06:43.593832 systemd[1]: session-6.scope: Deactivated successfully. Mar 7 01:06:43.596279 systemd-logind[1489]: Session 6 logged out. Waiting for processes to exit. Mar 7 01:06:43.597928 systemd-logind[1489]: Removed session 6. Mar 7 01:06:43.723130 systemd[1]: Started sshd@6-204.168.152.184:22-4.153.228.146:45294.service - OpenSSH per-connection server daemon (4.153.228.146:45294). 
Mar 7 01:06:44.465781 sshd[1698]: Accepted publickey for core from 4.153.228.146 port 45294 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:06:44.468661 sshd[1698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:06:44.477278 systemd-logind[1489]: New session 7 of user core. Mar 7 01:06:44.487420 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 7 01:06:44.873942 sudo[1701]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 7 01:06:44.874736 sudo[1701]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 7 01:06:45.199605 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 7 01:06:45.199624 (dockerd)[1717]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 7 01:06:45.546658 dockerd[1717]: time="2026-03-07T01:06:45.546384252Z" level=info msg="Starting up" Mar 7 01:06:45.638001 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 7 01:06:45.644583 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:06:45.645706 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport332051867-merged.mount: Deactivated successfully. Mar 7 01:06:45.673571 dockerd[1717]: time="2026-03-07T01:06:45.673075947Z" level=info msg="Loading containers: start." Mar 7 01:06:45.793220 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:06:45.798169 (kubelet)[1787]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 01:06:45.802683 kernel: Initializing XFRM netlink socket Mar 7 01:06:45.822363 systemd-timesyncd[1419]: Network configuration changed, trying to establish connection. 
Mar 7 01:06:45.848781 kubelet[1787]: E0307 01:06:45.848753 1787 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 01:06:45.853399 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 01:06:45.853567 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 01:06:45.877905 systemd-networkd[1407]: docker0: Link UP Mar 7 01:06:45.897524 dockerd[1717]: time="2026-03-07T01:06:45.897490826Z" level=info msg="Loading containers: done." Mar 7 01:06:45.913746 dockerd[1717]: time="2026-03-07T01:06:45.913714324Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 7 01:06:45.913849 dockerd[1717]: time="2026-03-07T01:06:45.913777589Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Mar 7 01:06:45.913884 dockerd[1717]: time="2026-03-07T01:06:45.913865771Z" level=info msg="Daemon has completed initialization" Mar 7 01:06:45.916805 systemd-timesyncd[1419]: Contacted time server 79.133.44.137:123 (2.flatcar.pool.ntp.org). Mar 7 01:06:45.916860 systemd-timesyncd[1419]: Initial clock synchronization to Sat 2026-03-07 01:06:45.977750 UTC. Mar 7 01:06:45.940726 dockerd[1717]: time="2026-03-07T01:06:45.940686231Z" level=info msg="API listen on /run/docker.sock" Mar 7 01:06:45.940936 systemd[1]: Started docker.service - Docker Application Container Engine. 
Mar 7 01:06:46.417748 containerd[1511]: time="2026-03-07T01:06:46.417655871Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\"" Mar 7 01:06:47.018448 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount63536791.mount: Deactivated successfully. Mar 7 01:06:48.101792 containerd[1511]: time="2026-03-07T01:06:48.101738701Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:48.103243 containerd[1511]: time="2026-03-07T01:06:48.103188321Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=27696567" Mar 7 01:06:48.105221 containerd[1511]: time="2026-03-07T01:06:48.104938648Z" level=info msg="ImageCreate event name:\"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:48.107983 containerd[1511]: time="2026-03-07T01:06:48.107965551Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:48.108725 containerd[1511]: time="2026-03-07T01:06:48.108696430Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"27693066\" in 1.691008127s" Mar 7 01:06:48.108760 containerd[1511]: time="2026-03-07T01:06:48.108728424Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\"" Mar 7 01:06:48.109139 containerd[1511]: time="2026-03-07T01:06:48.109118534Z" 
level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\"" Mar 7 01:06:49.216865 containerd[1511]: time="2026-03-07T01:06:49.216810019Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:49.218067 containerd[1511]: time="2026-03-07T01:06:49.217859327Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=21450722" Mar 7 01:06:49.219113 containerd[1511]: time="2026-03-07T01:06:49.219069615Z" level=info msg="ImageCreate event name:\"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:49.221125 containerd[1511]: time="2026-03-07T01:06:49.221107259Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:49.221889 containerd[1511]: time="2026-03-07T01:06:49.221777542Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"23142311\" in 1.112635965s" Mar 7 01:06:49.221889 containerd[1511]: time="2026-03-07T01:06:49.221805003Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\"" Mar 7 01:06:49.222570 containerd[1511]: time="2026-03-07T01:06:49.222542700Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\"" Mar 7 01:06:50.415295 containerd[1511]: 
time="2026-03-07T01:06:50.415231041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:50.416451 containerd[1511]: time="2026-03-07T01:06:50.416301118Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=15548451" Mar 7 01:06:50.417640 containerd[1511]: time="2026-03-07T01:06:50.417354888Z" level=info msg="ImageCreate event name:\"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:50.419544 containerd[1511]: time="2026-03-07T01:06:50.419513977Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:50.420276 containerd[1511]: time="2026-03-07T01:06:50.420244152Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"17240058\" in 1.197679028s" Mar 7 01:06:50.420342 containerd[1511]: time="2026-03-07T01:06:50.420330375Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\"" Mar 7 01:06:50.420867 containerd[1511]: time="2026-03-07T01:06:50.420842462Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\"" Mar 7 01:06:51.538179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2293722607.mount: Deactivated successfully. 
Mar 7 01:06:51.749333 containerd[1511]: time="2026-03-07T01:06:51.749270275Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:51.750578 containerd[1511]: time="2026-03-07T01:06:51.750430989Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=25685340" Mar 7 01:06:51.752244 containerd[1511]: time="2026-03-07T01:06:51.751419001Z" level=info msg="ImageCreate event name:\"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:51.753311 containerd[1511]: time="2026-03-07T01:06:51.753181060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:51.754013 containerd[1511]: time="2026-03-07T01:06:51.753608033Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"25684331\" in 1.332738936s" Mar 7 01:06:51.754013 containerd[1511]: time="2026-03-07T01:06:51.753644983Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\"" Mar 7 01:06:51.754254 containerd[1511]: time="2026-03-07T01:06:51.754234564Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\"" Mar 7 01:06:52.311441 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3131978520.mount: Deactivated successfully. 
Mar 7 01:06:53.711642 containerd[1511]: time="2026-03-07T01:06:53.710895874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:53.712521 containerd[1511]: time="2026-03-07T01:06:53.712497067Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=23556642" Mar 7 01:06:53.713628 containerd[1511]: time="2026-03-07T01:06:53.713613384Z" level=info msg="ImageCreate event name:\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:53.715744 containerd[1511]: time="2026-03-07T01:06:53.715719442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:53.716512 containerd[1511]: time="2026-03-07T01:06:53.716489557Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"23553139\" in 1.962191636s" Mar 7 01:06:53.716512 containerd[1511]: time="2026-03-07T01:06:53.716511777Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\"" Mar 7 01:06:53.716939 containerd[1511]: time="2026-03-07T01:06:53.716924929Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 7 01:06:54.185231 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2007861185.mount: Deactivated successfully. 
Mar 7 01:06:54.195225 containerd[1511]: time="2026-03-07T01:06:54.195124337Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:54.196666 containerd[1511]: time="2026-03-07T01:06:54.196589355Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321240" Mar 7 01:06:54.197751 containerd[1511]: time="2026-03-07T01:06:54.197667178Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:54.202248 containerd[1511]: time="2026-03-07T01:06:54.201987069Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:54.204010 containerd[1511]: time="2026-03-07T01:06:54.203183209Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 486.191132ms" Mar 7 01:06:54.204010 containerd[1511]: time="2026-03-07T01:06:54.203271753Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Mar 7 01:06:54.204312 containerd[1511]: time="2026-03-07T01:06:54.204047815Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\"" Mar 7 01:06:54.754838 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2902293597.mount: Deactivated successfully. 
Mar 7 01:06:55.435256 containerd[1511]: time="2026-03-07T01:06:55.435188001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:55.436460 containerd[1511]: time="2026-03-07T01:06:55.436292404Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=23630398" Mar 7 01:06:55.438232 containerd[1511]: time="2026-03-07T01:06:55.437759531Z" level=info msg="ImageCreate event name:\"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:55.440554 containerd[1511]: time="2026-03-07T01:06:55.440534401Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:06:55.441279 containerd[1511]: time="2026-03-07T01:06:55.441255522Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"23641797\" in 1.237179352s" Mar 7 01:06:55.441341 containerd[1511]: time="2026-03-07T01:06:55.441330416Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\"" Mar 7 01:06:56.104356 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 7 01:06:56.114549 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:06:56.241329 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 7 01:06:56.244591 (kubelet)[2091]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 7 01:06:56.270583 kubelet[2091]: E0307 01:06:56.270542 2091 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 7 01:06:56.273064 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 7 01:06:56.273238 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 7 01:06:56.554084 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:06:56.567627 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:06:56.603632 systemd[1]: Reloading requested from client PID 2105 ('systemctl') (unit session-7.scope)... Mar 7 01:06:56.603748 systemd[1]: Reloading... Mar 7 01:06:56.709746 zram_generator::config[2145]: No configuration found. Mar 7 01:06:56.812678 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:06:56.877719 systemd[1]: Reloading finished in 273 ms. Mar 7 01:06:56.925166 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 7 01:06:56.925497 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 7 01:06:56.925727 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:06:56.938030 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:06:57.084084 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 7 01:06:57.093801 (kubelet)[2198]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 01:06:57.124695 kubelet[2198]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 01:06:57.356623 kubelet[2198]: I0307 01:06:57.356002 2198 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 7 01:06:57.356623 kubelet[2198]: I0307 01:06:57.356059 2198 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 01:06:57.356623 kubelet[2198]: I0307 01:06:57.356090 2198 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 7 01:06:57.356623 kubelet[2198]: I0307 01:06:57.356110 2198 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 7 01:06:57.356623 kubelet[2198]: I0307 01:06:57.356551 2198 server.go:951] "Client rotation is on, will bootstrap in background" Mar 7 01:06:57.365350 kubelet[2198]: E0307 01:06:57.365310 2198 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://204.168.152.184:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 204.168.152.184:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 7 01:06:57.366072 kubelet[2198]: I0307 01:06:57.365717 2198 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 01:06:57.373889 kubelet[2198]: E0307 01:06:57.373837 2198 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 01:06:57.373937 kubelet[2198]: I0307 
01:06:57.373922 2198 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Mar 7 01:06:57.381017 kubelet[2198]: I0307 01:06:57.380993 2198 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Mar 7 01:06:57.383258 kubelet[2198]: I0307 01:06:57.383176 2198 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 01:06:57.383473 kubelet[2198]: I0307 01:06:57.383258 2198 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-593f1c83d2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"Po
dPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 01:06:57.383539 kubelet[2198]: I0307 01:06:57.383482 2198 topology_manager.go:143] "Creating topology manager with none policy" Mar 7 01:06:57.383539 kubelet[2198]: I0307 01:06:57.383497 2198 container_manager_linux.go:308] "Creating device plugin manager" Mar 7 01:06:57.383654 kubelet[2198]: I0307 01:06:57.383632 2198 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 7 01:06:57.386090 kubelet[2198]: I0307 01:06:57.386066 2198 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 7 01:06:57.386422 kubelet[2198]: I0307 01:06:57.386404 2198 kubelet.go:482] "Attempting to sync node with API server" Mar 7 01:06:57.386447 kubelet[2198]: I0307 01:06:57.386427 2198 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 01:06:57.386476 kubelet[2198]: I0307 01:06:57.386467 2198 kubelet.go:394] "Adding apiserver pod source" Mar 7 01:06:57.386493 kubelet[2198]: I0307 01:06:57.386481 2198 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 01:06:57.390980 kubelet[2198]: I0307 01:06:57.390578 2198 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 01:06:57.394253 kubelet[2198]: I0307 01:06:57.394226 2198 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 01:06:57.394294 kubelet[2198]: I0307 01:06:57.394276 2198 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 7 01:06:57.394415 kubelet[2198]: W0307 01:06:57.394395 2198 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does 
not exist. Recreating. Mar 7 01:06:57.399324 kubelet[2198]: I0307 01:06:57.399257 2198 server.go:1257] "Started kubelet" Mar 7 01:06:57.402266 kubelet[2198]: I0307 01:06:57.401877 2198 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 7 01:06:57.408391 kubelet[2198]: E0307 01:06:57.404657 2198 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://204.168.152.184:6443/api/v1/namespaces/default/events\": dial tcp 204.168.152.184:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-593f1c83d2.189a69b232af587d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-593f1c83d2,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-593f1c83d2,},FirstTimestamp:2026-03-07 01:06:57.399158909 +0000 UTC m=+0.301828472,LastTimestamp:2026-03-07 01:06:57.399158909 +0000 UTC m=+0.301828472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-593f1c83d2,}" Mar 7 01:06:57.408713 kubelet[2198]: I0307 01:06:57.408681 2198 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 01:06:57.411454 kubelet[2198]: I0307 01:06:57.411442 2198 server.go:317] "Adding debug handlers to kubelet server" Mar 7 01:06:57.414458 kubelet[2198]: I0307 01:06:57.414425 2198 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 01:06:57.414812 kubelet[2198]: I0307 01:06:57.414799 2198 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 7 01:06:57.414988 kubelet[2198]: I0307 01:06:57.414979 2198 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 01:06:57.415183 kubelet[2198]: I0307 01:06:57.415172 2198 
dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 01:06:57.418331 kubelet[2198]: I0307 01:06:57.418210 2198 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 7 01:06:57.418451 kubelet[2198]: I0307 01:06:57.418443 2198 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 7 01:06:57.418511 kubelet[2198]: I0307 01:06:57.418505 2198 reconciler.go:29] "Reconciler: start to sync state" Mar 7 01:06:57.418974 kubelet[2198]: I0307 01:06:57.418964 2198 factory.go:223] Registration of the systemd container factory successfully Mar 7 01:06:57.419085 kubelet[2198]: I0307 01:06:57.419074 2198 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 01:06:57.419642 kubelet[2198]: E0307 01:06:57.419407 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" Mar 7 01:06:57.420103 kubelet[2198]: E0307 01:06:57.420050 2198 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://204.168.152.184:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-593f1c83d2?timeout=10s\": dial tcp 204.168.152.184:6443: connect: connection refused" interval="200ms" Mar 7 01:06:57.421486 kubelet[2198]: E0307 01:06:57.421473 2198 kubelet.go:1656] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 01:06:57.421708 kubelet[2198]: I0307 01:06:57.421622 2198 factory.go:223] Registration of the containerd container factory successfully Mar 7 01:06:57.438504 kubelet[2198]: I0307 01:06:57.438321 2198 cpu_manager.go:225] "Starting" policy="none" Mar 7 01:06:57.438504 kubelet[2198]: I0307 01:06:57.438331 2198 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 7 01:06:57.438504 kubelet[2198]: I0307 01:06:57.438344 2198 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 7 01:06:57.440272 kubelet[2198]: I0307 01:06:57.440080 2198 policy_none.go:50] "Start" Mar 7 01:06:57.440272 kubelet[2198]: I0307 01:06:57.440094 2198 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 7 01:06:57.440272 kubelet[2198]: I0307 01:06:57.440104 2198 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 01:06:57.441835 kubelet[2198]: I0307 01:06:57.441294 2198 policy_none.go:44] "Start" Mar 7 01:06:57.451645 kubelet[2198]: I0307 01:06:57.450729 2198 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 7 01:06:57.454322 kubelet[2198]: I0307 01:06:57.453509 2198 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 7 01:06:57.454322 kubelet[2198]: I0307 01:06:57.453530 2198 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 7 01:06:57.454322 kubelet[2198]: I0307 01:06:57.453569 2198 kubelet.go:2501] "Starting kubelet main sync loop" Mar 7 01:06:57.454322 kubelet[2198]: E0307 01:06:57.453618 2198 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 01:06:57.453873 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Mar 7 01:06:57.469495 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 7 01:06:57.472572 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 7 01:06:57.482922 kubelet[2198]: E0307 01:06:57.482892 2198 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 7 01:06:57.483074 kubelet[2198]: I0307 01:06:57.483058 2198 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 7 01:06:57.483095 kubelet[2198]: I0307 01:06:57.483071 2198 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 7 01:06:57.484225 kubelet[2198]: I0307 01:06:57.484183 2198 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 7 01:06:57.484463 kubelet[2198]: E0307 01:06:57.484427 2198 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 7 01:06:57.484463 kubelet[2198]: E0307 01:06:57.484449 2198 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:57.574475 systemd[1]: Created slice kubepods-burstable-pod881450a00fe9f34600feb3266356da22.slice - libcontainer container kubepods-burstable-pod881450a00fe9f34600feb3266356da22.slice.
Mar 7 01:06:57.585929 kubelet[2198]: I0307 01:06:57.585838 2198 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.586479 kubelet[2198]: E0307 01:06:57.586412 2198 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://204.168.152.184:6443/api/v1/nodes\": dial tcp 204.168.152.184:6443: connect: connection refused" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.591228 kubelet[2198]: E0307 01:06:57.590815 2198 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.596015 systemd[1]: Created slice kubepods-burstable-podbd5e42152fdb7f4ac5a6623b709603de.slice - libcontainer container kubepods-burstable-podbd5e42152fdb7f4ac5a6623b709603de.slice.
Mar 7 01:06:57.606566 kubelet[2198]: E0307 01:06:57.606523 2198 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.610325 systemd[1]: Created slice kubepods-burstable-podbb552851c6c70b587d12c53cece59c48.slice - libcontainer container kubepods-burstable-podbb552851c6c70b587d12c53cece59c48.slice.
Mar 7 01:06:57.616379 kubelet[2198]: E0307 01:06:57.616314 2198 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620664 kubelet[2198]: I0307 01:06:57.620149 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620664 kubelet[2198]: I0307 01:06:57.620211 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620664 kubelet[2198]: I0307 01:06:57.620241 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bb552851c6c70b587d12c53cece59c48-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-593f1c83d2\" (UID: \"bb552851c6c70b587d12c53cece59c48\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620664 kubelet[2198]: I0307 01:06:57.620264 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/881450a00fe9f34600feb3266356da22-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" (UID: \"881450a00fe9f34600feb3266356da22\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620664 kubelet[2198]: I0307 01:06:57.620307 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/881450a00fe9f34600feb3266356da22-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" (UID: \"881450a00fe9f34600feb3266356da22\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620937 kubelet[2198]: I0307 01:06:57.620331 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620937 kubelet[2198]: I0307 01:06:57.620352 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/881450a00fe9f34600feb3266356da22-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" (UID: \"881450a00fe9f34600feb3266356da22\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620937 kubelet[2198]: I0307 01:06:57.620384 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.620937 kubelet[2198]: I0307 01:06:57.620408 2198 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.621629 kubelet[2198]: E0307 01:06:57.621530 2198 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://204.168.152.184:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-593f1c83d2?timeout=10s\": dial tcp 204.168.152.184:6443: connect: connection refused" interval="400ms"
Mar 7 01:06:57.789366 kubelet[2198]: I0307 01:06:57.789324 2198 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.789791 kubelet[2198]: E0307 01:06:57.789739 2198 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://204.168.152.184:6443/api/v1/nodes\": dial tcp 204.168.152.184:6443: connect: connection refused" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:57.896315 containerd[1511]: time="2026-03-07T01:06:57.896101945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-593f1c83d2,Uid:881450a00fe9f34600feb3266356da22,Namespace:kube-system,Attempt:0,}"
Mar 7 01:06:57.914389 containerd[1511]: time="2026-03-07T01:06:57.913980331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-593f1c83d2,Uid:bd5e42152fdb7f4ac5a6623b709603de,Namespace:kube-system,Attempt:0,}"
Mar 7 01:06:57.920049 containerd[1511]: time="2026-03-07T01:06:57.919968286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-593f1c83d2,Uid:bb552851c6c70b587d12c53cece59c48,Namespace:kube-system,Attempt:0,}"
Mar 7 01:06:58.022722 kubelet[2198]: E0307 01:06:58.022650 2198 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://204.168.152.184:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-593f1c83d2?timeout=10s\": dial tcp 204.168.152.184:6443: connect: connection refused" interval="800ms"
Mar 7 01:06:58.193449 kubelet[2198]: I0307 01:06:58.193175 2198 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:58.194364 kubelet[2198]: E0307 01:06:58.194280 2198 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://204.168.152.184:6443/api/v1/nodes\": dial tcp 204.168.152.184:6443: connect: connection refused" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:58.382217 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1119175036.mount: Deactivated successfully.
Mar 7 01:06:58.390981 containerd[1511]: time="2026-03-07T01:06:58.390893410Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 7 01:06:58.393300 containerd[1511]: time="2026-03-07T01:06:58.393099372Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 7 01:06:58.394179 containerd[1511]: time="2026-03-07T01:06:58.394120711Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 7 01:06:58.395447 containerd[1511]: time="2026-03-07T01:06:58.395389039Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 7 01:06:58.397602 containerd[1511]: time="2026-03-07T01:06:58.397544497Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 7 01:06:58.398554 containerd[1511]: time="2026-03-07T01:06:58.398358416Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312078"
Mar 7 01:06:58.400253 containerd[1511]: time="2026-03-07T01:06:58.399767972Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 7 01:06:58.402326 containerd[1511]: time="2026-03-07T01:06:58.402151442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 7 01:06:58.405861 containerd[1511]: time="2026-03-07T01:06:58.405544315Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 491.458905ms"
Mar 7 01:06:58.408094 containerd[1511]: time="2026-03-07T01:06:58.408033245Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 511.731332ms"
Mar 7 01:06:58.409144 containerd[1511]: time="2026-03-07T01:06:58.409070963Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 489.004582ms"
Mar 7 01:06:58.520995 containerd[1511]: time="2026-03-07T01:06:58.520700665Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:06:58.520995 containerd[1511]: time="2026-03-07T01:06:58.520806797Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:06:58.520995 containerd[1511]: time="2026-03-07T01:06:58.520819185Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:06:58.522001 containerd[1511]: time="2026-03-07T01:06:58.521337947Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:06:58.527261 containerd[1511]: time="2026-03-07T01:06:58.527171033Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:06:58.527261 containerd[1511]: time="2026-03-07T01:06:58.527223151Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:06:58.527261 containerd[1511]: time="2026-03-07T01:06:58.527241105Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:06:58.527452 containerd[1511]: time="2026-03-07T01:06:58.527293384Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:06:58.530568 containerd[1511]: time="2026-03-07T01:06:58.530352684Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:06:58.530568 containerd[1511]: time="2026-03-07T01:06:58.530392164Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:06:58.530568 containerd[1511]: time="2026-03-07T01:06:58.530402907Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:06:58.533106 containerd[1511]: time="2026-03-07T01:06:58.533007237Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:06:58.541333 systemd[1]: Started cri-containerd-ca1a1adb9a0c3708b8bb93c46c621c1466b71d1950b12a694bc194b3e003f29d.scope - libcontainer container ca1a1adb9a0c3708b8bb93c46c621c1466b71d1950b12a694bc194b3e003f29d.
Mar 7 01:06:58.546481 systemd[1]: Started cri-containerd-4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1.scope - libcontainer container 4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1.
Mar 7 01:06:58.567310 systemd[1]: Started cri-containerd-4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15.scope - libcontainer container 4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15.
Mar 7 01:06:58.578142 containerd[1511]: time="2026-03-07T01:06:58.578105474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-593f1c83d2,Uid:881450a00fe9f34600feb3266356da22,Namespace:kube-system,Attempt:0,} returns sandbox id \"ca1a1adb9a0c3708b8bb93c46c621c1466b71d1950b12a694bc194b3e003f29d\""
Mar 7 01:06:58.582822 containerd[1511]: time="2026-03-07T01:06:58.582796214Z" level=info msg="CreateContainer within sandbox \"ca1a1adb9a0c3708b8bb93c46c621c1466b71d1950b12a694bc194b3e003f29d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 7 01:06:58.596307 containerd[1511]: time="2026-03-07T01:06:58.596277559Z" level=info msg="CreateContainer within sandbox \"ca1a1adb9a0c3708b8bb93c46c621c1466b71d1950b12a694bc194b3e003f29d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4a05c8870c2319dc9fc8bba71938daeaeb82d5e395972a2d876e9b481ba47dee\""
Mar 7 01:06:58.596688 containerd[1511]: time="2026-03-07T01:06:58.596666068Z" level=info msg="StartContainer for \"4a05c8870c2319dc9fc8bba71938daeaeb82d5e395972a2d876e9b481ba47dee\""
Mar 7 01:06:58.612529 containerd[1511]: time="2026-03-07T01:06:58.612499008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-593f1c83d2,Uid:bb552851c6c70b587d12c53cece59c48,Namespace:kube-system,Attempt:0,} returns sandbox id \"4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1\""
Mar 7 01:06:58.613300 containerd[1511]: time="2026-03-07T01:06:58.613262433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-593f1c83d2,Uid:bd5e42152fdb7f4ac5a6623b709603de,Namespace:kube-system,Attempt:0,} returns sandbox id \"4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15\""
Mar 7 01:06:58.616761 containerd[1511]: time="2026-03-07T01:06:58.616658646Z" level=info msg="CreateContainer within sandbox \"4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 7 01:06:58.618240 containerd[1511]: time="2026-03-07T01:06:58.617814171Z" level=info msg="CreateContainer within sandbox \"4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 7 01:06:58.636377 systemd[1]: Started cri-containerd-4a05c8870c2319dc9fc8bba71938daeaeb82d5e395972a2d876e9b481ba47dee.scope - libcontainer container 4a05c8870c2319dc9fc8bba71938daeaeb82d5e395972a2d876e9b481ba47dee.
Mar 7 01:06:58.647581 containerd[1511]: time="2026-03-07T01:06:58.647549163Z" level=info msg="CreateContainer within sandbox \"4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7\""
Mar 7 01:06:58.648207 containerd[1511]: time="2026-03-07T01:06:58.648109142Z" level=info msg="StartContainer for \"3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7\""
Mar 7 01:06:58.658176 containerd[1511]: time="2026-03-07T01:06:58.658135276Z" level=info msg="CreateContainer within sandbox \"4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800\""
Mar 7 01:06:58.658891 containerd[1511]: time="2026-03-07T01:06:58.658868320Z" level=info msg="StartContainer for \"5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800\""
Mar 7 01:06:58.684567 systemd[1]: Started cri-containerd-3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7.scope - libcontainer container 3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7.
Mar 7 01:06:58.693207 containerd[1511]: time="2026-03-07T01:06:58.689756801Z" level=info msg="StartContainer for \"4a05c8870c2319dc9fc8bba71938daeaeb82d5e395972a2d876e9b481ba47dee\" returns successfully"
Mar 7 01:06:58.702308 systemd[1]: Started cri-containerd-5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800.scope - libcontainer container 5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800.
Mar 7 01:06:58.749135 containerd[1511]: time="2026-03-07T01:06:58.749085386Z" level=info msg="StartContainer for \"3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7\" returns successfully"
Mar 7 01:06:58.772409 containerd[1511]: time="2026-03-07T01:06:58.772326940Z" level=info msg="StartContainer for \"5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800\" returns successfully"
Mar 7 01:06:58.996848 kubelet[2198]: I0307 01:06:58.996705 2198 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:59.386876 kubelet[2198]: E0307 01:06:59.386838 2198 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:59.474676 kubelet[2198]: E0307 01:06:59.474525 2198 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:59.475769 kubelet[2198]: E0307 01:06:59.475651 2198 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:59.475769 kubelet[2198]: E0307 01:06:59.475690 2198 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-593f1c83d2\" not found" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:59.484302 kubelet[2198]: I0307 01:06:59.484280 2198 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:06:59.484302 kubelet[2198]: E0307 01:06:59.484301 2198 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-n-593f1c83d2\": node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:59.492240 kubelet[2198]: E0307 01:06:59.492209 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:59.594247 kubelet[2198]: E0307 01:06:59.592515 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:59.693588 kubelet[2198]: E0307 01:06:59.693274 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:59.793555 kubelet[2198]: E0307 01:06:59.793464 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:59.893996 kubelet[2198]: E0307 01:06:59.893950 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:06:59.994307 kubelet[2198]: E0307 01:06:59.994106 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:07:00.095072 kubelet[2198]: E0307 01:07:00.094993 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:07:00.196028 kubelet[2198]: E0307 01:07:00.195945 2198 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-593f1c83d2\" not found"
Mar 7 01:07:00.320483 kubelet[2198]: I0307 01:07:00.320303 2198 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.328250 kubelet[2198]: E0307 01:07:00.327444 2198 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.328250 kubelet[2198]: I0307 01:07:00.327478 2198 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.330111 kubelet[2198]: E0307 01:07:00.330052 2198 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-593f1c83d2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.330111 kubelet[2198]: I0307 01:07:00.330086 2198 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.332369 kubelet[2198]: E0307 01:07:00.332288 2198 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.390655 kubelet[2198]: I0307 01:07:00.390562 2198 apiserver.go:52] "Watching apiserver"
Mar 7 01:07:00.419272 kubelet[2198]: I0307 01:07:00.419150 2198 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 7 01:07:00.477689 kubelet[2198]: I0307 01:07:00.477313 2198 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:00.477689 kubelet[2198]: I0307 01:07:00.477604 2198 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:01.564760 systemd[1]: Reloading requested from client PID 2481 ('systemctl') (unit session-7.scope)...
Mar 7 01:07:01.564788 systemd[1]: Reloading...
Mar 7 01:07:01.657222 zram_generator::config[2521]: No configuration found.
Mar 7 01:07:01.747266 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 01:07:01.816832 systemd[1]: Reloading finished in 251 ms.
Mar 7 01:07:01.864493 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:07:01.879642 systemd[1]: kubelet.service: Deactivated successfully.
Mar 7 01:07:01.880101 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:07:01.884423 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:07:02.035445 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:07:02.036178 (kubelet)[2572]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 7 01:07:02.076850 kubelet[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 7 01:07:02.088069 kubelet[2572]: I0307 01:07:02.086659 2572 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Mar 7 01:07:02.088069 kubelet[2572]: I0307 01:07:02.086684 2572 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 7 01:07:02.088069 kubelet[2572]: I0307 01:07:02.086700 2572 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 7 01:07:02.088069 kubelet[2572]: I0307 01:07:02.086704 2572 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 7 01:07:02.088069 kubelet[2572]: I0307 01:07:02.086965 2572 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 7 01:07:02.088778 kubelet[2572]: I0307 01:07:02.088758 2572 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 7 01:07:02.091574 kubelet[2572]: I0307 01:07:02.091174 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 7 01:07:02.093809 kubelet[2572]: E0307 01:07:02.093780 2572 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 7 01:07:02.093852 kubelet[2572]: I0307 01:07:02.093819 2572 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 7 01:07:02.097853 kubelet[2572]: I0307 01:07:02.097839 2572 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 7 01:07:02.098058 kubelet[2572]: I0307 01:07:02.098036 2572 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 7 01:07:02.098158 kubelet[2572]: I0307 01:07:02.098054 2572 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-593f1c83d2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 7 01:07:02.098245 kubelet[2572]: I0307 01:07:02.098159 2572 topology_manager.go:143] "Creating topology manager with none policy"
Mar 7 01:07:02.098245 kubelet[2572]: I0307 01:07:02.098168 2572 container_manager_linux.go:308] "Creating device plugin manager"
Mar 7 01:07:02.098663 kubelet[2572]: I0307 01:07:02.098192 2572 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 7 01:07:02.098809 kubelet[2572]: I0307 01:07:02.098796 2572 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 7 01:07:02.098992 kubelet[2572]: I0307 01:07:02.098945 2572 kubelet.go:482] "Attempting to sync node with API server"
Mar 7 01:07:02.098992 kubelet[2572]: I0307 01:07:02.098960 2572 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 7 01:07:02.098992 kubelet[2572]: I0307 01:07:02.098973 2572 kubelet.go:394] "Adding apiserver pod source"
Mar 7 01:07:02.098992 kubelet[2572]: I0307 01:07:02.098985 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 7 01:07:02.100575 kubelet[2572]: I0307 01:07:02.100564 2572 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 7 01:07:02.102256 kubelet[2572]: I0307 01:07:02.101704 2572 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 7 01:07:02.102256 kubelet[2572]: I0307 01:07:02.101727 2572 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 7 01:07:02.107638 kubelet[2572]: I0307 01:07:02.107586 2572 server.go:1257] "Started kubelet"
Mar 7 01:07:02.111709 kubelet[2572]: I0307 01:07:02.111293 2572 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 7 01:07:02.116772 kubelet[2572]: I0307 01:07:02.116404 2572 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 7 01:07:02.118710 kubelet[2572]: I0307 01:07:02.118693 2572 server.go:317] "Adding debug handlers to kubelet server"
Mar 7 01:07:02.124276 kubelet[2572]: I0307 01:07:02.124182 2572 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 7 01:07:02.125243 kubelet[2572]: I0307 01:07:02.124341 2572 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 7 01:07:02.125243 kubelet[2572]: I0307 01:07:02.124467 2572 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 7 01:07:02.125243 kubelet[2572]: I0307 01:07:02.124775 2572 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 7 01:07:02.131710 kubelet[2572]: I0307 01:07:02.131697 2572 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 7 01:07:02.133148 kubelet[2572]: I0307 01:07:02.132637 2572 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 7 01:07:02.133337 kubelet[2572]: I0307 01:07:02.133328 2572 reconciler.go:29] "Reconciler: start to sync state"
Mar 7 01:07:02.135815 kubelet[2572]: I0307 01:07:02.135562 2572 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 7 01:07:02.135880 kubelet[2572]: I0307 01:07:02.135864 2572 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 7 01:07:02.136999 kubelet[2572]: I0307 01:07:02.136773 2572 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 7 01:07:02.136999 kubelet[2572]: I0307 01:07:02.136786 2572 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 7 01:07:02.136999 kubelet[2572]: I0307 01:07:02.136800 2572 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 7 01:07:02.136999 kubelet[2572]: E0307 01:07:02.136835 2572 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 7 01:07:02.141046 kubelet[2572]: I0307 01:07:02.140895 2572 factory.go:223] Registration of the containerd container factory successfully
Mar 7 01:07:02.141119 kubelet[2572]: I0307 01:07:02.141112 2572 factory.go:223] Registration of the systemd container factory successfully
Mar 7 01:07:02.147176 kubelet[2572]: E0307 01:07:02.146878 2572 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 7 01:07:02.191100 kubelet[2572]: I0307 01:07:02.191079 2572 cpu_manager.go:225] "Starting" policy="none"
Mar 7 01:07:02.191100 kubelet[2572]: I0307 01:07:02.191092 2572 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 7 01:07:02.191100 kubelet[2572]: I0307 01:07:02.191107 2572 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 7 01:07:02.191290 kubelet[2572]: I0307 01:07:02.191261 2572 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet=""
Mar 7 01:07:02.191330 kubelet[2572]: I0307 01:07:02.191284 2572 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={}
Mar 7 01:07:02.191330 kubelet[2572]: I0307 01:07:02.191309 2572 policy_none.go:50] "Start"
Mar 7 01:07:02.191330 kubelet[2572]: I0307 01:07:02.191316 2572 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 7 01:07:02.191330 kubelet[2572]: I0307
01:07:02.191325 2572 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 7 01:07:02.191425 kubelet[2572]: I0307 01:07:02.191413 2572 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 7 01:07:02.191425 kubelet[2572]: I0307 01:07:02.191424 2572 policy_none.go:44] "Start" Mar 7 01:07:02.195303 kubelet[2572]: E0307 01:07:02.195282 2572 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 01:07:02.195433 kubelet[2572]: I0307 01:07:02.195414 2572 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 7 01:07:02.195456 kubelet[2572]: I0307 01:07:02.195425 2572 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 01:07:02.195699 kubelet[2572]: I0307 01:07:02.195621 2572 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 7 01:07:02.197902 kubelet[2572]: E0307 01:07:02.197879 2572 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 7 01:07:02.238128 kubelet[2572]: I0307 01:07:02.238106 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.238418 kubelet[2572]: I0307 01:07:02.238387 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.238657 kubelet[2572]: I0307 01:07:02.238546 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.245096 kubelet[2572]: E0307 01:07:02.244973 2572 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.245724 kubelet[2572]: E0307 01:07:02.245691 2572 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-593f1c83d2\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.303727 kubelet[2572]: I0307 01:07:02.303670 2572 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.312540 kubelet[2572]: I0307 01:07:02.312506 2572 kubelet_node_status.go:123] "Node was previously registered" node="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.312684 kubelet[2572]: I0307 01:07:02.312650 2572 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434664 kubelet[2572]: I0307 01:07:02.434332 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/881450a00fe9f34600feb3266356da22-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" (UID: \"881450a00fe9f34600feb3266356da22\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434664 kubelet[2572]: I0307 01:07:02.434375 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/881450a00fe9f34600feb3266356da22-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" (UID: \"881450a00fe9f34600feb3266356da22\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434664 kubelet[2572]: I0307 01:07:02.434401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434664 kubelet[2572]: I0307 01:07:02.434428 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434664 kubelet[2572]: I0307 01:07:02.434455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/881450a00fe9f34600feb3266356da22-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" (UID: \"881450a00fe9f34600feb3266356da22\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434984 kubelet[2572]: I0307 01:07:02.434479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434984 kubelet[2572]: I0307 01:07:02.434513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434984 kubelet[2572]: I0307 01:07:02.434561 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bd5e42152fdb7f4ac5a6623b709603de-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-593f1c83d2\" (UID: \"bd5e42152fdb7f4ac5a6623b709603de\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:02.434984 kubelet[2572]: I0307 01:07:02.434605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/bb552851c6c70b587d12c53cece59c48-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-593f1c83d2\" (UID: \"bb552851c6c70b587d12c53cece59c48\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:03.100252 kubelet[2572]: I0307 01:07:03.100217 2572 apiserver.go:52] "Watching apiserver" Mar 7 01:07:03.133726 kubelet[2572]: I0307 01:07:03.133663 2572 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 7 01:07:03.171983 kubelet[2572]: I0307 01:07:03.171538 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:03.171983 kubelet[2572]: I0307 01:07:03.171763 2572 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:03.179023 kubelet[2572]: E0307 01:07:03.178981 2572 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-593f1c83d2\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:03.180219 kubelet[2572]: E0307 01:07:03.180155 2572 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-593f1c83d2\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:03.197704 kubelet[2572]: I0307 01:07:03.197619 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-593f1c83d2" podStartSLOduration=3.197588689 podStartE2EDuration="3.197588689s" podCreationTimestamp="2026-03-07 01:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:07:03.197415713 +0000 UTC m=+1.157142483" watchObservedRunningTime="2026-03-07 01:07:03.197588689 +0000 UTC m=+1.157315459" Mar 7 01:07:03.207513 kubelet[2572]: I0307 01:07:03.207250 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-593f1c83d2" podStartSLOduration=1.207237235 podStartE2EDuration="1.207237235s" podCreationTimestamp="2026-03-07 01:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:07:03.206971326 +0000 UTC m=+1.166698106" watchObservedRunningTime="2026-03-07 01:07:03.207237235 +0000 UTC m=+1.166964025" Mar 7 01:07:03.217239 kubelet[2572]: I0307 01:07:03.216896 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-593f1c83d2" podStartSLOduration=3.216876759 podStartE2EDuration="3.216876759s" podCreationTimestamp="2026-03-07 01:07:00 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:07:03.21678079 +0000 UTC m=+1.176507560" watchObservedRunningTime="2026-03-07 01:07:03.216876759 +0000 UTC m=+1.176603550" Mar 7 01:07:08.364397 kubelet[2572]: I0307 01:07:08.364317 2572 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 7 01:07:08.365847 kubelet[2572]: I0307 01:07:08.365264 2572 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 7 01:07:08.365905 containerd[1511]: time="2026-03-07T01:07:08.364935854Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 7 01:07:09.059135 systemd[1]: Created slice kubepods-besteffort-pod2befbc58_fc9d_43fa_9aa2_dce8a400b33d.slice - libcontainer container kubepods-besteffort-pod2befbc58_fc9d_43fa_9aa2_dce8a400b33d.slice. Mar 7 01:07:09.073861 kubelet[2572]: I0307 01:07:09.073810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2befbc58-fc9d-43fa-9aa2-dce8a400b33d-kube-proxy\") pod \"kube-proxy-p7wmz\" (UID: \"2befbc58-fc9d-43fa-9aa2-dce8a400b33d\") " pod="kube-system/kube-proxy-p7wmz" Mar 7 01:07:09.073861 kubelet[2572]: I0307 01:07:09.073840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2befbc58-fc9d-43fa-9aa2-dce8a400b33d-lib-modules\") pod \"kube-proxy-p7wmz\" (UID: \"2befbc58-fc9d-43fa-9aa2-dce8a400b33d\") " pod="kube-system/kube-proxy-p7wmz" Mar 7 01:07:09.074035 kubelet[2572]: I0307 01:07:09.073897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2befbc58-fc9d-43fa-9aa2-dce8a400b33d-xtables-lock\") pod 
\"kube-proxy-p7wmz\" (UID: \"2befbc58-fc9d-43fa-9aa2-dce8a400b33d\") " pod="kube-system/kube-proxy-p7wmz" Mar 7 01:07:09.074035 kubelet[2572]: I0307 01:07:09.073911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52rst\" (UniqueName: \"kubernetes.io/projected/2befbc58-fc9d-43fa-9aa2-dce8a400b33d-kube-api-access-52rst\") pod \"kube-proxy-p7wmz\" (UID: \"2befbc58-fc9d-43fa-9aa2-dce8a400b33d\") " pod="kube-system/kube-proxy-p7wmz" Mar 7 01:07:09.370935 containerd[1511]: time="2026-03-07T01:07:09.370637893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-p7wmz,Uid:2befbc58-fc9d-43fa-9aa2-dce8a400b33d,Namespace:kube-system,Attempt:0,}" Mar 7 01:07:09.416073 containerd[1511]: time="2026-03-07T01:07:09.415436589Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:09.416073 containerd[1511]: time="2026-03-07T01:07:09.415526526Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:09.416073 containerd[1511]: time="2026-03-07T01:07:09.415542776Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:09.416073 containerd[1511]: time="2026-03-07T01:07:09.415995034Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:09.445333 systemd[1]: Started cri-containerd-bb8f45e3fe63567df76460bdd9d0183a54e7ebce3a0044c89a24310b9feba5f1.scope - libcontainer container bb8f45e3fe63567df76460bdd9d0183a54e7ebce3a0044c89a24310b9feba5f1. 
Mar 7 01:07:09.464256 containerd[1511]: time="2026-03-07T01:07:09.464122990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-p7wmz,Uid:2befbc58-fc9d-43fa-9aa2-dce8a400b33d,Namespace:kube-system,Attempt:0,} returns sandbox id \"bb8f45e3fe63567df76460bdd9d0183a54e7ebce3a0044c89a24310b9feba5f1\"" Mar 7 01:07:09.468619 containerd[1511]: time="2026-03-07T01:07:09.468599986Z" level=info msg="CreateContainer within sandbox \"bb8f45e3fe63567df76460bdd9d0183a54e7ebce3a0044c89a24310b9feba5f1\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 7 01:07:09.482988 containerd[1511]: time="2026-03-07T01:07:09.482922277Z" level=info msg="CreateContainer within sandbox \"bb8f45e3fe63567df76460bdd9d0183a54e7ebce3a0044c89a24310b9feba5f1\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"65dde70a605b84b8c5c7ad626fa9f284881f720e8eb5718b8378c73746fa29f2\"" Mar 7 01:07:09.484151 containerd[1511]: time="2026-03-07T01:07:09.483305868Z" level=info msg="StartContainer for \"65dde70a605b84b8c5c7ad626fa9f284881f720e8eb5718b8378c73746fa29f2\"" Mar 7 01:07:09.507307 systemd[1]: Started cri-containerd-65dde70a605b84b8c5c7ad626fa9f284881f720e8eb5718b8378c73746fa29f2.scope - libcontainer container 65dde70a605b84b8c5c7ad626fa9f284881f720e8eb5718b8378c73746fa29f2. Mar 7 01:07:09.534667 containerd[1511]: time="2026-03-07T01:07:09.534554587Z" level=info msg="StartContainer for \"65dde70a605b84b8c5c7ad626fa9f284881f720e8eb5718b8378c73746fa29f2\" returns successfully" Mar 7 01:07:09.578954 systemd[1]: Created slice kubepods-besteffort-pod8c6c21c7_f042_4f30_b2bc_57048b6b4a4c.slice - libcontainer container kubepods-besteffort-pod8c6c21c7_f042_4f30_b2bc_57048b6b4a4c.slice. 
Mar 7 01:07:09.678763 kubelet[2572]: I0307 01:07:09.678552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8c6c21c7-f042-4f30-b2bc-57048b6b4a4c-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-kpvhj\" (UID: \"8c6c21c7-f042-4f30-b2bc-57048b6b4a4c\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kpvhj" Mar 7 01:07:09.678763 kubelet[2572]: I0307 01:07:09.678608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5t4c\" (UniqueName: \"kubernetes.io/projected/8c6c21c7-f042-4f30-b2bc-57048b6b4a4c-kube-api-access-x5t4c\") pod \"tigera-operator-6cf4cccc57-kpvhj\" (UID: \"8c6c21c7-f042-4f30-b2bc-57048b6b4a4c\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kpvhj" Mar 7 01:07:09.884921 containerd[1511]: time="2026-03-07T01:07:09.884485174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kpvhj,Uid:8c6c21c7-f042-4f30-b2bc-57048b6b4a4c,Namespace:tigera-operator,Attempt:0,}" Mar 7 01:07:09.921530 containerd[1511]: time="2026-03-07T01:07:09.921326665Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:09.922264 containerd[1511]: time="2026-03-07T01:07:09.921722288Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:09.923011 containerd[1511]: time="2026-03-07T01:07:09.922915760Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:09.923947 containerd[1511]: time="2026-03-07T01:07:09.923149252Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:09.950329 systemd[1]: Started cri-containerd-0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581.scope - libcontainer container 0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581. Mar 7 01:07:09.989589 containerd[1511]: time="2026-03-07T01:07:09.989543828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kpvhj,Uid:8c6c21c7-f042-4f30-b2bc-57048b6b4a4c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\"" Mar 7 01:07:09.991712 containerd[1511]: time="2026-03-07T01:07:09.991567348Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 01:07:10.204255 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1516096681.mount: Deactivated successfully. Mar 7 01:07:10.217918 kubelet[2572]: I0307 01:07:10.217801 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-p7wmz" podStartSLOduration=1.217781898 podStartE2EDuration="1.217781898s" podCreationTimestamp="2026-03-07 01:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:07:10.217566098 +0000 UTC m=+8.177292878" watchObservedRunningTime="2026-03-07 01:07:10.217781898 +0000 UTC m=+8.177508678" Mar 7 01:07:11.690158 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4047394411.mount: Deactivated successfully. 
Mar 7 01:07:12.413630 containerd[1511]: time="2026-03-07T01:07:12.413580132Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:12.414788 containerd[1511]: time="2026-03-07T01:07:12.414651232Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 7 01:07:12.415746 containerd[1511]: time="2026-03-07T01:07:12.415564197Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:12.417980 containerd[1511]: time="2026-03-07T01:07:12.417485703Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:12.417980 containerd[1511]: time="2026-03-07T01:07:12.417889385Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.426187047s" Mar 7 01:07:12.417980 containerd[1511]: time="2026-03-07T01:07:12.417911373Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 7 01:07:12.422067 containerd[1511]: time="2026-03-07T01:07:12.422040854Z" level=info msg="CreateContainer within sandbox \"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 01:07:12.434304 containerd[1511]: time="2026-03-07T01:07:12.434274957Z" level=info msg="CreateContainer within sandbox 
\"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565\"" Mar 7 01:07:12.435119 containerd[1511]: time="2026-03-07T01:07:12.434716123Z" level=info msg="StartContainer for \"bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565\"" Mar 7 01:07:12.464316 systemd[1]: Started cri-containerd-bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565.scope - libcontainer container bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565. Mar 7 01:07:12.487515 containerd[1511]: time="2026-03-07T01:07:12.487488088Z" level=info msg="StartContainer for \"bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565\" returns successfully" Mar 7 01:07:13.227791 kubelet[2572]: I0307 01:07:13.227174 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-kpvhj" podStartSLOduration=1.799232368 podStartE2EDuration="4.227157723s" podCreationTimestamp="2026-03-07 01:07:09 +0000 UTC" firstStartedPulling="2026-03-07 01:07:09.990833669 +0000 UTC m=+7.950560419" lastFinishedPulling="2026-03-07 01:07:12.418759024 +0000 UTC m=+10.378485774" observedRunningTime="2026-03-07 01:07:13.22694513 +0000 UTC m=+11.186671910" watchObservedRunningTime="2026-03-07 01:07:13.227157723 +0000 UTC m=+11.186884503" Mar 7 01:07:17.605463 sudo[1701]: pam_unix(sudo:session): session closed for user root Mar 7 01:07:17.725461 sshd[1698]: pam_unix(sshd:session): session closed for user core Mar 7 01:07:17.728422 systemd-logind[1489]: Session 7 logged out. Waiting for processes to exit. Mar 7 01:07:17.730740 systemd[1]: sshd@6-204.168.152.184:22-4.153.228.146:45294.service: Deactivated successfully. Mar 7 01:07:17.732859 systemd[1]: session-7.scope: Deactivated successfully. 
Mar 7 01:07:17.733679 systemd[1]: session-7.scope: Consumed 3.214s CPU time, 158.0M memory peak, 0B memory swap peak. Mar 7 01:07:17.735000 systemd-logind[1489]: Removed session 7. Mar 7 01:07:18.184215 update_engine[1496]: I20260307 01:07:18.182862 1496 update_attempter.cc:509] Updating boot flags... Mar 7 01:07:18.246287 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (2971) Mar 7 01:07:18.336852 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (2972) Mar 7 01:07:18.429238 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (2972) Mar 7 01:07:19.595428 systemd[1]: Created slice kubepods-besteffort-podbbafbfe7_dfa0_4745_9fad_7bc6c73ccbeb.slice - libcontainer container kubepods-besteffort-podbbafbfe7_dfa0_4745_9fad_7bc6c73ccbeb.slice. Mar 7 01:07:19.646583 systemd[1]: Created slice kubepods-besteffort-podd413ab48_0299_4af8_8d5d_3d5cf51f80e2.slice - libcontainer container kubepods-besteffort-podd413ab48_0299_4af8_8d5d_3d5cf51f80e2.slice. 
Mar 7 01:07:19.648666 kubelet[2572]: I0307 01:07:19.648627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb-tigera-ca-bundle\") pod \"calico-typha-6db997cf49-csqbw\" (UID: \"bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb\") " pod="calico-system/calico-typha-6db997cf49-csqbw" Mar 7 01:07:19.649212 kubelet[2572]: I0307 01:07:19.649121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb-typha-certs\") pod \"calico-typha-6db997cf49-csqbw\" (UID: \"bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb\") " pod="calico-system/calico-typha-6db997cf49-csqbw" Mar 7 01:07:19.649379 kubelet[2572]: I0307 01:07:19.649313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-449bb\" (UniqueName: \"kubernetes.io/projected/bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb-kube-api-access-449bb\") pod \"calico-typha-6db997cf49-csqbw\" (UID: \"bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb\") " pod="calico-system/calico-typha-6db997cf49-csqbw" Mar 7 01:07:19.745575 kubelet[2572]: E0307 01:07:19.744864 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:19.749916 kubelet[2572]: I0307 01:07:19.749894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-lib-modules\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826" 
Mar 7 01:07:19.750579 kubelet[2572]: I0307 01:07:19.750466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-xtables-lock\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826" Mar 7 01:07:19.750579 kubelet[2572]: I0307 01:07:19.750484 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-bpffs\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826" Mar 7 01:07:19.750579 kubelet[2572]: I0307 01:07:19.750495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-cni-bin-dir\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826" Mar 7 01:07:19.750579 kubelet[2572]: I0307 01:07:19.750509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-flexvol-driver-host\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826" Mar 7 01:07:19.750579 kubelet[2572]: I0307 01:07:19.750519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6899\" (UniqueName: \"kubernetes.io/projected/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-kube-api-access-j6899\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826" Mar 7 01:07:19.751325 kubelet[2572]: I0307 01:07:19.750544 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-cni-log-dir\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751325 kubelet[2572]: I0307 01:07:19.750556 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-nodeproc\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751325 kubelet[2572]: I0307 01:07:19.750802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-var-run-calico\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751325 kubelet[2572]: I0307 01:07:19.750827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-tigera-ca-bundle\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751325 kubelet[2572]: I0307 01:07:19.750837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-var-lib-calico\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751519 kubelet[2572]: I0307 01:07:19.750851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-policysync\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751519 kubelet[2572]: I0307 01:07:19.750861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-sys-fs\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751519 kubelet[2572]: I0307 01:07:19.750876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-cni-net-dir\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.751519 kubelet[2572]: I0307 01:07:19.750888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d413ab48-0299-4af8-8d5d-3d5cf51f80e2-node-certs\") pod \"calico-node-42826\" (UID: \"d413ab48-0299-4af8-8d5d-3d5cf51f80e2\") " pod="calico-system/calico-node-42826"
Mar 7 01:07:19.852321 kubelet[2572]: I0307 01:07:19.851223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzjf\" (UniqueName: \"kubernetes.io/projected/0869cd07-08c1-477a-a901-3ac76e743a03-kube-api-access-dqzjf\") pod \"csi-node-driver-2x9rm\" (UID: \"0869cd07-08c1-477a-a901-3ac76e743a03\") " pod="calico-system/csi-node-driver-2x9rm"
Mar 7 01:07:19.852321 kubelet[2572]: I0307 01:07:19.851271 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0869cd07-08c1-477a-a901-3ac76e743a03-varrun\") pod \"csi-node-driver-2x9rm\" (UID: \"0869cd07-08c1-477a-a901-3ac76e743a03\") " pod="calico-system/csi-node-driver-2x9rm"
Mar 7 01:07:19.852321 kubelet[2572]: I0307 01:07:19.851341 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0869cd07-08c1-477a-a901-3ac76e743a03-kubelet-dir\") pod \"csi-node-driver-2x9rm\" (UID: \"0869cd07-08c1-477a-a901-3ac76e743a03\") " pod="calico-system/csi-node-driver-2x9rm"
Mar 7 01:07:19.852321 kubelet[2572]: I0307 01:07:19.851384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0869cd07-08c1-477a-a901-3ac76e743a03-socket-dir\") pod \"csi-node-driver-2x9rm\" (UID: \"0869cd07-08c1-477a-a901-3ac76e743a03\") " pod="calico-system/csi-node-driver-2x9rm"
Mar 7 01:07:19.852321 kubelet[2572]: I0307 01:07:19.851442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0869cd07-08c1-477a-a901-3ac76e743a03-registration-dir\") pod \"csi-node-driver-2x9rm\" (UID: \"0869cd07-08c1-477a-a901-3ac76e743a03\") " pod="calico-system/csi-node-driver-2x9rm"
Mar 7 01:07:19.856607 kubelet[2572]: E0307 01:07:19.856552 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:07:19.856607 kubelet[2572]: W0307 01:07:19.856580 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:07:19.856607 kubelet[2572]: E0307 01:07:19.856601 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input"
Mar 7 01:07:19.864445 kubelet[2572]: E0307 01:07:19.861100 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:07:19.901402 containerd[1511]: time="2026-03-07T01:07:19.901352466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6db997cf49-csqbw,Uid:bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb,Namespace:calico-system,Attempt:0,}"
Mar 7 01:07:19.923410 containerd[1511]: time="2026-03-07T01:07:19.923141162Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:07:19.923410 containerd[1511]: time="2026-03-07T01:07:19.923254737Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:07:19.923410 containerd[1511]: time="2026-03-07T01:07:19.923346333Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:07:19.923735 containerd[1511]: time="2026-03-07T01:07:19.923675935Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:07:19.942357 systemd[1]: Started cri-containerd-5b1075221a4bdc6083164f8cd8ba26e4d2eb5d126eae8def1d089a0ed4445c1a.scope - libcontainer container 5b1075221a4bdc6083164f8cd8ba26e4d2eb5d126eae8def1d089a0ed4445c1a.
Mar 7 01:07:19.951862 containerd[1511]: time="2026-03-07T01:07:19.951500611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-42826,Uid:d413ab48-0299-4af8-8d5d-3d5cf51f80e2,Namespace:calico-system,Attempt:0,}"
Mar 7 01:07:19.952247 kubelet[2572]: E0307 01:07:19.952181 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:07:19.971272 kubelet[2572]: E0307 01:07:19.971104 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:07:19.988117 containerd[1511]: time="2026-03-07T01:07:19.986652780Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:07:19.988117 containerd[1511]: time="2026-03-07T01:07:19.986694921Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:07:19.988117 containerd[1511]: time="2026-03-07T01:07:19.986702464Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:07:19.988117 containerd[1511]: time="2026-03-07T01:07:19.986751848Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:07:19.989572 containerd[1511]: time="2026-03-07T01:07:19.989544613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6db997cf49-csqbw,Uid:bbafbfe7-dfa0-4745-9fad-7bc6c73ccbeb,Namespace:calico-system,Attempt:0,} returns sandbox id \"5b1075221a4bdc6083164f8cd8ba26e4d2eb5d126eae8def1d089a0ed4445c1a\""
Mar 7 01:07:19.993628 containerd[1511]: time="2026-03-07T01:07:19.993520527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 7 01:07:20.004308 systemd[1]: Started cri-containerd-5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d.scope - libcontainer container 5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d.
Mar 7 01:07:20.026341 containerd[1511]: time="2026-03-07T01:07:20.026302985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-42826,Uid:d413ab48-0299-4af8-8d5d-3d5cf51f80e2,Namespace:calico-system,Attempt:0,} returns sandbox id \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\""
Mar 7 01:07:21.542466 kubelet[2572]: E0307 01:07:21.542469 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:07:21.548030 kubelet[2572]: E0307 01:07:21.547900 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:07:21.548442 kubelet[2572]: E0307 01:07:21.548405 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:21.548442 kubelet[2572]: W0307 01:07:21.548428 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:21.548442 kubelet[2572]: E0307 01:07:21.548444 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:07:21.548938 kubelet[2572]: E0307 01:07:21.548898 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:21.548938 kubelet[2572]: W0307 01:07:21.548926 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:21.549075 kubelet[2572]: E0307 01:07:21.548942 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:07:21.549483 kubelet[2572]: E0307 01:07:21.549440 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:21.549483 kubelet[2572]: W0307 01:07:21.549464 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:21.549483 kubelet[2572]: E0307 01:07:21.549481 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:07:21.550078 kubelet[2572]: E0307 01:07:21.549930 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:21.550078 kubelet[2572]: W0307 01:07:21.549981 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:21.550078 kubelet[2572]: E0307 01:07:21.549998 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:07:21.951757 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1282013087.mount: Deactivated successfully. 
Mar 7 01:07:22.139108 kubelet[2572]: E0307 01:07:22.138257 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:23.094530 containerd[1511]: time="2026-03-07T01:07:23.094491830Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:23.095572 containerd[1511]: time="2026-03-07T01:07:23.095490572Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 7 01:07:23.096726 containerd[1511]: time="2026-03-07T01:07:23.096364758Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:23.098123 containerd[1511]: time="2026-03-07T01:07:23.098094857Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:23.098634 containerd[1511]: time="2026-03-07T01:07:23.098537504Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.104920411s" Mar 7 01:07:23.098634 containerd[1511]: time="2026-03-07T01:07:23.098558740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference 
\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 7 01:07:23.099422 containerd[1511]: time="2026-03-07T01:07:23.099291249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 7 01:07:23.111418 containerd[1511]: time="2026-03-07T01:07:23.111390858Z" level=info msg="CreateContainer within sandbox \"5b1075221a4bdc6083164f8cd8ba26e4d2eb5d126eae8def1d089a0ed4445c1a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 7 01:07:23.123962 containerd[1511]: time="2026-03-07T01:07:23.123844116Z" level=info msg="CreateContainer within sandbox \"5b1075221a4bdc6083164f8cd8ba26e4d2eb5d126eae8def1d089a0ed4445c1a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f4a89432a1cc9749189e59e1295723babeff519177f6efa13f3a84ed5a156bed\"" Mar 7 01:07:23.125323 containerd[1511]: time="2026-03-07T01:07:23.124407804Z" level=info msg="StartContainer for \"f4a89432a1cc9749189e59e1295723babeff519177f6efa13f3a84ed5a156bed\"" Mar 7 01:07:23.152327 systemd[1]: Started cri-containerd-f4a89432a1cc9749189e59e1295723babeff519177f6efa13f3a84ed5a156bed.scope - libcontainer container f4a89432a1cc9749189e59e1295723babeff519177f6efa13f3a84ed5a156bed. 
Mar 7 01:07:23.186641 containerd[1511]: time="2026-03-07T01:07:23.186606461Z" level=info msg="StartContainer for \"f4a89432a1cc9749189e59e1295723babeff519177f6efa13f3a84ed5a156bed\" returns successfully" Mar 7 01:07:23.258942 kubelet[2572]: E0307 01:07:23.258914 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:23.258942 kubelet[2572]: W0307 01:07:23.258933 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:23.258942 kubelet[2572]: E0307 01:07:23.258952 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:07:23.260061 kubelet[2572]: E0307 01:07:23.259447 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:23.260061 kubelet[2572]: W0307 01:07:23.259455 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:23.260061 kubelet[2572]: E0307 01:07:23.259465 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:07:23.297421 kubelet[2572]: E0307 01:07:23.297408 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:23.297421 kubelet[2572]: W0307 01:07:23.297419 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:23.297452 kubelet[2572]: E0307 01:07:23.297425 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:07:24.139010 kubelet[2572]: E0307 01:07:24.137544 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:24.240502 kubelet[2572]: I0307 01:07:24.240439 2572 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:07:24.269841 kubelet[2572]: E0307 01:07:24.269798 2572 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:07:24.269841 kubelet[2572]: W0307 01:07:24.269828 2572 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:07:24.270548 kubelet[2572]: E0307 01:07:24.269855 2572 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:07:24.892204 containerd[1511]: time="2026-03-07T01:07:24.892150266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:24.893300 containerd[1511]: time="2026-03-07T01:07:24.893170064Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 7 01:07:24.894250 containerd[1511]: time="2026-03-07T01:07:24.894222928Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:24.896105 containerd[1511]: time="2026-03-07T01:07:24.896076766Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:24.896833 containerd[1511]: time="2026-03-07T01:07:24.896464318Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.797154779s" Mar 7 01:07:24.896833 containerd[1511]: time="2026-03-07T01:07:24.896487446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 7 01:07:24.900228 containerd[1511]: time="2026-03-07T01:07:24.900187793Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 7 01:07:24.913905 containerd[1511]: time="2026-03-07T01:07:24.913878519Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4\"" Mar 7 01:07:24.914345 containerd[1511]: time="2026-03-07T01:07:24.914323395Z" level=info msg="StartContainer for \"589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4\"" Mar 7 01:07:24.941304 systemd[1]: Started cri-containerd-589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4.scope - libcontainer container 589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4. Mar 7 01:07:24.964159 containerd[1511]: time="2026-03-07T01:07:24.964130325Z" level=info msg="StartContainer for \"589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4\" returns successfully" Mar 7 01:07:24.974363 systemd[1]: cri-containerd-589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4.scope: Deactivated successfully. Mar 7 01:07:24.994739 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4-rootfs.mount: Deactivated successfully. 
Mar 7 01:07:25.087755 containerd[1511]: time="2026-03-07T01:07:25.087696261Z" level=info msg="shim disconnected" id=589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4 namespace=k8s.io Mar 7 01:07:25.087755 containerd[1511]: time="2026-03-07T01:07:25.087745421Z" level=warning msg="cleaning up after shim disconnected" id=589a7b1a9afe0efc383060127b599272d3b05036a835a35a86da4529ee8d4ea4 namespace=k8s.io Mar 7 01:07:25.087755 containerd[1511]: time="2026-03-07T01:07:25.087752122Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:07:25.249287 containerd[1511]: time="2026-03-07T01:07:25.249050104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 7 01:07:25.274558 kubelet[2572]: I0307 01:07:25.274111 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-6db997cf49-csqbw" podStartSLOduration=3.167229036 podStartE2EDuration="6.274092616s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:19.992310313 +0000 UTC m=+17.952037063" lastFinishedPulling="2026-03-07 01:07:23.099173893 +0000 UTC m=+21.058900643" observedRunningTime="2026-03-07 01:07:23.278567877 +0000 UTC m=+21.238294627" watchObservedRunningTime="2026-03-07 01:07:25.274092616 +0000 UTC m=+23.233819396" Mar 7 01:07:26.138239 kubelet[2572]: E0307 01:07:26.137768 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:28.138566 kubelet[2572]: E0307 01:07:28.137978 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:30.143627 kubelet[2572]: E0307 01:07:30.143483 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:32.139941 kubelet[2572]: E0307 01:07:32.139896 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:32.317892 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount323224317.mount: Deactivated successfully. Mar 7 01:07:32.345776 containerd[1511]: time="2026-03-07T01:07:32.345729646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:32.346845 containerd[1511]: time="2026-03-07T01:07:32.346752485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 7 01:07:32.347908 containerd[1511]: time="2026-03-07T01:07:32.347893092Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:32.350810 containerd[1511]: time="2026-03-07T01:07:32.350365744Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:32.350810 containerd[1511]: time="2026-03-07T01:07:32.350722334Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 7.101468836s" Mar 7 01:07:32.350810 containerd[1511]: time="2026-03-07T01:07:32.350740824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 7 01:07:32.354605 containerd[1511]: time="2026-03-07T01:07:32.354579003Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 7 01:07:32.374459 containerd[1511]: time="2026-03-07T01:07:32.374417653Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00\"" Mar 7 01:07:32.375335 containerd[1511]: time="2026-03-07T01:07:32.375311275Z" level=info msg="StartContainer for \"ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00\"" Mar 7 01:07:32.405313 systemd[1]: Started cri-containerd-ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00.scope - libcontainer container ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00. Mar 7 01:07:32.429607 containerd[1511]: time="2026-03-07T01:07:32.429569243Z" level=info msg="StartContainer for \"ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00\" returns successfully" Mar 7 01:07:32.460748 systemd[1]: cri-containerd-ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00.scope: Deactivated successfully. 
Mar 7 01:07:32.572348 containerd[1511]: time="2026-03-07T01:07:32.572300290Z" level=info msg="shim disconnected" id=ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00 namespace=k8s.io Mar 7 01:07:32.572348 containerd[1511]: time="2026-03-07T01:07:32.572340835Z" level=warning msg="cleaning up after shim disconnected" id=ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00 namespace=k8s.io Mar 7 01:07:32.572348 containerd[1511]: time="2026-03-07T01:07:32.572348097Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:07:33.266409 containerd[1511]: time="2026-03-07T01:07:33.266243306Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 7 01:07:33.320671 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ff2faebe0d7763f7171698ef755c15c6c55e1e33e29666127420061a36528f00-rootfs.mount: Deactivated successfully. Mar 7 01:07:34.141525 kubelet[2572]: E0307 01:07:34.141438 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:36.138994 kubelet[2572]: E0307 01:07:36.138091 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:37.287882 containerd[1511]: time="2026-03-07T01:07:37.287818717Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:37.289024 containerd[1511]: time="2026-03-07T01:07:37.288886745Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 7 01:07:37.290034 containerd[1511]: time="2026-03-07T01:07:37.289855726Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:37.291778 containerd[1511]: time="2026-03-07T01:07:37.291743849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:37.292717 containerd[1511]: time="2026-03-07T01:07:37.292233012Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.025940507s" Mar 7 01:07:37.292717 containerd[1511]: time="2026-03-07T01:07:37.292262809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 7 01:07:37.296203 containerd[1511]: time="2026-03-07T01:07:37.296166868Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 7 01:07:37.308612 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2753377632.mount: Deactivated successfully. 
Mar 7 01:07:37.311740 containerd[1511]: time="2026-03-07T01:07:37.311702395Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05\"" Mar 7 01:07:37.312369 containerd[1511]: time="2026-03-07T01:07:37.312305758Z" level=info msg="StartContainer for \"7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05\"" Mar 7 01:07:37.341320 systemd[1]: Started cri-containerd-7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05.scope - libcontainer container 7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05. Mar 7 01:07:37.373645 containerd[1511]: time="2026-03-07T01:07:37.373600947Z" level=info msg="StartContainer for \"7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05\" returns successfully" Mar 7 01:07:37.817341 containerd[1511]: time="2026-03-07T01:07:37.816367909Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 7 01:07:37.820648 systemd[1]: cri-containerd-7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05.scope: Deactivated successfully. Mar 7 01:07:37.843240 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05-rootfs.mount: Deactivated successfully. 
Mar 7 01:07:37.844980 containerd[1511]: time="2026-03-07T01:07:37.844907613Z" level=info msg="shim disconnected" id=7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05 namespace=k8s.io Mar 7 01:07:37.844980 containerd[1511]: time="2026-03-07T01:07:37.844953536Z" level=warning msg="cleaning up after shim disconnected" id=7894ee9f4be24b9923852c4a8296fc2ce55062421193aff241c781e9d8ff4e05 namespace=k8s.io Mar 7 01:07:37.844980 containerd[1511]: time="2026-03-07T01:07:37.844961339Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:07:37.897748 kubelet[2572]: I0307 01:07:37.897708 2572 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Mar 7 01:07:37.936723 systemd[1]: Created slice kubepods-burstable-pod6afa7af7_0921_4201_99cd_3abd2378e890.slice - libcontainer container kubepods-burstable-pod6afa7af7_0921_4201_99cd_3abd2378e890.slice. Mar 7 01:07:37.942901 systemd[1]: Created slice kubepods-besteffort-pod06a6379b_7db4_4dbd_a0fc_bb463be89e8c.slice - libcontainer container kubepods-besteffort-pod06a6379b_7db4_4dbd_a0fc_bb463be89e8c.slice. Mar 7 01:07:37.952641 systemd[1]: Created slice kubepods-burstable-podd25ec7da_6038_4c24_89dc_467334f42af3.slice - libcontainer container kubepods-burstable-podd25ec7da_6038_4c24_89dc_467334f42af3.slice. Mar 7 01:07:37.957169 systemd[1]: Created slice kubepods-besteffort-pod57352416_50b9_4228_9b61_6bb3108b4b1a.slice - libcontainer container kubepods-besteffort-pod57352416_50b9_4228_9b61_6bb3108b4b1a.slice. Mar 7 01:07:37.964417 systemd[1]: Created slice kubepods-besteffort-podffcd3157_37de_4e8c_903c_bcf1e17ebdb3.slice - libcontainer container kubepods-besteffort-podffcd3157_37de_4e8c_903c_bcf1e17ebdb3.slice. Mar 7 01:07:37.975172 systemd[1]: Created slice kubepods-besteffort-podacf6180d_a049_431a_b0ff_bb13b9279e69.slice - libcontainer container kubepods-besteffort-podacf6180d_a049_431a_b0ff_bb13b9279e69.slice. 
Mar 7 01:07:37.978258 systemd[1]: Created slice kubepods-besteffort-pod408b461e_6520_4c86_a37a_f5d60a0218ca.slice - libcontainer container kubepods-besteffort-pod408b461e_6520_4c86_a37a_f5d60a0218ca.slice. Mar 7 01:07:37.992080 kubelet[2572]: I0307 01:07:37.992048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-backend-key-pair\") pod \"whisker-764d867d9c-fngzt\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " pod="calico-system/whisker-764d867d9c-fngzt" Mar 7 01:07:37.992080 kubelet[2572]: I0307 01:07:37.992077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5gx2\" (UniqueName: \"kubernetes.io/projected/57352416-50b9-4228-9b61-6bb3108b4b1a-kube-api-access-g5gx2\") pod \"calico-kube-controllers-c8d76dd9d-hqr58\" (UID: \"57352416-50b9-4228-9b61-6bb3108b4b1a\") " pod="calico-system/calico-kube-controllers-c8d76dd9d-hqr58" Mar 7 01:07:37.992251 kubelet[2572]: I0307 01:07:37.992093 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf6180d-a049-431a-b0ff-bb13b9279e69-config\") pod \"goldmane-9f7667bb8-85bqd\" (UID: \"acf6180d-a049-431a-b0ff-bb13b9279e69\") " pod="calico-system/goldmane-9f7667bb8-85bqd" Mar 7 01:07:37.992251 kubelet[2572]: I0307 01:07:37.992104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf6180d-a049-431a-b0ff-bb13b9279e69-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-85bqd\" (UID: \"acf6180d-a049-431a-b0ff-bb13b9279e69\") " pod="calico-system/goldmane-9f7667bb8-85bqd" Mar 7 01:07:37.992251 kubelet[2572]: I0307 01:07:37.992115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/acf6180d-a049-431a-b0ff-bb13b9279e69-goldmane-key-pair\") pod \"goldmane-9f7667bb8-85bqd\" (UID: \"acf6180d-a049-431a-b0ff-bb13b9279e69\") " pod="calico-system/goldmane-9f7667bb8-85bqd" Mar 7 01:07:37.992251 kubelet[2572]: I0307 01:07:37.992126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4gb\" (UniqueName: \"kubernetes.io/projected/acf6180d-a049-431a-b0ff-bb13b9279e69-kube-api-access-bv4gb\") pod \"goldmane-9f7667bb8-85bqd\" (UID: \"acf6180d-a049-431a-b0ff-bb13b9279e69\") " pod="calico-system/goldmane-9f7667bb8-85bqd" Mar 7 01:07:37.992251 kubelet[2572]: I0307 01:07:37.992139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57352416-50b9-4228-9b61-6bb3108b4b1a-tigera-ca-bundle\") pod \"calico-kube-controllers-c8d76dd9d-hqr58\" (UID: \"57352416-50b9-4228-9b61-6bb3108b4b1a\") " pod="calico-system/calico-kube-controllers-c8d76dd9d-hqr58" Mar 7 01:07:37.992343 kubelet[2572]: I0307 01:07:37.992149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6afa7af7-0921-4201-99cd-3abd2378e890-config-volume\") pod \"coredns-7d764666f9-f9xbn\" (UID: \"6afa7af7-0921-4201-99cd-3abd2378e890\") " pod="kube-system/coredns-7d764666f9-f9xbn" Mar 7 01:07:37.992343 kubelet[2572]: I0307 01:07:37.992160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vtj\" (UniqueName: \"kubernetes.io/projected/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-kube-api-access-q5vtj\") pod \"whisker-764d867d9c-fngzt\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " pod="calico-system/whisker-764d867d9c-fngzt" Mar 7 01:07:37.992343 kubelet[2572]: I0307 01:07:37.992173 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ffcd3157-37de-4e8c-903c-bcf1e17ebdb3-calico-apiserver-certs\") pod \"calico-apiserver-6f7869694b-75lrk\" (UID: \"ffcd3157-37de-4e8c-903c-bcf1e17ebdb3\") " pod="calico-system/calico-apiserver-6f7869694b-75lrk" Mar 7 01:07:37.992343 kubelet[2572]: I0307 01:07:37.992185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spq67\" (UniqueName: \"kubernetes.io/projected/ffcd3157-37de-4e8c-903c-bcf1e17ebdb3-kube-api-access-spq67\") pod \"calico-apiserver-6f7869694b-75lrk\" (UID: \"ffcd3157-37de-4e8c-903c-bcf1e17ebdb3\") " pod="calico-system/calico-apiserver-6f7869694b-75lrk" Mar 7 01:07:37.992343 kubelet[2572]: I0307 01:07:37.992210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdsb5\" (UniqueName: \"kubernetes.io/projected/6afa7af7-0921-4201-99cd-3abd2378e890-kube-api-access-tdsb5\") pod \"coredns-7d764666f9-f9xbn\" (UID: \"6afa7af7-0921-4201-99cd-3abd2378e890\") " pod="kube-system/coredns-7d764666f9-f9xbn" Mar 7 01:07:37.992430 kubelet[2572]: I0307 01:07:37.992220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d25ec7da-6038-4c24-89dc-467334f42af3-config-volume\") pod \"coredns-7d764666f9-qsb6f\" (UID: \"d25ec7da-6038-4c24-89dc-467334f42af3\") " pod="kube-system/coredns-7d764666f9-qsb6f" Mar 7 01:07:37.992430 kubelet[2572]: I0307 01:07:37.992231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjh5q\" (UniqueName: \"kubernetes.io/projected/d25ec7da-6038-4c24-89dc-467334f42af3-kube-api-access-kjh5q\") pod \"coredns-7d764666f9-qsb6f\" (UID: \"d25ec7da-6038-4c24-89dc-467334f42af3\") " 
pod="kube-system/coredns-7d764666f9-qsb6f" Mar 7 01:07:37.992430 kubelet[2572]: I0307 01:07:37.992248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-nginx-config\") pod \"whisker-764d867d9c-fngzt\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " pod="calico-system/whisker-764d867d9c-fngzt" Mar 7 01:07:37.992430 kubelet[2572]: I0307 01:07:37.992259 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-ca-bundle\") pod \"whisker-764d867d9c-fngzt\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " pod="calico-system/whisker-764d867d9c-fngzt" Mar 7 01:07:37.992430 kubelet[2572]: I0307 01:07:37.992270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/408b461e-6520-4c86-a37a-f5d60a0218ca-calico-apiserver-certs\") pod \"calico-apiserver-6f7869694b-rjchv\" (UID: \"408b461e-6520-4c86-a37a-f5d60a0218ca\") " pod="calico-system/calico-apiserver-6f7869694b-rjchv" Mar 7 01:07:37.992512 kubelet[2572]: I0307 01:07:37.992281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxlz\" (UniqueName: \"kubernetes.io/projected/408b461e-6520-4c86-a37a-f5d60a0218ca-kube-api-access-vnxlz\") pod \"calico-apiserver-6f7869694b-rjchv\" (UID: \"408b461e-6520-4c86-a37a-f5d60a0218ca\") " pod="calico-system/calico-apiserver-6f7869694b-rjchv" Mar 7 01:07:38.143871 systemd[1]: Created slice kubepods-besteffort-pod0869cd07_08c1_477a_a901_3ac76e743a03.slice - libcontainer container kubepods-besteffort-pod0869cd07_08c1_477a_a901_3ac76e743a03.slice. 
Mar 7 01:07:38.149120 containerd[1511]: time="2026-03-07T01:07:38.149086048Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2x9rm,Uid:0869cd07-08c1-477a-a901-3ac76e743a03,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:38.250046 containerd[1511]: time="2026-03-07T01:07:38.249970897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-f9xbn,Uid:6afa7af7-0921-4201-99cd-3abd2378e890,Namespace:kube-system,Attempt:0,}" Mar 7 01:07:38.256143 containerd[1511]: time="2026-03-07T01:07:38.256098417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-764d867d9c-fngzt,Uid:06a6379b-7db4-4dbd-a0fc-bb463be89e8c,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:38.260003 containerd[1511]: time="2026-03-07T01:07:38.259723122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-qsb6f,Uid:d25ec7da-6038-4c24-89dc-467334f42af3,Namespace:kube-system,Attempt:0,}" Mar 7 01:07:38.264122 containerd[1511]: time="2026-03-07T01:07:38.264089365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c8d76dd9d-hqr58,Uid:57352416-50b9-4228-9b61-6bb3108b4b1a,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:38.266186 containerd[1511]: time="2026-03-07T01:07:38.266149563Z" level=error msg="Failed to destroy network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.266495 containerd[1511]: time="2026-03-07T01:07:38.266475296Z" level=error msg="encountered an error cleaning up failed sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Mar 7 01:07:38.266548 containerd[1511]: time="2026-03-07T01:07:38.266505353Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2x9rm,Uid:0869cd07-08c1-477a-a901-3ac76e743a03,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.266797 kubelet[2572]: E0307 01:07:38.266732 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.266797 kubelet[2572]: E0307 01:07:38.266769 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2x9rm" Mar 7 01:07:38.266797 kubelet[2572]: E0307 01:07:38.266782 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-2x9rm" Mar 7 01:07:38.267055 kubelet[2572]: E0307 01:07:38.266828 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2x9rm_calico-system(0869cd07-08c1-477a-a901-3ac76e743a03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2x9rm_calico-system(0869cd07-08c1-477a-a901-3ac76e743a03)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:38.270305 containerd[1511]: time="2026-03-07T01:07:38.270269948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-75lrk,Uid:ffcd3157-37de-4e8c-903c-bcf1e17ebdb3,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:38.280306 kubelet[2572]: I0307 01:07:38.279564 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:07:38.280372 containerd[1511]: time="2026-03-07T01:07:38.280305159Z" level=info msg="StopPodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\"" Mar 7 01:07:38.280424 containerd[1511]: time="2026-03-07T01:07:38.280406799Z" level=info msg="Ensure that sandbox a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741 in task-service has been cleanup successfully" Mar 7 01:07:38.286038 containerd[1511]: time="2026-03-07T01:07:38.285560212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-rjchv,Uid:408b461e-6520-4c86-a37a-f5d60a0218ca,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:38.295009 containerd[1511]: 
time="2026-03-07T01:07:38.294247588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-85bqd,Uid:acf6180d-a049-431a-b0ff-bb13b9279e69,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:38.295009 containerd[1511]: time="2026-03-07T01:07:38.294323998Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 01:07:38.362510 containerd[1511]: time="2026-03-07T01:07:38.362260988Z" level=error msg="StopPodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" failed" error="failed to destroy network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.363517 kubelet[2572]: E0307 01:07:38.362667 2572 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:07:38.363517 kubelet[2572]: E0307 01:07:38.362710 2572 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741"} Mar 7 01:07:38.363517 kubelet[2572]: E0307 01:07:38.362749 2572 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0869cd07-08c1-477a-a901-3ac76e743a03\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 7 01:07:38.363517 kubelet[2572]: E0307 01:07:38.362770 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0869cd07-08c1-477a-a901-3ac76e743a03\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2x9rm" podUID="0869cd07-08c1-477a-a901-3ac76e743a03" Mar 7 01:07:38.412968 containerd[1511]: time="2026-03-07T01:07:38.411756546Z" level=info msg="CreateContainer within sandbox \"5f984f9e6ab16ebde163d1462d928b9d4e6c203cf5359101a4f869ade855623d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ec75b89a203d741bc3c5ff05429f82a3c692393a003b7f0d45dda3d01549365f\"" Mar 7 01:07:38.415050 containerd[1511]: time="2026-03-07T01:07:38.414335531Z" level=info msg="StartContainer for \"ec75b89a203d741bc3c5ff05429f82a3c692393a003b7f0d45dda3d01549365f\"" Mar 7 01:07:38.449724 containerd[1511]: time="2026-03-07T01:07:38.449684486Z" level=error msg="Failed to destroy network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.450504 containerd[1511]: time="2026-03-07T01:07:38.450476355Z" level=error msg="encountered an error cleaning up failed sandbox 
\"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.450580 containerd[1511]: time="2026-03-07T01:07:38.450520255Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-f9xbn,Uid:6afa7af7-0921-4201-99cd-3abd2378e890,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.450933 kubelet[2572]: E0307 01:07:38.450820 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.450933 kubelet[2572]: E0307 01:07:38.450867 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-f9xbn" Mar 7 01:07:38.451019 kubelet[2572]: E0307 01:07:38.450894 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-f9xbn" Mar 7 01:07:38.452100 kubelet[2572]: E0307 01:07:38.451957 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-f9xbn_kube-system(6afa7af7-0921-4201-99cd-3abd2378e890)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-f9xbn_kube-system(6afa7af7-0921-4201-99cd-3abd2378e890)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-f9xbn" podUID="6afa7af7-0921-4201-99cd-3abd2378e890" Mar 7 01:07:38.493308 systemd[1]: Started cri-containerd-ec75b89a203d741bc3c5ff05429f82a3c692393a003b7f0d45dda3d01549365f.scope - libcontainer container ec75b89a203d741bc3c5ff05429f82a3c692393a003b7f0d45dda3d01549365f. 
Mar 7 01:07:38.525103 containerd[1511]: time="2026-03-07T01:07:38.525070449Z" level=error msg="Failed to destroy network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.525563 containerd[1511]: time="2026-03-07T01:07:38.525544356Z" level=error msg="encountered an error cleaning up failed sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.525645 containerd[1511]: time="2026-03-07T01:07:38.525630973Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-764d867d9c-fngzt,Uid:06a6379b-7db4-4dbd-a0fc-bb463be89e8c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.525896 kubelet[2572]: E0307 01:07:38.525858 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.526000 kubelet[2572]: E0307 01:07:38.525986 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-764d867d9c-fngzt" Mar 7 01:07:38.526048 kubelet[2572]: E0307 01:07:38.526039 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-764d867d9c-fngzt" Mar 7 01:07:38.526140 kubelet[2572]: E0307 01:07:38.526121 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-764d867d9c-fngzt_calico-system(06a6379b-7db4-4dbd-a0fc-bb463be89e8c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-764d867d9c-fngzt_calico-system(06a6379b-7db4-4dbd-a0fc-bb463be89e8c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-764d867d9c-fngzt" podUID="06a6379b-7db4-4dbd-a0fc-bb463be89e8c" Mar 7 01:07:38.553621 containerd[1511]: time="2026-03-07T01:07:38.553586713Z" level=error msg="Failed to destroy network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Mar 7 01:07:38.554039 containerd[1511]: time="2026-03-07T01:07:38.554018956Z" level=error msg="encountered an error cleaning up failed sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.554240 containerd[1511]: time="2026-03-07T01:07:38.554224509Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-75lrk,Uid:ffcd3157-37de-4e8c-903c-bcf1e17ebdb3,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.554525 kubelet[2572]: E0307 01:07:38.554500 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.554610 kubelet[2572]: E0307 01:07:38.554600 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6f7869694b-75lrk" Mar 7 01:07:38.554652 kubelet[2572]: E0307 01:07:38.554643 
2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6f7869694b-75lrk" Mar 7 01:07:38.554738 kubelet[2572]: E0307 01:07:38.554721 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f7869694b-75lrk_calico-system(ffcd3157-37de-4e8c-903c-bcf1e17ebdb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f7869694b-75lrk_calico-system(ffcd3157-37de-4e8c-903c-bcf1e17ebdb3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6f7869694b-75lrk" podUID="ffcd3157-37de-4e8c-903c-bcf1e17ebdb3" Mar 7 01:07:38.565794 containerd[1511]: time="2026-03-07T01:07:38.565754285Z" level=error msg="Failed to destroy network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.566067 containerd[1511]: time="2026-03-07T01:07:38.566044783Z" level=error msg="encountered an error cleaning up failed sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.566102 containerd[1511]: time="2026-03-07T01:07:38.566079438Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-qsb6f,Uid:d25ec7da-6038-4c24-89dc-467334f42af3,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.566412 kubelet[2572]: E0307 01:07:38.566387 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.566465 kubelet[2572]: E0307 01:07:38.566426 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-qsb6f" Mar 7 01:07:38.566465 kubelet[2572]: E0307 01:07:38.566456 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-qsb6f" Mar 7 01:07:38.566535 kubelet[2572]: E0307 01:07:38.566507 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-qsb6f_kube-system(d25ec7da-6038-4c24-89dc-467334f42af3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-qsb6f_kube-system(d25ec7da-6038-4c24-89dc-467334f42af3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-qsb6f" podUID="d25ec7da-6038-4c24-89dc-467334f42af3" Mar 7 01:07:38.568623 containerd[1511]: time="2026-03-07T01:07:38.568601574Z" level=info msg="StartContainer for \"ec75b89a203d741bc3c5ff05429f82a3c692393a003b7f0d45dda3d01549365f\" returns successfully" Mar 7 01:07:38.584338 containerd[1511]: time="2026-03-07T01:07:38.584307233Z" level=error msg="Failed to destroy network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.584683 containerd[1511]: time="2026-03-07T01:07:38.584581887Z" level=error msg="encountered an error cleaning up failed sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.584683 containerd[1511]: time="2026-03-07T01:07:38.584621529Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c8d76dd9d-hqr58,Uid:57352416-50b9-4228-9b61-6bb3108b4b1a,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.584762 kubelet[2572]: E0307 01:07:38.584741 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.584794 kubelet[2572]: E0307 01:07:38.584773 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c8d76dd9d-hqr58" Mar 7 01:07:38.584794 kubelet[2572]: E0307 01:07:38.584786 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c8d76dd9d-hqr58" Mar 7 01:07:38.584834 kubelet[2572]: E0307 01:07:38.584818 2572 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-c8d76dd9d-hqr58_calico-system(57352416-50b9-4228-9b61-6bb3108b4b1a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-c8d76dd9d-hqr58_calico-system(57352416-50b9-4228-9b61-6bb3108b4b1a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-c8d76dd9d-hqr58" podUID="57352416-50b9-4228-9b61-6bb3108b4b1a" Mar 7 01:07:38.586142 containerd[1511]: time="2026-03-07T01:07:38.586120412Z" level=error msg="Failed to destroy network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.586508 containerd[1511]: time="2026-03-07T01:07:38.586491436Z" level=error msg="encountered an error cleaning up failed sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.586606 containerd[1511]: time="2026-03-07T01:07:38.586579405Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-85bqd,Uid:acf6180d-a049-431a-b0ff-bb13b9279e69,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.586870 kubelet[2572]: E0307 01:07:38.586770 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.586870 kubelet[2572]: E0307 01:07:38.586835 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-85bqd" Mar 7 01:07:38.586870 kubelet[2572]: E0307 01:07:38.586848 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-85bqd" Mar 7 01:07:38.587038 kubelet[2572]: E0307 01:07:38.587002 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-85bqd_calico-system(acf6180d-a049-431a-b0ff-bb13b9279e69)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-85bqd_calico-system(acf6180d-a049-431a-b0ff-bb13b9279e69)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-85bqd" podUID="acf6180d-a049-431a-b0ff-bb13b9279e69" Mar 7 01:07:38.595597 containerd[1511]: time="2026-03-07T01:07:38.595570038Z" level=error msg="Failed to destroy network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.595862 containerd[1511]: time="2026-03-07T01:07:38.595844712Z" level=error msg="encountered an error cleaning up failed sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.595902 containerd[1511]: time="2026-03-07T01:07:38.595877032Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-rjchv,Uid:408b461e-6520-4c86-a37a-f5d60a0218ca,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.596030 kubelet[2572]: E0307 01:07:38.596007 2572 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:07:38.596079 kubelet[2572]: E0307 01:07:38.596062 2572 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6f7869694b-rjchv" Mar 7 01:07:38.596100 kubelet[2572]: E0307 01:07:38.596075 2572 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6f7869694b-rjchv" Mar 7 01:07:38.596141 kubelet[2572]: E0307 01:07:38.596121 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f7869694b-rjchv_calico-system(408b461e-6520-4c86-a37a-f5d60a0218ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f7869694b-rjchv_calico-system(408b461e-6520-4c86-a37a-f5d60a0218ca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-apiserver-6f7869694b-rjchv" podUID="408b461e-6520-4c86-a37a-f5d60a0218ca" Mar 7 01:07:39.284504 kubelet[2572]: I0307 01:07:39.284445 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:07:39.286831 containerd[1511]: time="2026-03-07T01:07:39.286753116Z" level=info msg="StopPodSandbox for \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\"" Mar 7 01:07:39.287163 containerd[1511]: time="2026-03-07T01:07:39.287110538Z" level=info msg="Ensure that sandbox 4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e in task-service has been cleanup successfully" Mar 7 01:07:39.287677 kubelet[2572]: I0307 01:07:39.287496 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:07:39.290765 containerd[1511]: time="2026-03-07T01:07:39.290152203Z" level=info msg="StopPodSandbox for \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\"" Mar 7 01:07:39.290765 containerd[1511]: time="2026-03-07T01:07:39.290472858Z" level=info msg="Ensure that sandbox e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe in task-service has been cleanup successfully" Mar 7 01:07:39.295301 kubelet[2572]: I0307 01:07:39.295274 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:07:39.300386 containerd[1511]: time="2026-03-07T01:07:39.299779768Z" level=info msg="StopPodSandbox for \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\"" Mar 7 01:07:39.301583 containerd[1511]: time="2026-03-07T01:07:39.301000078Z" level=info msg="Ensure that sandbox 035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f in task-service has been cleanup successfully" Mar 7 01:07:39.303714 
kubelet[2572]: I0307 01:07:39.303040 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:07:39.308961 containerd[1511]: time="2026-03-07T01:07:39.308639656Z" level=info msg="StopPodSandbox for \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\"" Mar 7 01:07:39.315015 containerd[1511]: time="2026-03-07T01:07:39.314978497Z" level=info msg="Ensure that sandbox 431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07 in task-service has been cleanup successfully" Mar 7 01:07:39.324248 kubelet[2572]: I0307 01:07:39.323541 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:07:39.323873 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f-shm.mount: Deactivated successfully. Mar 7 01:07:39.324358 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e-shm.mount: Deactivated successfully. Mar 7 01:07:39.324481 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe-shm.mount: Deactivated successfully. 
Mar 7 01:07:39.325526 containerd[1511]: time="2026-03-07T01:07:39.324927978Z" level=info msg="StopPodSandbox for \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\"" Mar 7 01:07:39.325526 containerd[1511]: time="2026-03-07T01:07:39.325121653Z" level=info msg="Ensure that sandbox eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735 in task-service has been cleanup successfully" Mar 7 01:07:39.336537 kubelet[2572]: I0307 01:07:39.336520 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:07:39.338163 containerd[1511]: time="2026-03-07T01:07:39.338145861Z" level=info msg="StopPodSandbox for \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\"" Mar 7 01:07:39.338480 containerd[1511]: time="2026-03-07T01:07:39.338467437Z" level=info msg="Ensure that sandbox a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7 in task-service has been cleanup successfully" Mar 7 01:07:39.343881 kubelet[2572]: I0307 01:07:39.343856 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:07:39.344565 containerd[1511]: time="2026-03-07T01:07:39.344542584Z" level=info msg="StopPodSandbox for \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\"" Mar 7 01:07:39.344757 containerd[1511]: time="2026-03-07T01:07:39.344741296Z" level=info msg="Ensure that sandbox 58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05 in task-service has been cleanup successfully" Mar 7 01:07:39.372583 kubelet[2572]: I0307 01:07:39.372286 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-42826" podStartSLOduration=2.120543996 podStartE2EDuration="20.372274598s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:20.027241204 
+0000 UTC m=+17.986967944" lastFinishedPulling="2026-03-07 01:07:38.278971806 +0000 UTC m=+36.238698546" observedRunningTime="2026-03-07 01:07:39.371561086 +0000 UTC m=+37.331287837" watchObservedRunningTime="2026-03-07 01:07:39.372274598 +0000 UTC m=+37.332001338" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.521 [INFO][3838] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.525 [INFO][3838] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" iface="eth0" netns="/var/run/netns/cni-a9d1d525-2d3d-8194-a666-dfb6f8a31434" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.525 [INFO][3838] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" iface="eth0" netns="/var/run/netns/cni-a9d1d525-2d3d-8194-a666-dfb6f8a31434" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3838] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" iface="eth0" netns="/var/run/netns/cni-a9d1d525-2d3d-8194-a666-dfb6f8a31434" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3838] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3838] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.569 [INFO][3909] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.570 [INFO][3909] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.570 [INFO][3909] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.576 [WARNING][3909] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.576 [INFO][3909] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.577 [INFO][3909] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.590517 containerd[1511]: 2026-03-07 01:07:39.586 [INFO][3838] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:07:39.593474 containerd[1511]: time="2026-03-07T01:07:39.593438818Z" level=info msg="TearDown network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\" successfully" Mar 7 01:07:39.593574 containerd[1511]: time="2026-03-07T01:07:39.593563775Z" level=info msg="StopPodSandbox for \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\" returns successfully" Mar 7 01:07:39.593971 systemd[1]: run-netns-cni\x2da9d1d525\x2d2d3d\x2d8194\x2da666\x2ddfb6f8a31434.mount: Deactivated successfully. 
Mar 7 01:07:39.596280 containerd[1511]: time="2026-03-07T01:07:39.596110591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c8d76dd9d-hqr58,Uid:57352416-50b9-4228-9b61-6bb3108b4b1a,Namespace:calico-system,Attempt:1,}" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.485 [INFO][3775] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.485 [INFO][3775] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" iface="eth0" netns="/var/run/netns/cni-29802cf6-4244-1ba6-f2cb-d2c0ccba4fee" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.485 [INFO][3775] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" iface="eth0" netns="/var/run/netns/cni-29802cf6-4244-1ba6-f2cb-d2c0ccba4fee" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.486 [INFO][3775] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" iface="eth0" netns="/var/run/netns/cni-29802cf6-4244-1ba6-f2cb-d2c0ccba4fee" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.486 [INFO][3775] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.486 [INFO][3775] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.596 [INFO][3887] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.597 [INFO][3887] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.597 [INFO][3887] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.604 [WARNING][3887] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.604 [INFO][3887] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.606 [INFO][3887] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.624235 containerd[1511]: 2026-03-07 01:07:39.612 [INFO][3775] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:07:39.624710 containerd[1511]: time="2026-03-07T01:07:39.624689929Z" level=info msg="TearDown network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\" successfully" Mar 7 01:07:39.624834 containerd[1511]: time="2026-03-07T01:07:39.624775323Z" level=info msg="StopPodSandbox for \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\" returns successfully" Mar 7 01:07:39.628434 containerd[1511]: time="2026-03-07T01:07:39.628416061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-qsb6f,Uid:d25ec7da-6038-4c24-89dc-467334f42af3,Namespace:kube-system,Attempt:1,}" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.474 [INFO][3793] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.474 [INFO][3793] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" iface="eth0" netns="/var/run/netns/cni-de89a7f5-f6ee-a903-3320-9fc4151a7530" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.474 [INFO][3793] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" iface="eth0" netns="/var/run/netns/cni-de89a7f5-f6ee-a903-3320-9fc4151a7530" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.476 [INFO][3793] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" iface="eth0" netns="/var/run/netns/cni-de89a7f5-f6ee-a903-3320-9fc4151a7530" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.476 [INFO][3793] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.476 [INFO][3793] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.596 [INFO][3881] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.597 [INFO][3881] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.606 [INFO][3881] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.620 [WARNING][3881] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.620 [INFO][3881] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.623 [INFO][3881] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.641001 containerd[1511]: 2026-03-07 01:07:39.630 [INFO][3793] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:07:39.641859 containerd[1511]: time="2026-03-07T01:07:39.641380775Z" level=info msg="TearDown network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\" successfully" Mar 7 01:07:39.641859 containerd[1511]: time="2026-03-07T01:07:39.641398513Z" level=info msg="StopPodSandbox for \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\" returns successfully" Mar 7 01:07:39.642884 containerd[1511]: time="2026-03-07T01:07:39.642849556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-f9xbn,Uid:6afa7af7-0921-4201-99cd-3abd2378e890,Namespace:kube-system,Attempt:1,}" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.485 [INFO][3806] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.485 [INFO][3806] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" iface="eth0" netns="/var/run/netns/cni-6be8196e-409f-ea21-b98b-8d01773999ff" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.486 [INFO][3806] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" iface="eth0" netns="/var/run/netns/cni-6be8196e-409f-ea21-b98b-8d01773999ff" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.491 [INFO][3806] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" iface="eth0" netns="/var/run/netns/cni-6be8196e-409f-ea21-b98b-8d01773999ff" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.491 [INFO][3806] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.491 [INFO][3806] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.603 [INFO][3889] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.603 [INFO][3889] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.623 [INFO][3889] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.629 [WARNING][3889] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.629 [INFO][3889] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.631 [INFO][3889] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.647877 containerd[1511]: 2026-03-07 01:07:39.639 [INFO][3806] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:07:39.648387 containerd[1511]: time="2026-03-07T01:07:39.647944581Z" level=info msg="TearDown network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\" successfully" Mar 7 01:07:39.648387 containerd[1511]: time="2026-03-07T01:07:39.647975970Z" level=info msg="StopPodSandbox for \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\" returns successfully" Mar 7 01:07:39.652217 containerd[1511]: time="2026-03-07T01:07:39.652177089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-85bqd,Uid:acf6180d-a049-431a-b0ff-bb13b9279e69,Namespace:calico-system,Attempt:1,}" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.503 [INFO][3839] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.504 [INFO][3839] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" iface="eth0" netns="/var/run/netns/cni-14e5f6a5-d26b-17be-b070-658ee4028e5f" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.504 [INFO][3839] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" iface="eth0" netns="/var/run/netns/cni-14e5f6a5-d26b-17be-b070-658ee4028e5f" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.505 [INFO][3839] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" iface="eth0" netns="/var/run/netns/cni-14e5f6a5-d26b-17be-b070-658ee4028e5f" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.505 [INFO][3839] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.505 [INFO][3839] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.605 [INFO][3898] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.605 [INFO][3898] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.631 [INFO][3898] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.639 [WARNING][3898] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.639 [INFO][3898] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.640 [INFO][3898] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.661535 containerd[1511]: 2026-03-07 01:07:39.643 [INFO][3839] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:07:39.661970 containerd[1511]: time="2026-03-07T01:07:39.661645253Z" level=info msg="TearDown network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\" successfully" Mar 7 01:07:39.661970 containerd[1511]: time="2026-03-07T01:07:39.661659946Z" level=info msg="StopPodSandbox for \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\" returns successfully" Mar 7 01:07:39.667006 containerd[1511]: time="2026-03-07T01:07:39.666847546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-75lrk,Uid:ffcd3157-37de-4e8c-903c-bcf1e17ebdb3,Namespace:calico-system,Attempt:1,}" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.513 [INFO][3815] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.513 [INFO][3815] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" iface="eth0" netns="/var/run/netns/cni-16d99109-2cfe-8074-7792-851676b05102" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.515 [INFO][3815] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" iface="eth0" netns="/var/run/netns/cni-16d99109-2cfe-8074-7792-851676b05102" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.515 [INFO][3815] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" iface="eth0" netns="/var/run/netns/cni-16d99109-2cfe-8074-7792-851676b05102" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.515 [INFO][3815] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.515 [INFO][3815] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.608 [INFO][3901] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.609 [INFO][3901] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.642 [INFO][3901] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.649 [WARNING][3901] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.649 [INFO][3901] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.650 [INFO][3901] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.668004 containerd[1511]: 2026-03-07 01:07:39.664 [INFO][3815] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:07:39.670776 containerd[1511]: time="2026-03-07T01:07:39.670322153Z" level=info msg="TearDown network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\" successfully" Mar 7 01:07:39.670776 containerd[1511]: time="2026-03-07T01:07:39.670338809Z" level=info msg="StopPodSandbox for \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\" returns successfully" Mar 7 01:07:39.672359 containerd[1511]: time="2026-03-07T01:07:39.672189359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-rjchv,Uid:408b461e-6520-4c86-a37a-f5d60a0218ca,Namespace:calico-system,Attempt:1,}" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.525 [INFO][3809] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.526 [INFO][3809] cni-plugin/dataplane_linux.go 559: Deleting workload's device 
in netns. ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" iface="eth0" netns="/var/run/netns/cni-ec74d7ff-aae2-91d5-ef83-405857fd97ec" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3809] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" iface="eth0" netns="/var/run/netns/cni-ec74d7ff-aae2-91d5-ef83-405857fd97ec" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3809] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" iface="eth0" netns="/var/run/netns/cni-ec74d7ff-aae2-91d5-ef83-405857fd97ec" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3809] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.527 [INFO][3809] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.617 [INFO][3910] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.617 [INFO][3910] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.651 [INFO][3910] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.666 [WARNING][3910] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.666 [INFO][3910] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.673 [INFO][3910] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.698870 containerd[1511]: 2026-03-07 01:07:39.688 [INFO][3809] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:07:39.699536 containerd[1511]: time="2026-03-07T01:07:39.699515686Z" level=info msg="TearDown network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\" successfully" Mar 7 01:07:39.699674 containerd[1511]: time="2026-03-07T01:07:39.699575200Z" level=info msg="StopPodSandbox for \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\" returns successfully" Mar 7 01:07:39.811097 kubelet[2572]: I0307 01:07:39.811023 2572 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-backend-key-pair\") pod \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " Mar 7 01:07:39.811097 kubelet[2572]: I0307 01:07:39.811062 2572 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-nginx-config\" (UniqueName: \"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-nginx-config\") pod \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " Mar 7 01:07:39.811097 kubelet[2572]: I0307 01:07:39.811085 2572 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-kube-api-access-q5vtj\" (UniqueName: \"kubernetes.io/projected/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-kube-api-access-q5vtj\") pod \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " Mar 7 01:07:39.811097 kubelet[2572]: I0307 01:07:39.811097 2572 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-ca-bundle\") pod \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\" (UID: \"06a6379b-7db4-4dbd-a0fc-bb463be89e8c\") " Mar 7 01:07:39.812719 kubelet[2572]: I0307 01:07:39.812385 2572 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-nginx-config" pod "06a6379b-7db4-4dbd-a0fc-bb463be89e8c" (UID: "06a6379b-7db4-4dbd-a0fc-bb463be89e8c"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 01:07:39.813316 kubelet[2572]: I0307 01:07:39.813296 2572 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-ca-bundle" pod "06a6379b-7db4-4dbd-a0fc-bb463be89e8c" (UID: "06a6379b-7db4-4dbd-a0fc-bb463be89e8c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 01:07:39.823501 kubelet[2572]: I0307 01:07:39.822566 2572 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-backend-key-pair" pod "06a6379b-7db4-4dbd-a0fc-bb463be89e8c" (UID: "06a6379b-7db4-4dbd-a0fc-bb463be89e8c"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 01:07:39.824157 kubelet[2572]: I0307 01:07:39.824049 2572 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-kube-api-access-q5vtj" pod "06a6379b-7db4-4dbd-a0fc-bb463be89e8c" (UID: "06a6379b-7db4-4dbd-a0fc-bb463be89e8c"). InnerVolumeSpecName "kube-api-access-q5vtj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 01:07:39.860743 systemd-networkd[1407]: cali340665de010: Link UP Mar 7 01:07:39.865418 systemd-networkd[1407]: cali340665de010: Gained carrier Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.676 [ERROR][3943] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.690 [INFO][3943] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0 coredns-7d764666f9- kube-system d25ec7da-6038-4c24-89dc-467334f42af3 877 0 2026-03-07 01:07:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 coredns-7d764666f9-qsb6f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali340665de010 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.690 [INFO][3943] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.774 [INFO][3975] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" HandleID="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.786 [INFO][3975] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" HandleID="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fde90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"coredns-7d764666f9-qsb6f", "timestamp":"2026-03-07 01:07:39.774902856 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000372c60)} Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.786 [INFO][3975] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.786 [INFO][3975] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.786 [INFO][3975] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.791 [INFO][3975] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.797 [INFO][3975] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.804 [INFO][3975] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.807 [INFO][3975] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.809 [INFO][3975] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.809 [INFO][3975] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.812 [INFO][3975] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.824 [INFO][3975] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.833 [INFO][3975] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.59.1/26] block=192.168.59.0/26 handle="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.834 [INFO][3975] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.1/26] handle="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.834 [INFO][3975] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.910182 containerd[1511]: 2026-03-07 01:07:39.834 [INFO][3975] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.1/26] IPv6=[] ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" HandleID="k8s-pod-network.45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.910667 containerd[1511]: 2026-03-07 01:07:39.845 [INFO][3943] cni-plugin/k8s.go 418: Populated endpoint ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d25ec7da-6038-4c24-89dc-467334f42af3", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"coredns-7d764666f9-qsb6f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali340665de010", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:39.910667 containerd[1511]: 2026-03-07 01:07:39.847 [INFO][3943] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.1/32] ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.910667 containerd[1511]: 2026-03-07 01:07:39.847 [INFO][3943] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali340665de010 
ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.910667 containerd[1511]: 2026-03-07 01:07:39.874 [INFO][3943] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.910667 containerd[1511]: 2026-03-07 01:07:39.874 [INFO][3943] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d25ec7da-6038-4c24-89dc-467334f42af3", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f", 
Pod:"coredns-7d764666f9-qsb6f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali340665de010", MAC:"fe:b3:24:2c:f0:74", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:39.910814 containerd[1511]: 2026-03-07 01:07:39.900 [INFO][3943] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f" Namespace="kube-system" Pod="coredns-7d764666f9-qsb6f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:07:39.912143 kubelet[2572]: I0307 01:07:39.912039 2572 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-593f1c83d2\" DevicePath \"\"" Mar 7 01:07:39.912143 kubelet[2572]: I0307 01:07:39.912074 2572 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: 
\"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-nginx-config\") on node \"ci-4081-3-6-n-593f1c83d2\" DevicePath \"\"" Mar 7 01:07:39.912143 kubelet[2572]: I0307 01:07:39.912082 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q5vtj\" (UniqueName: \"kubernetes.io/projected/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-kube-api-access-q5vtj\") on node \"ci-4081-3-6-n-593f1c83d2\" DevicePath \"\"" Mar 7 01:07:39.912143 kubelet[2572]: I0307 01:07:39.912091 2572 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06a6379b-7db4-4dbd-a0fc-bb463be89e8c-whisker-ca-bundle\") on node \"ci-4081-3-6-n-593f1c83d2\" DevicePath \"\"" Mar 7 01:07:39.937261 containerd[1511]: time="2026-03-07T01:07:39.936742504Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:39.937261 containerd[1511]: time="2026-03-07T01:07:39.936777369Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:39.937261 containerd[1511]: time="2026-03-07T01:07:39.936785070Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:39.937261 containerd[1511]: time="2026-03-07T01:07:39.936837423Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:39.957774 systemd-networkd[1407]: cali92268a984ea: Link UP Mar 7 01:07:39.958574 systemd-networkd[1407]: cali92268a984ea: Gained carrier Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.673 [ERROR][3930] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.690 [INFO][3930] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0 calico-kube-controllers-c8d76dd9d- calico-system 57352416-50b9-4228-9b61-6bb3108b4b1a 881 0 2026-03-07 01:07:19 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:c8d76dd9d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 calico-kube-controllers-c8d76dd9d-hqr58 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali92268a984ea [] [] }} ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.690 [INFO][3930] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 
01:07:39.778 [INFO][3971] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" HandleID="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.787 [INFO][3971] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" HandleID="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003925d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"calico-kube-controllers-c8d76dd9d-hqr58", "timestamp":"2026-03-07 01:07:39.778787698 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0006182c0)} Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.787 [INFO][3971] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.834 [INFO][3971] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.834 [INFO][3971] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.905 [INFO][3971] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.914 [INFO][3971] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.921 [INFO][3971] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.923 [INFO][3971] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.925 [INFO][3971] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.926 [INFO][3971] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.927 [INFO][3971] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2 Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.931 [INFO][3971] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.936 [INFO][3971] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.59.2/26] block=192.168.59.0/26 handle="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.936 [INFO][3971] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.2/26] handle="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.936 [INFO][3971] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:39.969265 containerd[1511]: 2026-03-07 01:07:39.936 [INFO][3971] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.2/26] IPv6=[] ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" HandleID="k8s-pod-network.fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.969700 containerd[1511]: 2026-03-07 01:07:39.940 [INFO][3930] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0", GenerateName:"calico-kube-controllers-c8d76dd9d-", Namespace:"calico-system", SelfLink:"", UID:"57352416-50b9-4228-9b61-6bb3108b4b1a", ResourceVersion:"881", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"c8d76dd9d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"calico-kube-controllers-c8d76dd9d-hqr58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.59.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali92268a984ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:39.969700 containerd[1511]: 2026-03-07 01:07:39.940 [INFO][3930] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.2/32] ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.969700 containerd[1511]: 2026-03-07 01:07:39.940 [INFO][3930] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali92268a984ea ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.969700 containerd[1511]: 2026-03-07 01:07:39.958 [INFO][3930] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" 
Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.969700 containerd[1511]: 2026-03-07 01:07:39.958 [INFO][3930] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0", GenerateName:"calico-kube-controllers-c8d76dd9d-", Namespace:"calico-system", SelfLink:"", UID:"57352416-50b9-4228-9b61-6bb3108b4b1a", ResourceVersion:"881", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c8d76dd9d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2", Pod:"calico-kube-controllers-c8d76dd9d-hqr58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.59.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali92268a984ea", MAC:"e2:20:85:e3:90:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:39.969700 containerd[1511]: 2026-03-07 01:07:39.967 [INFO][3930] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2" Namespace="calico-system" Pod="calico-kube-controllers-c8d76dd9d-hqr58" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:07:39.979462 systemd[1]: Started cri-containerd-45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f.scope - libcontainer container 45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f. Mar 7 01:07:40.022144 containerd[1511]: time="2026-03-07T01:07:40.022060481Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:40.022144 containerd[1511]: time="2026-03-07T01:07:40.022114355Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:40.022383 containerd[1511]: time="2026-03-07T01:07:40.022124931Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.023239 containerd[1511]: time="2026-03-07T01:07:40.023170383Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.062545 containerd[1511]: time="2026-03-07T01:07:40.062250736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-qsb6f,Uid:d25ec7da-6038-4c24-89dc-467334f42af3,Namespace:kube-system,Attempt:1,} returns sandbox id \"45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f\"" Mar 7 01:07:40.066699 systemd-networkd[1407]: cali68bb3f856ae: Link UP Mar 7 01:07:40.071881 systemd-networkd[1407]: cali68bb3f856ae: Gained carrier Mar 7 01:07:40.074148 containerd[1511]: time="2026-03-07T01:07:40.072955411Z" level=info msg="CreateContainer within sandbox \"45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 01:07:40.073119 systemd[1]: Started cri-containerd-fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2.scope - libcontainer container fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2. 
Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.746 [ERROR][3964] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.760 [INFO][3964] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0 goldmane-9f7667bb8- calico-system acf6180d-a049-431a-b0ff-bb13b9279e69 876 0 2026-03-07 01:07:19 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 goldmane-9f7667bb8-85bqd eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali68bb3f856ae [] [] }} ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.761 [INFO][3964] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.818 [INFO][4017] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" HandleID="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 
01:07:39.833 [INFO][4017] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" HandleID="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277af0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"goldmane-9f7667bb8-85bqd", "timestamp":"2026-03-07 01:07:39.818635622 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00022b600)} Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.833 [INFO][4017] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.936 [INFO][4017] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.936 [INFO][4017] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:39.993 [INFO][4017] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.016 [INFO][4017] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.024 [INFO][4017] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.026 [INFO][4017] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.028 [INFO][4017] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.029 [INFO][4017] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.032 [INFO][4017] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588 Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.038 [INFO][4017] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.044 [INFO][4017] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.59.3/26] block=192.168.59.0/26 handle="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.044 [INFO][4017] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.3/26] handle="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.046 [INFO][4017] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:40.094596 containerd[1511]: 2026-03-07 01:07:40.046 [INFO][4017] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.3/26] IPv6=[] ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" HandleID="k8s-pod-network.af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.095410 containerd[1511]: 2026-03-07 01:07:40.055 [INFO][3964] cni-plugin/k8s.go 418: Populated endpoint ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"acf6180d-a049-431a-b0ff-bb13b9279e69", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"goldmane-9f7667bb8-85bqd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.59.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali68bb3f856ae", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.095410 containerd[1511]: 2026-03-07 01:07:40.056 [INFO][3964] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.3/32] ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.095410 containerd[1511]: 2026-03-07 01:07:40.056 [INFO][3964] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68bb3f856ae ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.095410 containerd[1511]: 2026-03-07 01:07:40.077 [INFO][3964] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.095410 containerd[1511]: 2026-03-07 01:07:40.078 [INFO][3964] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"acf6180d-a049-431a-b0ff-bb13b9279e69", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588", Pod:"goldmane-9f7667bb8-85bqd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.59.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali68bb3f856ae", MAC:"0a:62:3d:90:f7:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.095410 containerd[1511]: 2026-03-07 01:07:40.087 [INFO][3964] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588" Namespace="calico-system" 
Pod="goldmane-9f7667bb8-85bqd" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:07:40.103449 containerd[1511]: time="2026-03-07T01:07:40.103021964Z" level=info msg="CreateContainer within sandbox \"45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9bbae00cad325e63e88ba4b89c6a5b719a70cf459a1a8df8449587b978cf6c9a\"" Mar 7 01:07:40.104214 containerd[1511]: time="2026-03-07T01:07:40.104106126Z" level=info msg="StartContainer for \"9bbae00cad325e63e88ba4b89c6a5b719a70cf459a1a8df8449587b978cf6c9a\"" Mar 7 01:07:40.141593 containerd[1511]: time="2026-03-07T01:07:40.140490767Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:40.141593 containerd[1511]: time="2026-03-07T01:07:40.140550210Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:40.141593 containerd[1511]: time="2026-03-07T01:07:40.140560827Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.141593 containerd[1511]: time="2026-03-07T01:07:40.140624156Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.162354 systemd[1]: Started cri-containerd-9bbae00cad325e63e88ba4b89c6a5b719a70cf459a1a8df8449587b978cf6c9a.scope - libcontainer container 9bbae00cad325e63e88ba4b89c6a5b719a70cf459a1a8df8449587b978cf6c9a. Mar 7 01:07:40.162831 systemd[1]: Removed slice kubepods-besteffort-pod06a6379b_7db4_4dbd_a0fc_bb463be89e8c.slice - libcontainer container kubepods-besteffort-pod06a6379b_7db4_4dbd_a0fc_bb463be89e8c.slice. 
Mar 7 01:07:40.178298 systemd-networkd[1407]: cali239e11d89ac: Link UP Mar 7 01:07:40.178511 systemd-networkd[1407]: cali239e11d89ac: Gained carrier Mar 7 01:07:40.210539 systemd[1]: Started cri-containerd-af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588.scope - libcontainer container af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588. Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:39.768 [ERROR][3994] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:39.792 [INFO][3994] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0 calico-apiserver-6f7869694b- calico-system ffcd3157-37de-4e8c-903c-bcf1e17ebdb3 879 0 2026-03-07 01:07:19 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f7869694b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 calico-apiserver-6f7869694b-75lrk eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali239e11d89ac [] [] }} ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:39.792 [INFO][3994] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" 
WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:39.850 [INFO][4026] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" HandleID="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:39.868 [INFO][4026] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" HandleID="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000276170), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"calico-apiserver-6f7869694b-75lrk", "timestamp":"2026-03-07 01:07:39.850150127 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000112580)} Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:39.868 [INFO][4026] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.045 [INFO][4026] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.045 [INFO][4026] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.092 [INFO][4026] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.116 [INFO][4026] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.125 [INFO][4026] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.129 [INFO][4026] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.132 [INFO][4026] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.132 [INFO][4026] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.137 [INFO][4026] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714 Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.145 [INFO][4026] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.155 [INFO][4026] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.59.4/26] block=192.168.59.0/26 handle="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.155 [INFO][4026] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.4/26] handle="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.155 [INFO][4026] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:40.219094 containerd[1511]: 2026-03-07 01:07:40.155 [INFO][4026] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.4/26] IPv6=[] ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" HandleID="k8s-pod-network.d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.219635 containerd[1511]: 2026-03-07 01:07:40.169 [INFO][3994] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"ffcd3157-37de-4e8c-903c-bcf1e17ebdb3", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"calico-apiserver-6f7869694b-75lrk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali239e11d89ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.219635 containerd[1511]: 2026-03-07 01:07:40.169 [INFO][3994] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.4/32] ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.219635 containerd[1511]: 2026-03-07 01:07:40.169 [INFO][3994] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali239e11d89ac ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.219635 containerd[1511]: 2026-03-07 01:07:40.180 [INFO][3994] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" 
WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.219635 containerd[1511]: 2026-03-07 01:07:40.191 [INFO][3994] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"ffcd3157-37de-4e8c-903c-bcf1e17ebdb3", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714", Pod:"calico-apiserver-6f7869694b-75lrk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali239e11d89ac", MAC:"be:93:48:18:3c:cf", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.219635 containerd[1511]: 2026-03-07 01:07:40.204 [INFO][3994] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-75lrk" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:07:40.239425 containerd[1511]: time="2026-03-07T01:07:40.238510991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c8d76dd9d-hqr58,Uid:57352416-50b9-4228-9b61-6bb3108b4b1a,Namespace:calico-system,Attempt:1,} returns sandbox id \"fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2\"" Mar 7 01:07:40.242119 containerd[1511]: time="2026-03-07T01:07:40.241735473Z" level=info msg="StartContainer for \"9bbae00cad325e63e88ba4b89c6a5b719a70cf459a1a8df8449587b978cf6c9a\" returns successfully" Mar 7 01:07:40.243625 containerd[1511]: time="2026-03-07T01:07:40.243499754Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 01:07:40.287747 containerd[1511]: time="2026-03-07T01:07:40.287439616Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:40.289297 containerd[1511]: time="2026-03-07T01:07:40.289241014Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:40.289297 containerd[1511]: time="2026-03-07T01:07:40.289289410Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.289469 containerd[1511]: time="2026-03-07T01:07:40.289379331Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.319837 systemd-networkd[1407]: cali76534bbf384: Link UP Mar 7 01:07:40.320050 systemd-networkd[1407]: cali76534bbf384: Gained carrier Mar 7 01:07:40.332346 systemd[1]: run-netns-cni\x2d6be8196e\x2d409f\x2dea21\x2db98b\x2d8d01773999ff.mount: Deactivated successfully. Mar 7 01:07:40.332430 systemd[1]: run-netns-cni\x2d16d99109\x2d2cfe\x2d8074\x2d7792\x2d851676b05102.mount: Deactivated successfully. Mar 7 01:07:40.332484 systemd[1]: run-netns-cni\x2d14e5f6a5\x2dd26b\x2d17be\x2db070\x2d658ee4028e5f.mount: Deactivated successfully. Mar 7 01:07:40.332535 systemd[1]: run-netns-cni\x2dec74d7ff\x2daae2\x2d91d5\x2def83\x2d405857fd97ec.mount: Deactivated successfully. Mar 7 01:07:40.332588 systemd[1]: run-netns-cni\x2d29802cf6\x2d4244\x2d1ba6\x2df2cb\x2dd2c0ccba4fee.mount: Deactivated successfully. Mar 7 01:07:40.332639 systemd[1]: run-netns-cni\x2dde89a7f5\x2df6ee\x2da903\x2d3320\x2d9fc4151a7530.mount: Deactivated successfully. Mar 7 01:07:40.332694 systemd[1]: var-lib-kubelet-pods-06a6379b\x2d7db4\x2d4dbd\x2da0fc\x2dbb463be89e8c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dq5vtj.mount: Deactivated successfully. Mar 7 01:07:40.332754 systemd[1]: var-lib-kubelet-pods-06a6379b\x2d7db4\x2d4dbd\x2da0fc\x2dbb463be89e8c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 7 01:07:40.342316 systemd[1]: Started cri-containerd-d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714.scope - libcontainer container d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714. 
Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:39.788 [ERROR][3983] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:39.815 [INFO][3983] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0 calico-apiserver-6f7869694b- calico-system 408b461e-6520-4c86-a37a-f5d60a0218ca 880 0 2026-03-07 01:07:19 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f7869694b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 calico-apiserver-6f7869694b-rjchv eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali76534bbf384 [] [] }} ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:39.816 [INFO][3983] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:39.886 [INFO][4048] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" HandleID="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" 
Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:39.904 [INFO][4048] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" HandleID="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004c3060), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"calico-apiserver-6f7869694b-rjchv", "timestamp":"2026-03-07 01:07:39.886777214 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188580)} Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:39.904 [INFO][4048] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.158 [INFO][4048] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.158 [INFO][4048] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.194 [INFO][4048] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.217 [INFO][4048] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.233 [INFO][4048] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.237 [INFO][4048] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.240 [INFO][4048] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.241 [INFO][4048] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.245 [INFO][4048] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.256 [INFO][4048] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.293 [INFO][4048] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.59.5/26] block=192.168.59.0/26 handle="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.293 [INFO][4048] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.5/26] handle="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.295 [INFO][4048] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:40.378045 containerd[1511]: 2026-03-07 01:07:40.295 [INFO][4048] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.5/26] IPv6=[] ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" HandleID="k8s-pod-network.b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.378705 containerd[1511]: 2026-03-07 01:07:40.305 [INFO][3983] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"408b461e-6520-4c86-a37a-f5d60a0218ca", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"calico-apiserver-6f7869694b-rjchv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali76534bbf384", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.378705 containerd[1511]: 2026-03-07 01:07:40.306 [INFO][3983] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.5/32] ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.378705 containerd[1511]: 2026-03-07 01:07:40.306 [INFO][3983] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76534bbf384 ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.378705 containerd[1511]: 2026-03-07 01:07:40.319 [INFO][3983] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" 
WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.378705 containerd[1511]: 2026-03-07 01:07:40.319 [INFO][3983] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"408b461e-6520-4c86-a37a-f5d60a0218ca", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb", Pod:"calico-apiserver-6f7869694b-rjchv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali76534bbf384", MAC:"0a:95:a9:fc:d8:93", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.378705 containerd[1511]: 2026-03-07 01:07:40.372 [INFO][3983] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb" Namespace="calico-system" Pod="calico-apiserver-6f7869694b-rjchv" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:07:40.410055 systemd[1]: run-containerd-runc-k8s.io-ec75b89a203d741bc3c5ff05429f82a3c692393a003b7f0d45dda3d01549365f-runc.N9Aexz.mount: Deactivated successfully. Mar 7 01:07:40.413246 kubelet[2572]: I0307 01:07:40.411057 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-qsb6f" podStartSLOduration=31.411045678 podStartE2EDuration="31.411045678s" podCreationTimestamp="2026-03-07 01:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:07:40.410487743 +0000 UTC m=+38.370214493" watchObservedRunningTime="2026-03-07 01:07:40.411045678 +0000 UTC m=+38.370772418" Mar 7 01:07:40.425786 containerd[1511]: time="2026-03-07T01:07:40.425618607Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:40.426021 containerd[1511]: time="2026-03-07T01:07:40.425919910Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:40.426130 containerd[1511]: time="2026-03-07T01:07:40.425967614Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.426555 containerd[1511]: time="2026-03-07T01:07:40.426510287Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.457420 systemd[1]: Started cri-containerd-b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb.scope - libcontainer container b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb. Mar 7 01:07:40.466976 systemd-networkd[1407]: calif2882f848af: Link UP Mar 7 01:07:40.468188 systemd-networkd[1407]: calif2882f848af: Gained carrier Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:39.731 [ERROR][3955] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:39.756 [INFO][3955] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0 coredns-7d764666f9- kube-system 6afa7af7-0921-4201-99cd-3abd2378e890 875 0 2026-03-07 01:07:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 coredns-7d764666f9-f9xbn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif2882f848af [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:39.757 [INFO][3955] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" 
WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:39.884 [INFO][4011] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" HandleID="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:39.914 [INFO][4011] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" HandleID="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000122810), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"coredns-7d764666f9-f9xbn", "timestamp":"2026-03-07 01:07:39.884001658 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188f20)} Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:39.914 [INFO][4011] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.293 [INFO][4011] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.293 [INFO][4011] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.302 [INFO][4011] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.329 [INFO][4011] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.380 [INFO][4011] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.387 [INFO][4011] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.418 [INFO][4011] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.418 [INFO][4011] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.428 [INFO][4011] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91 Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.439 [INFO][4011] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.455 [INFO][4011] ipam/ipam.go 1288: Successfully claimed IPs: 
[192.168.59.6/26] block=192.168.59.0/26 handle="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.455 [INFO][4011] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.6/26] handle="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.455 [INFO][4011] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:40.514449 containerd[1511]: 2026-03-07 01:07:40.455 [INFO][4011] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.6/26] IPv6=[] ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" HandleID="k8s-pod-network.a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.514918 containerd[1511]: 2026-03-07 01:07:40.463 [INFO][3955] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6afa7af7-0921-4201-99cd-3abd2378e890", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"coredns-7d764666f9-f9xbn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif2882f848af", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.514918 containerd[1511]: 2026-03-07 01:07:40.463 [INFO][3955] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.6/32] ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.514918 containerd[1511]: 2026-03-07 01:07:40.463 [INFO][3955] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2882f848af 
ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.514918 containerd[1511]: 2026-03-07 01:07:40.475 [INFO][3955] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.514918 containerd[1511]: 2026-03-07 01:07:40.477 [INFO][3955] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6afa7af7-0921-4201-99cd-3abd2378e890", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91", 
Pod:"coredns-7d764666f9-f9xbn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif2882f848af", MAC:"a2:aa:c6:b8:44:c4", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:40.515074 containerd[1511]: 2026-03-07 01:07:40.503 [INFO][3955] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91" Namespace="kube-system" Pod="coredns-7d764666f9-f9xbn" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:07:40.527561 containerd[1511]: time="2026-03-07T01:07:40.527532393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-75lrk,Uid:ffcd3157-37de-4e8c-903c-bcf1e17ebdb3,Namespace:calico-system,Attempt:1,} returns sandbox id \"d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714\"" Mar 7 01:07:40.551129 containerd[1511]: time="2026-03-07T01:07:40.551036413Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:40.551129 containerd[1511]: time="2026-03-07T01:07:40.551096017Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:40.552328 containerd[1511]: time="2026-03-07T01:07:40.551115548Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.552328 containerd[1511]: time="2026-03-07T01:07:40.551184036Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:40.577450 systemd[1]: Started cri-containerd-a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91.scope - libcontainer container a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91. Mar 7 01:07:40.583999 systemd[1]: Created slice kubepods-besteffort-pod7222a0ac_cccc_4b8d_b239_7750451b9926.slice - libcontainer container kubepods-besteffort-pod7222a0ac_cccc_4b8d_b239_7750451b9926.slice. 
Mar 7 01:07:40.617635 kubelet[2572]: I0307 01:07:40.617611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm8nm\" (UniqueName: \"kubernetes.io/projected/7222a0ac-cccc-4b8d-b239-7750451b9926-kube-api-access-rm8nm\") pod \"whisker-6467698fbb-z99b9\" (UID: \"7222a0ac-cccc-4b8d-b239-7750451b9926\") " pod="calico-system/whisker-6467698fbb-z99b9" Mar 7 01:07:40.617847 kubelet[2572]: I0307 01:07:40.617836 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/7222a0ac-cccc-4b8d-b239-7750451b9926-nginx-config\") pod \"whisker-6467698fbb-z99b9\" (UID: \"7222a0ac-cccc-4b8d-b239-7750451b9926\") " pod="calico-system/whisker-6467698fbb-z99b9" Mar 7 01:07:40.618016 kubelet[2572]: I0307 01:07:40.618006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7222a0ac-cccc-4b8d-b239-7750451b9926-whisker-backend-key-pair\") pod \"whisker-6467698fbb-z99b9\" (UID: \"7222a0ac-cccc-4b8d-b239-7750451b9926\") " pod="calico-system/whisker-6467698fbb-z99b9" Mar 7 01:07:40.618087 kubelet[2572]: I0307 01:07:40.618079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7222a0ac-cccc-4b8d-b239-7750451b9926-whisker-ca-bundle\") pod \"whisker-6467698fbb-z99b9\" (UID: \"7222a0ac-cccc-4b8d-b239-7750451b9926\") " pod="calico-system/whisker-6467698fbb-z99b9" Mar 7 01:07:40.664453 containerd[1511]: time="2026-03-07T01:07:40.664144715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-f9xbn,Uid:6afa7af7-0921-4201-99cd-3abd2378e890,Namespace:kube-system,Attempt:1,} returns sandbox id \"a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91\"" Mar 7 01:07:40.672073 containerd[1511]: 
time="2026-03-07T01:07:40.671814817Z" level=info msg="CreateContainer within sandbox \"a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 01:07:40.685953 containerd[1511]: time="2026-03-07T01:07:40.685930096Z" level=info msg="CreateContainer within sandbox \"a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9dbf4d26729e65f29886d9d750d129cf592fcbb01314dc645f5bf64f29d691de\"" Mar 7 01:07:40.686539 containerd[1511]: time="2026-03-07T01:07:40.686504529Z" level=info msg="StartContainer for \"9dbf4d26729e65f29886d9d750d129cf592fcbb01314dc645f5bf64f29d691de\"" Mar 7 01:07:40.729985 containerd[1511]: time="2026-03-07T01:07:40.729374661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-85bqd,Uid:acf6180d-a049-431a-b0ff-bb13b9279e69,Namespace:calico-system,Attempt:1,} returns sandbox id \"af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588\"" Mar 7 01:07:40.741538 systemd[1]: Started cri-containerd-9dbf4d26729e65f29886d9d750d129cf592fcbb01314dc645f5bf64f29d691de.scope - libcontainer container 9dbf4d26729e65f29886d9d750d129cf592fcbb01314dc645f5bf64f29d691de. 
Mar 7 01:07:40.788485 containerd[1511]: time="2026-03-07T01:07:40.788088578Z" level=info msg="StartContainer for \"9dbf4d26729e65f29886d9d750d129cf592fcbb01314dc645f5bf64f29d691de\" returns successfully" Mar 7 01:07:40.806759 containerd[1511]: time="2026-03-07T01:07:40.806545043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f7869694b-rjchv,Uid:408b461e-6520-4c86-a37a-f5d60a0218ca,Namespace:calico-system,Attempt:1,} returns sandbox id \"b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb\"" Mar 7 01:07:40.889878 containerd[1511]: time="2026-03-07T01:07:40.889844457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6467698fbb-z99b9,Uid:7222a0ac-cccc-4b8d-b239-7750451b9926,Namespace:calico-system,Attempt:0,}" Mar 7 01:07:41.007508 systemd-networkd[1407]: cali340665de010: Gained IPv6LL Mar 7 01:07:41.048862 systemd-networkd[1407]: cali61c6ec07438: Link UP Mar 7 01:07:41.049571 systemd-networkd[1407]: cali61c6ec07438: Gained carrier Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.922 [ERROR][4536] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.934 [INFO][4536] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0 whisker-6467698fbb- calico-system 7222a0ac-cccc-4b8d-b239-7750451b9926 928 0 2026-03-07 01:07:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6467698fbb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 whisker-6467698fbb-z99b9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali61c6ec07438 [] [] }} 
ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.934 [INFO][4536] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.958 [INFO][4548] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" HandleID="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.963 [INFO][4548] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" HandleID="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef4b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"whisker-6467698fbb-z99b9", "timestamp":"2026-03-07 01:07:40.958070076 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000201080)} Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.963 [INFO][4548] ipam/ipam_plugin.go 438: About to acquire 
host-wide IPAM lock. Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.963 [INFO][4548] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.963 [INFO][4548] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2' Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.965 [INFO][4548] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.973 [INFO][4548] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.977 [INFO][4548] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.978 [INFO][4548] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.980 [INFO][4548] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.980 [INFO][4548] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.981 [INFO][4548] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:40.997 [INFO][4548] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" 
host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:41.042 [INFO][4548] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.59.7/26] block=192.168.59.0/26 handle="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:41.042 [INFO][4548] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.7/26] handle="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" host="ci-4081-3-6-n-593f1c83d2" Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:41.042 [INFO][4548] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:07:41.067747 containerd[1511]: 2026-03-07 01:07:41.042 [INFO][4548] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.7/26] IPv6=[] ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" HandleID="k8s-pod-network.c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.068180 containerd[1511]: 2026-03-07 01:07:41.045 [INFO][4536] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0", GenerateName:"whisker-6467698fbb-", Namespace:"calico-system", SelfLink:"", UID:"7222a0ac-cccc-4b8d-b239-7750451b9926", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6467698fbb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"whisker-6467698fbb-z99b9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.59.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali61c6ec07438", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:41.068180 containerd[1511]: 2026-03-07 01:07:41.045 [INFO][4536] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.7/32] ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.068180 containerd[1511]: 2026-03-07 01:07:41.045 [INFO][4536] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali61c6ec07438 ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.068180 containerd[1511]: 2026-03-07 01:07:41.049 [INFO][4536] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" 
WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.068180 containerd[1511]: 2026-03-07 01:07:41.050 [INFO][4536] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0", GenerateName:"whisker-6467698fbb-", Namespace:"calico-system", SelfLink:"", UID:"7222a0ac-cccc-4b8d-b239-7750451b9926", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6467698fbb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d", Pod:"whisker-6467698fbb-z99b9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.59.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali61c6ec07438", MAC:"6e:a7:dd:42:bb:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:07:41.068180 
containerd[1511]: 2026-03-07 01:07:41.057 [INFO][4536] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d" Namespace="calico-system" Pod="whisker-6467698fbb-z99b9" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--6467698fbb--z99b9-eth0" Mar 7 01:07:41.083387 containerd[1511]: time="2026-03-07T01:07:41.083281356Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:07:41.083906 containerd[1511]: time="2026-03-07T01:07:41.083859270Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:07:41.083956 containerd[1511]: time="2026-03-07T01:07:41.083903680Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:41.084046 containerd[1511]: time="2026-03-07T01:07:41.084018220Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:07:41.101318 systemd[1]: Started cri-containerd-c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d.scope - libcontainer container c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d. 
Mar 7 01:07:41.134733 containerd[1511]: time="2026-03-07T01:07:41.134639351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6467698fbb-z99b9,Uid:7222a0ac-cccc-4b8d-b239-7750451b9926,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d\"" Mar 7 01:07:41.400931 kubelet[2572]: I0307 01:07:41.399938 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-f9xbn" podStartSLOduration=32.399918254 podStartE2EDuration="32.399918254s" podCreationTimestamp="2026-03-07 01:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:07:41.393992028 +0000 UTC m=+39.353718808" watchObservedRunningTime="2026-03-07 01:07:41.399918254 +0000 UTC m=+39.359645034" Mar 7 01:07:41.519475 systemd-networkd[1407]: cali92268a984ea: Gained IPv6LL Mar 7 01:07:41.583457 systemd-networkd[1407]: cali68bb3f856ae: Gained IPv6LL Mar 7 01:07:41.649715 systemd-networkd[1407]: cali239e11d89ac: Gained IPv6LL Mar 7 01:07:42.143110 kubelet[2572]: I0307 01:07:42.143035 2572 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="06a6379b-7db4-4dbd-a0fc-bb463be89e8c" path="/var/lib/kubelet/pods/06a6379b-7db4-4dbd-a0fc-bb463be89e8c/volumes" Mar 7 01:07:42.224247 systemd-networkd[1407]: calif2882f848af: Gained IPv6LL Mar 7 01:07:42.287406 systemd-networkd[1407]: cali76534bbf384: Gained IPv6LL Mar 7 01:07:42.287844 systemd-networkd[1407]: cali61c6ec07438: Gained IPv6LL Mar 7 01:07:44.219527 containerd[1511]: time="2026-03-07T01:07:44.218891991Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:44.220047 containerd[1511]: time="2026-03-07T01:07:44.219990154Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active 
requests=0, bytes read=52406348" Mar 7 01:07:44.220506 containerd[1511]: time="2026-03-07T01:07:44.220460257Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:44.222327 containerd[1511]: time="2026-03-07T01:07:44.222182601Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:44.223012 containerd[1511]: time="2026-03-07T01:07:44.222636128Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 3.979115032s" Mar 7 01:07:44.223012 containerd[1511]: time="2026-03-07T01:07:44.222661609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 7 01:07:44.224243 containerd[1511]: time="2026-03-07T01:07:44.224066740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 01:07:44.238751 containerd[1511]: time="2026-03-07T01:07:44.238717072Z" level=info msg="CreateContainer within sandbox \"fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 01:07:44.251044 containerd[1511]: time="2026-03-07T01:07:44.250854005Z" level=info msg="CreateContainer within sandbox \"fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns 
container id \"cfd8e81fce8e6503b2fbfcc561a6aa0ad9f0f00dac806ca7278eefc371d1f7ff\"" Mar 7 01:07:44.252203 containerd[1511]: time="2026-03-07T01:07:44.251349667Z" level=info msg="StartContainer for \"cfd8e81fce8e6503b2fbfcc561a6aa0ad9f0f00dac806ca7278eefc371d1f7ff\"" Mar 7 01:07:44.294329 systemd[1]: Started cri-containerd-cfd8e81fce8e6503b2fbfcc561a6aa0ad9f0f00dac806ca7278eefc371d1f7ff.scope - libcontainer container cfd8e81fce8e6503b2fbfcc561a6aa0ad9f0f00dac806ca7278eefc371d1f7ff. Mar 7 01:07:44.328153 containerd[1511]: time="2026-03-07T01:07:44.328066701Z" level=info msg="StartContainer for \"cfd8e81fce8e6503b2fbfcc561a6aa0ad9f0f00dac806ca7278eefc371d1f7ff\" returns successfully" Mar 7 01:07:44.402861 kubelet[2572]: I0307 01:07:44.402809 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-c8d76dd9d-hqr58" podStartSLOduration=21.42233985 podStartE2EDuration="25.40279907s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:40.243294952 +0000 UTC m=+38.203021692" lastFinishedPulling="2026-03-07 01:07:44.223754172 +0000 UTC m=+42.183480912" observedRunningTime="2026-03-07 01:07:44.402032975 +0000 UTC m=+42.361759725" watchObservedRunningTime="2026-03-07 01:07:44.40279907 +0000 UTC m=+42.362525810" Mar 7 01:07:46.083094 kubelet[2572]: I0307 01:07:46.083042 2572 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:07:47.039468 kernel: calico-node[4828]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 7 01:07:47.479439 systemd-networkd[1407]: vxlan.calico: Link UP Mar 7 01:07:47.479451 systemd-networkd[1407]: vxlan.calico: Gained carrier Mar 7 01:07:48.811941 containerd[1511]: time="2026-03-07T01:07:48.811886805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:07:48.813185 containerd[1511]: 
time="2026-03-07T01:07:48.813125315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780"
Mar 7 01:07:48.814582 containerd[1511]: time="2026-03-07T01:07:48.814535381Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:48.816660 containerd[1511]: time="2026-03-07T01:07:48.816621798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:48.817698 containerd[1511]: time="2026-03-07T01:07:48.816989707Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 4.592900833s"
Mar 7 01:07:48.817698 containerd[1511]: time="2026-03-07T01:07:48.817013856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\""
Mar 7 01:07:48.820510 containerd[1511]: time="2026-03-07T01:07:48.820493211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\""
Mar 7 01:07:48.823847 containerd[1511]: time="2026-03-07T01:07:48.823621072Z" level=info msg="CreateContainer within sandbox \"d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 7 01:07:48.839449 containerd[1511]: time="2026-03-07T01:07:48.839412441Z" level=info msg="CreateContainer within sandbox \"d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"50374f89fb58620d0412a140f2358a969b608d627ef5888c3cc3e9a6f713c377\""
Mar 7 01:07:48.840947 containerd[1511]: time="2026-03-07T01:07:48.840753481Z" level=info msg="StartContainer for \"50374f89fb58620d0412a140f2358a969b608d627ef5888c3cc3e9a6f713c377\""
Mar 7 01:07:48.869316 systemd[1]: Started cri-containerd-50374f89fb58620d0412a140f2358a969b608d627ef5888c3cc3e9a6f713c377.scope - libcontainer container 50374f89fb58620d0412a140f2358a969b608d627ef5888c3cc3e9a6f713c377.
Mar 7 01:07:48.904110 containerd[1511]: time="2026-03-07T01:07:48.904073148Z" level=info msg="StartContainer for \"50374f89fb58620d0412a140f2358a969b608d627ef5888c3cc3e9a6f713c377\" returns successfully"
Mar 7 01:07:49.071946 systemd-networkd[1407]: vxlan.calico: Gained IPv6LL
Mar 7 01:07:49.411992 kubelet[2572]: I0307 01:07:49.411804 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-6f7869694b-75lrk" podStartSLOduration=22.121675594 podStartE2EDuration="30.411792224s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:40.529894112 +0000 UTC m=+38.489620852" lastFinishedPulling="2026-03-07 01:07:48.820010742 +0000 UTC m=+46.779737482" observedRunningTime="2026-03-07 01:07:49.411775839 +0000 UTC m=+47.371502589" watchObservedRunningTime="2026-03-07 01:07:49.411792224 +0000 UTC m=+47.371518975"
Mar 7 01:07:50.405767 kubelet[2572]: I0307 01:07:50.405705 2572 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 7 01:07:51.137960 containerd[1511]: time="2026-03-07T01:07:51.137923387Z" level=info msg="StopPodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\""
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.181 [INFO][5005] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.182 [INFO][5005] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" iface="eth0" netns="/var/run/netns/cni-458f0186-99af-fc97-e680-cf1d24e71341"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.182 [INFO][5005] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" iface="eth0" netns="/var/run/netns/cni-458f0186-99af-fc97-e680-cf1d24e71341"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.183 [INFO][5005] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" iface="eth0" netns="/var/run/netns/cni-458f0186-99af-fc97-e680-cf1d24e71341"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.183 [INFO][5005] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.183 [INFO][5005] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.208 [INFO][5013] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.208 [INFO][5013] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.208 [INFO][5013] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.214 [WARNING][5013] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.214 [INFO][5013] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.215 [INFO][5013] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 7 01:07:51.220236 containerd[1511]: 2026-03-07 01:07:51.217 [INFO][5005] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741"
Mar 7 01:07:51.220543 containerd[1511]: time="2026-03-07T01:07:51.220400865Z" level=info msg="TearDown network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" successfully"
Mar 7 01:07:51.220543 containerd[1511]: time="2026-03-07T01:07:51.220422658Z" level=info msg="StopPodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" returns successfully"
Mar 7 01:07:51.223882 systemd[1]: run-netns-cni\x2d458f0186\x2d99af\x2dfc97\x2de680\x2dcf1d24e71341.mount: Deactivated successfully.
Mar 7 01:07:51.225998 containerd[1511]: time="2026-03-07T01:07:51.225978968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2x9rm,Uid:0869cd07-08c1-477a-a901-3ac76e743a03,Namespace:calico-system,Attempt:1,}"
Mar 7 01:07:51.335862 systemd-networkd[1407]: cali7bdc783d202: Link UP
Mar 7 01:07:51.337471 systemd-networkd[1407]: cali7bdc783d202: Gained carrier
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.270 [INFO][5019] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0 csi-node-driver- calico-system 0869cd07-08c1-477a-a901-3ac76e743a03 996 0 2026-03-07 01:07:19 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-593f1c83d2 csi-node-driver-2x9rm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7bdc783d202 [] [] }} ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.270 [INFO][5019] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.295 [INFO][5032] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" HandleID="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.303 [INFO][5032] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" HandleID="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fde90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-593f1c83d2", "pod":"csi-node-driver-2x9rm", "timestamp":"2026-03-07 01:07:51.295520097 +0000 UTC"}, Hostname:"ci-4081-3-6-n-593f1c83d2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001886e0)}
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.303 [INFO][5032] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.303 [INFO][5032] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.303 [INFO][5032] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-593f1c83d2'
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.305 [INFO][5032] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.309 [INFO][5032] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.313 [INFO][5032] ipam/ipam.go 526: Trying affinity for 192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.315 [INFO][5032] ipam/ipam.go 160: Attempting to load block cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.317 [INFO][5032] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.59.0/26 host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.317 [INFO][5032] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.59.0/26 handle="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.318 [INFO][5032] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.322 [INFO][5032] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.59.0/26 handle="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.328 [INFO][5032] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.59.8/26] block=192.168.59.0/26 handle="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.328 [INFO][5032] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.59.8/26] handle="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" host="ci-4081-3-6-n-593f1c83d2"
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.328 [INFO][5032] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 7 01:07:51.362065 containerd[1511]: 2026-03-07 01:07:51.328 [INFO][5032] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.59.8/26] IPv6=[] ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" HandleID="k8s-pod-network.c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.362543 containerd[1511]: 2026-03-07 01:07:51.332 [INFO][5019] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0869cd07-08c1-477a-a901-3ac76e743a03", ResourceVersion:"996", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"", Pod:"csi-node-driver-2x9rm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.59.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7bdc783d202", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 7 01:07:51.362543 containerd[1511]: 2026-03-07 01:07:51.332 [INFO][5019] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.8/32] ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.362543 containerd[1511]: 2026-03-07 01:07:51.332 [INFO][5019] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7bdc783d202 ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.362543 containerd[1511]: 2026-03-07 01:07:51.342 [INFO][5019] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.362543 containerd[1511]: 2026-03-07 01:07:51.343 [INFO][5019] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0869cd07-08c1-477a-a901-3ac76e743a03", ResourceVersion:"996", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d", Pod:"csi-node-driver-2x9rm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.59.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7bdc783d202", MAC:"1a:6c:6e:16:5e:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 7 01:07:51.362543 containerd[1511]: 2026-03-07 01:07:51.354 [INFO][5019] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d" Namespace="calico-system" Pod="csi-node-driver-2x9rm" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0"
Mar 7 01:07:51.401218 containerd[1511]: time="2026-03-07T01:07:51.400304447Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:07:51.401218 containerd[1511]: time="2026-03-07T01:07:51.400343007Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:07:51.401218 containerd[1511]: time="2026-03-07T01:07:51.400352932Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:07:51.401218 containerd[1511]: time="2026-03-07T01:07:51.400416230Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:07:51.426366 systemd[1]: Started cri-containerd-c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d.scope - libcontainer container c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d.
Mar 7 01:07:51.466278 containerd[1511]: time="2026-03-07T01:07:51.466201995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2x9rm,Uid:0869cd07-08c1-477a-a901-3ac76e743a03,Namespace:calico-system,Attempt:1,} returns sandbox id \"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d\""
Mar 7 01:07:51.892825 containerd[1511]: time="2026-03-07T01:07:51.892781591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:51.893731 containerd[1511]: time="2026-03-07T01:07:51.893596050Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386"
Mar 7 01:07:51.894950 containerd[1511]: time="2026-03-07T01:07:51.894910311Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:51.896926 containerd[1511]: time="2026-03-07T01:07:51.896888709Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:51.897535 containerd[1511]: time="2026-03-07T01:07:51.897342621Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 3.075943145s"
Mar 7 01:07:51.897535 containerd[1511]: time="2026-03-07T01:07:51.897365466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\""
Mar 7 01:07:51.898817 containerd[1511]: time="2026-03-07T01:07:51.898168477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 7 01:07:51.900617 containerd[1511]: time="2026-03-07T01:07:51.900580877Z" level=info msg="CreateContainer within sandbox \"af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 7 01:07:51.922483 containerd[1511]: time="2026-03-07T01:07:51.922453028Z" level=info msg="CreateContainer within sandbox \"af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"a23747d137a54a84377ba8ca17929de0dc6fa7a70f40cfd18ae0da4ce255e7b7\""
Mar 7 01:07:51.923182 containerd[1511]: time="2026-03-07T01:07:51.923161052Z" level=info msg="StartContainer for \"a23747d137a54a84377ba8ca17929de0dc6fa7a70f40cfd18ae0da4ce255e7b7\""
Mar 7 01:07:51.944374 systemd[1]: Started cri-containerd-a23747d137a54a84377ba8ca17929de0dc6fa7a70f40cfd18ae0da4ce255e7b7.scope - libcontainer container a23747d137a54a84377ba8ca17929de0dc6fa7a70f40cfd18ae0da4ce255e7b7.
Mar 7 01:07:51.987575 containerd[1511]: time="2026-03-07T01:07:51.987514783Z" level=info msg="StartContainer for \"a23747d137a54a84377ba8ca17929de0dc6fa7a70f40cfd18ae0da4ce255e7b7\" returns successfully"
Mar 7 01:07:52.149360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2915139318.mount: Deactivated successfully.
Mar 7 01:07:52.371631 containerd[1511]: time="2026-03-07T01:07:52.371526477Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:52.374377 containerd[1511]: time="2026-03-07T01:07:52.374323346Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 7 01:07:52.376981 containerd[1511]: time="2026-03-07T01:07:52.376848076Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 478.656081ms"
Mar 7 01:07:52.376981 containerd[1511]: time="2026-03-07T01:07:52.376886715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\""
Mar 7 01:07:52.380885 containerd[1511]: time="2026-03-07T01:07:52.379365584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\""
Mar 7 01:07:52.383940 containerd[1511]: time="2026-03-07T01:07:52.383886616Z" level=info msg="CreateContainer within sandbox \"b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 7 01:07:52.410130 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3253674478.mount: Deactivated successfully.
Mar 7 01:07:52.412243 containerd[1511]: time="2026-03-07T01:07:52.411321386Z" level=info msg="CreateContainer within sandbox \"b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3e3ac6d59d3745abbfde57d6bb34b5fe33452228823d7e1a0750a09d9ea2a738\""
Mar 7 01:07:52.414699 containerd[1511]: time="2026-03-07T01:07:52.414117374Z" level=info msg="StartContainer for \"3e3ac6d59d3745abbfde57d6bb34b5fe33452228823d7e1a0750a09d9ea2a738\""
Mar 7 01:07:52.438663 kubelet[2572]: I0307 01:07:52.438435 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-85bqd" podStartSLOduration=22.282821885 podStartE2EDuration="33.438422671s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:40.742478974 +0000 UTC m=+38.702205724" lastFinishedPulling="2026-03-07 01:07:51.89807976 +0000 UTC m=+49.857806510" observedRunningTime="2026-03-07 01:07:52.436009674 +0000 UTC m=+50.395736424" watchObservedRunningTime="2026-03-07 01:07:52.438422671 +0000 UTC m=+50.398149411"
Mar 7 01:07:52.468300 systemd[1]: Started cri-containerd-3e3ac6d59d3745abbfde57d6bb34b5fe33452228823d7e1a0750a09d9ea2a738.scope - libcontainer container 3e3ac6d59d3745abbfde57d6bb34b5fe33452228823d7e1a0750a09d9ea2a738.
Mar 7 01:07:52.506398 containerd[1511]: time="2026-03-07T01:07:52.505476598Z" level=info msg="StartContainer for \"3e3ac6d59d3745abbfde57d6bb34b5fe33452228823d7e1a0750a09d9ea2a738\" returns successfully"
Mar 7 01:07:53.167441 systemd-networkd[1407]: cali7bdc783d202: Gained IPv6LL
Mar 7 01:07:53.457842 kubelet[2572]: I0307 01:07:53.457593 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-6f7869694b-rjchv" podStartSLOduration=22.888162776 podStartE2EDuration="34.457574074s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:40.80881741 +0000 UTC m=+38.768544150" lastFinishedPulling="2026-03-07 01:07:52.378228678 +0000 UTC m=+50.337955448" observedRunningTime="2026-03-07 01:07:53.457304128 +0000 UTC m=+51.417030908" watchObservedRunningTime="2026-03-07 01:07:53.457574074 +0000 UTC m=+51.417300855"
Mar 7 01:07:53.490896 systemd[1]: run-containerd-runc-k8s.io-a23747d137a54a84377ba8ca17929de0dc6fa7a70f40cfd18ae0da4ce255e7b7-runc.yi3G0O.mount: Deactivated successfully.
Mar 7 01:07:54.425003 containerd[1511]: time="2026-03-07T01:07:54.424933791Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:54.425750 containerd[1511]: time="2026-03-07T01:07:54.425694905Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889"
Mar 7 01:07:54.426813 containerd[1511]: time="2026-03-07T01:07:54.426377327Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:54.428828 containerd[1511]: time="2026-03-07T01:07:54.428792060Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:54.429234 containerd[1511]: time="2026-03-07T01:07:54.429186839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 2.049794142s"
Mar 7 01:07:54.429269 containerd[1511]: time="2026-03-07T01:07:54.429234312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\""
Mar 7 01:07:54.431357 containerd[1511]: time="2026-03-07T01:07:54.431117195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\""
Mar 7 01:07:54.436478 containerd[1511]: time="2026-03-07T01:07:54.436127089Z" level=info msg="CreateContainer within sandbox \"c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Mar 7 01:07:54.455957 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3871358695.mount: Deactivated successfully.
Mar 7 01:07:54.457216 containerd[1511]: time="2026-03-07T01:07:54.456785999Z" level=info msg="CreateContainer within sandbox \"c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"39da7e51cf790bb5b6ae3e0523bf97a39a20a309216e4705b64545b1ea84f667\""
Mar 7 01:07:54.457650 containerd[1511]: time="2026-03-07T01:07:54.457630281Z" level=info msg="StartContainer for \"39da7e51cf790bb5b6ae3e0523bf97a39a20a309216e4705b64545b1ea84f667\""
Mar 7 01:07:54.525537 systemd[1]: Started cri-containerd-39da7e51cf790bb5b6ae3e0523bf97a39a20a309216e4705b64545b1ea84f667.scope - libcontainer container 39da7e51cf790bb5b6ae3e0523bf97a39a20a309216e4705b64545b1ea84f667.
Mar 7 01:07:54.581930 containerd[1511]: time="2026-03-07T01:07:54.581637729Z" level=info msg="StartContainer for \"39da7e51cf790bb5b6ae3e0523bf97a39a20a309216e4705b64545b1ea84f667\" returns successfully"
Mar 7 01:07:56.138246 containerd[1511]: time="2026-03-07T01:07:56.138208595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:56.139397 containerd[1511]: time="2026-03-07T01:07:56.139366338Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502"
Mar 7 01:07:56.139513 containerd[1511]: time="2026-03-07T01:07:56.139483388Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:56.141552 containerd[1511]: time="2026-03-07T01:07:56.141380318Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:56.141856 containerd[1511]: time="2026-03-07T01:07:56.141836821Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.710698764s"
Mar 7 01:07:56.141887 containerd[1511]: time="2026-03-07T01:07:56.141859215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\""
Mar 7 01:07:56.143977 containerd[1511]: time="2026-03-07T01:07:56.143691085Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\""
Mar 7 01:07:56.147041 containerd[1511]: time="2026-03-07T01:07:56.147017456Z" level=info msg="CreateContainer within sandbox \"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 7 01:07:56.167853 containerd[1511]: time="2026-03-07T01:07:56.167825959Z" level=info msg="CreateContainer within sandbox \"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b0cc293aa1c09769f6539ae3d4f5ea5195efb6766dbe7d8800a8522421630c33\""
Mar 7 01:07:56.169126 containerd[1511]: time="2026-03-07T01:07:56.168248740Z" level=info msg="StartContainer for \"b0cc293aa1c09769f6539ae3d4f5ea5195efb6766dbe7d8800a8522421630c33\""
Mar 7 01:07:56.198300 systemd[1]: Started cri-containerd-b0cc293aa1c09769f6539ae3d4f5ea5195efb6766dbe7d8800a8522421630c33.scope - libcontainer container b0cc293aa1c09769f6539ae3d4f5ea5195efb6766dbe7d8800a8522421630c33.
Mar 7 01:07:56.222444 containerd[1511]: time="2026-03-07T01:07:56.222409296Z" level=info msg="StartContainer for \"b0cc293aa1c09769f6539ae3d4f5ea5195efb6766dbe7d8800a8522421630c33\" returns successfully"
Mar 7 01:07:59.080410 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2206178608.mount: Deactivated successfully.
Mar 7 01:07:59.100419 containerd[1511]: time="2026-03-07T01:07:59.100377164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:59.101605 containerd[1511]: time="2026-03-07T01:07:59.101565901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475"
Mar 7 01:07:59.102662 containerd[1511]: time="2026-03-07T01:07:59.102617376Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:59.105726 containerd[1511]: time="2026-03-07T01:07:59.105160593Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:07:59.105726 containerd[1511]: time="2026-03-07T01:07:59.105627461Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.961917497s"
Mar 7 01:07:59.105726 containerd[1511]: time="2026-03-07T01:07:59.105651107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\""
Mar 7 01:07:59.106588 containerd[1511]: time="2026-03-07T01:07:59.106565582Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 7 01:07:59.109125 containerd[1511]: time="2026-03-07T01:07:59.109097162Z" level=info msg="CreateContainer within sandbox \"c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Mar 7 01:07:59.127355 containerd[1511]: time="2026-03-07T01:07:59.127319580Z" level=info msg="CreateContainer within sandbox \"c9f6144265eb03529a0d622af63c215754e1e63e0c21437b4d9b77f49c8b6e3d\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"93083d8037c5c555bceca9aff77e1728a4033c79f06cfed993ede1e842958591\""
Mar 7 01:07:59.128477 containerd[1511]: time="2026-03-07T01:07:59.127651339Z" level=info msg="StartContainer for \"93083d8037c5c555bceca9aff77e1728a4033c79f06cfed993ede1e842958591\""
Mar 7 01:07:59.152296 systemd[1]: Started cri-containerd-93083d8037c5c555bceca9aff77e1728a4033c79f06cfed993ede1e842958591.scope - libcontainer container 93083d8037c5c555bceca9aff77e1728a4033c79f06cfed993ede1e842958591.
Mar 7 01:07:59.186661 containerd[1511]: time="2026-03-07T01:07:59.186628003Z" level=info msg="StartContainer for \"93083d8037c5c555bceca9aff77e1728a4033c79f06cfed993ede1e842958591\" returns successfully" Mar 7 01:08:00.809811 containerd[1511]: time="2026-03-07T01:08:00.809766214Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:08:00.810919 containerd[1511]: time="2026-03-07T01:08:00.810762695Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 7 01:08:00.811985 containerd[1511]: time="2026-03-07T01:08:00.811761829Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:08:00.813741 containerd[1511]: time="2026-03-07T01:08:00.813709449Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:08:00.814532 containerd[1511]: time="2026-03-07T01:08:00.814330694Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.707675223s" Mar 7 01:08:00.814532 containerd[1511]: time="2026-03-07T01:08:00.814354119Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 7 01:08:00.819644 containerd[1511]: 
time="2026-03-07T01:08:00.819549878Z" level=info msg="CreateContainer within sandbox \"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 01:08:00.833142 containerd[1511]: time="2026-03-07T01:08:00.833111974Z" level=info msg="CreateContainer within sandbox \"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f824de4cb9c1054f519459803a131b326e810a983582d1d9d0e40f4317ce52f6\"" Mar 7 01:08:00.833800 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1618750727.mount: Deactivated successfully. Mar 7 01:08:00.835583 containerd[1511]: time="2026-03-07T01:08:00.833868505Z" level=info msg="StartContainer for \"f824de4cb9c1054f519459803a131b326e810a983582d1d9d0e40f4317ce52f6\"" Mar 7 01:08:00.861316 systemd[1]: run-containerd-runc-k8s.io-f824de4cb9c1054f519459803a131b326e810a983582d1d9d0e40f4317ce52f6-runc.1KAGEF.mount: Deactivated successfully. Mar 7 01:08:00.870312 systemd[1]: Started cri-containerd-f824de4cb9c1054f519459803a131b326e810a983582d1d9d0e40f4317ce52f6.scope - libcontainer container f824de4cb9c1054f519459803a131b326e810a983582d1d9d0e40f4317ce52f6. 
Mar 7 01:08:00.895932 containerd[1511]: time="2026-03-07T01:08:00.895907652Z" level=info msg="StartContainer for \"f824de4cb9c1054f519459803a131b326e810a983582d1d9d0e40f4317ce52f6\" returns successfully" Mar 7 01:08:01.232386 kubelet[2572]: I0307 01:08:01.232334 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 01:08:01.232386 kubelet[2572]: I0307 01:08:01.232379 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 01:08:01.497547 kubelet[2572]: I0307 01:08:01.497272 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-2x9rm" podStartSLOduration=33.149965963 podStartE2EDuration="42.497255666s" podCreationTimestamp="2026-03-07 01:07:19 +0000 UTC" firstStartedPulling="2026-03-07 01:07:51.467851976 +0000 UTC m=+49.427578716" lastFinishedPulling="2026-03-07 01:08:00.815141669 +0000 UTC m=+58.774868419" observedRunningTime="2026-03-07 01:08:01.496967023 +0000 UTC m=+59.456693814" watchObservedRunningTime="2026-03-07 01:08:01.497255666 +0000 UTC m=+59.456982436" Mar 7 01:08:01.498450 kubelet[2572]: I0307 01:08:01.497737 2572 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-6467698fbb-z99b9" podStartSLOduration=3.527709637 podStartE2EDuration="21.497730686s" podCreationTimestamp="2026-03-07 01:07:40 +0000 UTC" firstStartedPulling="2026-03-07 01:07:41.136450379 +0000 UTC m=+39.096177119" lastFinishedPulling="2026-03-07 01:07:59.106471417 +0000 UTC m=+57.066198168" observedRunningTime="2026-03-07 01:07:59.496542081 +0000 UTC m=+57.456268831" watchObservedRunningTime="2026-03-07 01:08:01.497730686 +0000 UTC m=+59.457457456" Mar 7 01:08:02.144485 containerd[1511]: time="2026-03-07T01:08:02.144165674Z" level=info msg="StopPodSandbox for 
\"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\"" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.213 [WARNING][5494] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"acf6180d-a049-431a-b0ff-bb13b9279e69", ResourceVersion:"1009", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588", Pod:"goldmane-9f7667bb8-85bqd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.59.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali68bb3f856ae", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.213 [INFO][5494] cni-plugin/k8s.go 652: Cleaning up netns 
ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.213 [INFO][5494] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" iface="eth0" netns="" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.213 [INFO][5494] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.213 [INFO][5494] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.247 [INFO][5502] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.248 [INFO][5502] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.248 [INFO][5502] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.252 [WARNING][5502] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.252 [INFO][5502] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.253 [INFO][5502] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.257046 containerd[1511]: 2026-03-07 01:08:02.255 [INFO][5494] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.257046 containerd[1511]: time="2026-03-07T01:08:02.257047975Z" level=info msg="TearDown network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\" successfully" Mar 7 01:08:02.257046 containerd[1511]: time="2026-03-07T01:08:02.257069208Z" level=info msg="StopPodSandbox for \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\" returns successfully" Mar 7 01:08:02.257526 containerd[1511]: time="2026-03-07T01:08:02.257513119Z" level=info msg="RemovePodSandbox for \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\"" Mar 7 01:08:02.257546 containerd[1511]: time="2026-03-07T01:08:02.257534542Z" level=info msg="Forcibly stopping sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\"" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.281 [WARNING][5516] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"acf6180d-a049-431a-b0ff-bb13b9279e69", ResourceVersion:"1009", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"af18f8a2f8f696f4c9672271e6cd646f10415154fa76d297368ae03737678588", Pod:"goldmane-9f7667bb8-85bqd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.59.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali68bb3f856ae", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.281 [INFO][5516] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.281 [INFO][5516] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" iface="eth0" netns="" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.281 [INFO][5516] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.281 [INFO][5516] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.297 [INFO][5523] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.297 [INFO][5523] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.297 [INFO][5523] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.302 [WARNING][5523] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.302 [INFO][5523] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" HandleID="k8s-pod-network.431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Workload="ci--4081--3--6--n--593f1c83d2-k8s-goldmane--9f7667bb8--85bqd-eth0" Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.303 [INFO][5523] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.307458 containerd[1511]: 2026-03-07 01:08:02.305 [INFO][5516] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07" Mar 7 01:08:02.307843 containerd[1511]: time="2026-03-07T01:08:02.307496383Z" level=info msg="TearDown network for sandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\" successfully" Mar 7 01:08:02.311431 containerd[1511]: time="2026-03-07T01:08:02.311278195Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:02.311431 containerd[1511]: time="2026-03-07T01:08:02.311331686Z" level=info msg="RemovePodSandbox \"431af279640805118772d934dcd4efe01ff3d05ff5d327928fca9031621a7f07\" returns successfully" Mar 7 01:08:02.311796 containerd[1511]: time="2026-03-07T01:08:02.311777491Z" level=info msg="StopPodSandbox for \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\"" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.344 [WARNING][5537] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.345 [INFO][5537] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.345 [INFO][5537] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" iface="eth0" netns="" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.345 [INFO][5537] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.345 [INFO][5537] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.359 [INFO][5544] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.359 [INFO][5544] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.359 [INFO][5544] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.365 [WARNING][5544] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.365 [INFO][5544] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.366 [INFO][5544] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.370224 containerd[1511]: 2026-03-07 01:08:02.368 [INFO][5537] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.370224 containerd[1511]: time="2026-03-07T01:08:02.370065364Z" level=info msg="TearDown network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\" successfully" Mar 7 01:08:02.370224 containerd[1511]: time="2026-03-07T01:08:02.370094960Z" level=info msg="StopPodSandbox for \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\" returns successfully" Mar 7 01:08:02.371899 containerd[1511]: time="2026-03-07T01:08:02.370696713Z" level=info msg="RemovePodSandbox for \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\"" Mar 7 01:08:02.371899 containerd[1511]: time="2026-03-07T01:08:02.370729233Z" level=info msg="Forcibly stopping sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\"" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.408 [WARNING][5558] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" WorkloadEndpoint="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.409 [INFO][5558] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.409 [INFO][5558] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" iface="eth0" netns="" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.409 [INFO][5558] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.409 [INFO][5558] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.427 [INFO][5568] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.427 [INFO][5568] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.427 [INFO][5568] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.431 [WARNING][5568] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.431 [INFO][5568] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" HandleID="k8s-pod-network.035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Workload="ci--4081--3--6--n--593f1c83d2-k8s-whisker--764d867d9c--fngzt-eth0" Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.433 [INFO][5568] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.437078 containerd[1511]: 2026-03-07 01:08:02.435 [INFO][5558] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f" Mar 7 01:08:02.437813 containerd[1511]: time="2026-03-07T01:08:02.437055717Z" level=info msg="TearDown network for sandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\" successfully" Mar 7 01:08:02.441159 containerd[1511]: time="2026-03-07T01:08:02.441130929Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:02.441159 containerd[1511]: time="2026-03-07T01:08:02.441178131Z" level=info msg="RemovePodSandbox \"035135f4373bd62ba1b2fb42e5ee10ca5c6dbfc54d60f42b73594708cf0c565f\" returns successfully" Mar 7 01:08:02.441662 containerd[1511]: time="2026-03-07T01:08:02.441627481Z" level=info msg="StopPodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\"" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.467 [WARNING][5582] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0869cd07-08c1-477a-a901-3ac76e743a03", ResourceVersion:"1069", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d", Pod:"csi-node-driver-2x9rm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.59.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7bdc783d202", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.468 [INFO][5582] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.468 [INFO][5582] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" iface="eth0" netns="" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.468 [INFO][5582] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.468 [INFO][5582] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.486 [INFO][5589] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.486 [INFO][5589] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.486 [INFO][5589] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.496 [WARNING][5589] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.496 [INFO][5589] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.497 [INFO][5589] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.502077 containerd[1511]: 2026-03-07 01:08:02.500 [INFO][5582] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.502843 containerd[1511]: time="2026-03-07T01:08:02.502114328Z" level=info msg="TearDown network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" successfully" Mar 7 01:08:02.502843 containerd[1511]: time="2026-03-07T01:08:02.502136392Z" level=info msg="StopPodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" returns successfully" Mar 7 01:08:02.502843 containerd[1511]: time="2026-03-07T01:08:02.502567493Z" level=info msg="RemovePodSandbox for \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\"" Mar 7 01:08:02.502843 containerd[1511]: time="2026-03-07T01:08:02.502589526Z" level=info msg="Forcibly stopping sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\"" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.529 [WARNING][5603] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0869cd07-08c1-477a-a901-3ac76e743a03", ResourceVersion:"1069", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"c1f41e20afaef05641626c57f1d2775c3bc692235eeb82777850086b479c7b8d", Pod:"csi-node-driver-2x9rm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.59.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7bdc783d202", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.529 [INFO][5603] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.529 [INFO][5603] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" iface="eth0" netns="" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.529 [INFO][5603] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.529 [INFO][5603] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.546 [INFO][5610] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.546 [INFO][5610] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.546 [INFO][5610] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.550 [WARNING][5610] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.551 [INFO][5610] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" HandleID="k8s-pod-network.a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Workload="ci--4081--3--6--n--593f1c83d2-k8s-csi--node--driver--2x9rm-eth0" Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.552 [INFO][5610] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.556964 containerd[1511]: 2026-03-07 01:08:02.554 [INFO][5603] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741" Mar 7 01:08:02.557364 containerd[1511]: time="2026-03-07T01:08:02.556990949Z" level=info msg="TearDown network for sandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" successfully" Mar 7 01:08:02.562862 containerd[1511]: time="2026-03-07T01:08:02.562836766Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:02.562954 containerd[1511]: time="2026-03-07T01:08:02.562938082Z" level=info msg="RemovePodSandbox \"a69873db43c016319dc85e9e7cdb26f1cd3cae19937e392e63434a1d68a9f741\" returns successfully" Mar 7 01:08:02.563350 containerd[1511]: time="2026-03-07T01:08:02.563328430Z" level=info msg="StopPodSandbox for \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\"" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.592 [WARNING][5624] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6afa7af7-0921-4201-99cd-3abd2378e890", ResourceVersion:"944", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91", Pod:"coredns-7d764666f9-f9xbn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif2882f848af", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.592 [INFO][5624] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.592 [INFO][5624] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" iface="eth0" netns="" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.592 [INFO][5624] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.592 [INFO][5624] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.611 [INFO][5631] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.611 [INFO][5631] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.612 [INFO][5631] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.617 [WARNING][5631] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.617 [INFO][5631] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.618 [INFO][5631] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.622569 containerd[1511]: 2026-03-07 01:08:02.620 [INFO][5624] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.622569 containerd[1511]: time="2026-03-07T01:08:02.622439517Z" level=info msg="TearDown network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\" successfully" Mar 7 01:08:02.622569 containerd[1511]: time="2026-03-07T01:08:02.622470625Z" level=info msg="StopPodSandbox for \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\" returns successfully" Mar 7 01:08:02.622974 containerd[1511]: time="2026-03-07T01:08:02.622955158Z" level=info msg="RemovePodSandbox for \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\"" Mar 7 01:08:02.622993 containerd[1511]: time="2026-03-07T01:08:02.622976771Z" level=info msg="Forcibly stopping sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\"" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.651 [WARNING][5646] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"6afa7af7-0921-4201-99cd-3abd2378e890", ResourceVersion:"944", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"a71086e1585a9c56888c650f89b21f39074c9ba4a6f530ae4b43fb6c275cee91", Pod:"coredns-7d764666f9-f9xbn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif2882f848af", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.653 [INFO][5646] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.653 [INFO][5646] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" iface="eth0" netns="" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.653 [INFO][5646] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.653 [INFO][5646] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.669 [INFO][5653] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.669 [INFO][5653] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.669 [INFO][5653] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.674 [WARNING][5653] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.674 [INFO][5653] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" HandleID="k8s-pod-network.e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--f9xbn-eth0" Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.675 [INFO][5653] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.678818 containerd[1511]: 2026-03-07 01:08:02.676 [INFO][5646] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe" Mar 7 01:08:02.679173 containerd[1511]: time="2026-03-07T01:08:02.678856011Z" level=info msg="TearDown network for sandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\" successfully" Mar 7 01:08:02.682619 containerd[1511]: time="2026-03-07T01:08:02.682588365Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:02.682695 containerd[1511]: time="2026-03-07T01:08:02.682645423Z" level=info msg="RemovePodSandbox \"e849b915ba22a919dbee0cc6ebdb8c9a5c3dbb30a1590ebeb7dbf66c0243eefe\" returns successfully" Mar 7 01:08:02.683167 containerd[1511]: time="2026-03-07T01:08:02.683151740Z" level=info msg="StopPodSandbox for \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\"" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.708 [WARNING][5668] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d25ec7da-6038-4c24-89dc-467334f42af3", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f", Pod:"coredns-7d764666f9-qsb6f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali340665de010", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.709 [INFO][5668] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.709 [INFO][5668] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" iface="eth0" netns="" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.709 [INFO][5668] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.709 [INFO][5668] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.725 [INFO][5675] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.725 [INFO][5675] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.725 [INFO][5675] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.730 [WARNING][5675] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.730 [INFO][5675] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.731 [INFO][5675] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.736054 containerd[1511]: 2026-03-07 01:08:02.733 [INFO][5668] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.736054 containerd[1511]: time="2026-03-07T01:08:02.735820834Z" level=info msg="TearDown network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\" successfully" Mar 7 01:08:02.736054 containerd[1511]: time="2026-03-07T01:08:02.735847806Z" level=info msg="StopPodSandbox for \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\" returns successfully" Mar 7 01:08:02.737585 containerd[1511]: time="2026-03-07T01:08:02.737555786Z" level=info msg="RemovePodSandbox for \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\"" Mar 7 01:08:02.737644 containerd[1511]: time="2026-03-07T01:08:02.737588175Z" level=info msg="Forcibly stopping sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\"" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.768 [WARNING][5690] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"d25ec7da-6038-4c24-89dc-467334f42af3", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"45be65c54806aeb4fdfa2994a6a0579223a0662d6458c39f4e280ca95611e99f", Pod:"coredns-7d764666f9-qsb6f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali340665de010", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.768 [INFO][5690] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.768 [INFO][5690] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" iface="eth0" netns="" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.768 [INFO][5690] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.768 [INFO][5690] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.792 [INFO][5697] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.793 [INFO][5697] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.793 [INFO][5697] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.797 [WARNING][5697] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.797 [INFO][5697] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" HandleID="k8s-pod-network.4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Workload="ci--4081--3--6--n--593f1c83d2-k8s-coredns--7d764666f9--qsb6f-eth0" Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.798 [INFO][5697] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.803835 containerd[1511]: 2026-03-07 01:08:02.801 [INFO][5690] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e" Mar 7 01:08:02.804278 containerd[1511]: time="2026-03-07T01:08:02.803871935Z" level=info msg="TearDown network for sandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\" successfully" Mar 7 01:08:02.807086 containerd[1511]: time="2026-03-07T01:08:02.807049790Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:02.807141 containerd[1511]: time="2026-03-07T01:08:02.807093767Z" level=info msg="RemovePodSandbox \"4b27fed1e8961387c3d72f2ff67b3248408bf0cd201cbe9993b0bee9b2528d9e\" returns successfully" Mar 7 01:08:02.807502 containerd[1511]: time="2026-03-07T01:08:02.807471927Z" level=info msg="StopPodSandbox for \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\"" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.832 [WARNING][5711] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"408b461e-6520-4c86-a37a-f5d60a0218ca", ResourceVersion:"1023", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb", Pod:"calico-apiserver-6f7869694b-rjchv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali76534bbf384", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.832 [INFO][5711] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.832 [INFO][5711] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" iface="eth0" netns="" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.832 [INFO][5711] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.832 [INFO][5711] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.850 [INFO][5718] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.850 [INFO][5718] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.851 [INFO][5718] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.855 [WARNING][5718] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.855 [INFO][5718] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.857 [INFO][5718] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.861394 containerd[1511]: 2026-03-07 01:08:02.859 [INFO][5711] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.861394 containerd[1511]: time="2026-03-07T01:08:02.861289204Z" level=info msg="TearDown network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\" successfully" Mar 7 01:08:02.861394 containerd[1511]: time="2026-03-07T01:08:02.861309745Z" level=info msg="StopPodSandbox for \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\" returns successfully" Mar 7 01:08:02.862218 containerd[1511]: time="2026-03-07T01:08:02.861860850Z" level=info msg="RemovePodSandbox for \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\"" Mar 7 01:08:02.862218 containerd[1511]: time="2026-03-07T01:08:02.861880190Z" level=info msg="Forcibly stopping sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\"" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.887 [WARNING][5732] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"408b461e-6520-4c86-a37a-f5d60a0218ca", ResourceVersion:"1023", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"b7fc96c1a915824d21914d040bc1225af364387c99e81216d7cdbdfea0162fdb", Pod:"calico-apiserver-6f7869694b-rjchv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali76534bbf384", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.887 [INFO][5732] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.887 [INFO][5732] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with 
no netns name, ignoring. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" iface="eth0" netns="" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.887 [INFO][5732] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.887 [INFO][5732] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.903 [INFO][5739] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.903 [INFO][5739] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.903 [INFO][5739] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.908 [WARNING][5739] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.908 [INFO][5739] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" HandleID="k8s-pod-network.eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--rjchv-eth0" Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.910 [INFO][5739] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.913411 containerd[1511]: 2026-03-07 01:08:02.911 [INFO][5732] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735" Mar 7 01:08:02.913828 containerd[1511]: time="2026-03-07T01:08:02.913787806Z" level=info msg="TearDown network for sandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\" successfully" Mar 7 01:08:02.917594 containerd[1511]: time="2026-03-07T01:08:02.917554184Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:02.917663 containerd[1511]: time="2026-03-07T01:08:02.917601537Z" level=info msg="RemovePodSandbox \"eb491c23028908b2b281a6f1b9fdb96b1af9be8317cb2f3db70bcf84fc677735\" returns successfully" Mar 7 01:08:02.918016 containerd[1511]: time="2026-03-07T01:08:02.918002372Z" level=info msg="StopPodSandbox for \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\"" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.943 [WARNING][5753] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0", GenerateName:"calico-kube-controllers-c8d76dd9d-", Namespace:"calico-system", SelfLink:"", UID:"57352416-50b9-4228-9b61-6bb3108b4b1a", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c8d76dd9d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2", Pod:"calico-kube-controllers-c8d76dd9d-hqr58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.59.2/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali92268a984ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.944 [INFO][5753] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.944 [INFO][5753] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" iface="eth0" netns="" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.944 [INFO][5753] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.944 [INFO][5753] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.961 [INFO][5760] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.961 [INFO][5760] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.961 [INFO][5760] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.965 [WARNING][5760] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.965 [INFO][5760] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.967 [INFO][5760] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:02.972005 containerd[1511]: 2026-03-07 01:08:02.969 [INFO][5753] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:02.973178 containerd[1511]: time="2026-03-07T01:08:02.972046796Z" level=info msg="TearDown network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\" successfully" Mar 7 01:08:02.973178 containerd[1511]: time="2026-03-07T01:08:02.972077954Z" level=info msg="StopPodSandbox for \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\" returns successfully" Mar 7 01:08:02.973178 containerd[1511]: time="2026-03-07T01:08:02.972494643Z" level=info msg="RemovePodSandbox for \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\"" Mar 7 01:08:02.973178 containerd[1511]: time="2026-03-07T01:08:02.972518069Z" level=info msg="Forcibly stopping sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\"" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:02.999 [WARNING][5774] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0", GenerateName:"calico-kube-controllers-c8d76dd9d-", Namespace:"calico-system", SelfLink:"", UID:"57352416-50b9-4228-9b61-6bb3108b4b1a", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c8d76dd9d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"fbf98b5a6565959403896bf5ae91589ab059cb821fc7e07d0f6812d2b52831e2", Pod:"calico-kube-controllers-c8d76dd9d-hqr58", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.59.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali92268a984ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:02.999 [INFO][5774] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:02.999 [INFO][5774] cni-plugin/dataplane_linux.go 
555: CleanUpNamespace called with no netns name, ignoring. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" iface="eth0" netns="" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:02.999 [INFO][5774] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:02.999 [INFO][5774] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.017 [INFO][5782] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.018 [INFO][5782] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.018 [INFO][5782] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.022 [WARNING][5782] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.022 [INFO][5782] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" HandleID="k8s-pod-network.58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--kube--controllers--c8d76dd9d--hqr58-eth0" Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.023 [INFO][5782] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:03.028609 containerd[1511]: 2026-03-07 01:08:03.025 [INFO][5774] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05" Mar 7 01:08:03.028609 containerd[1511]: time="2026-03-07T01:08:03.027675191Z" level=info msg="TearDown network for sandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\" successfully" Mar 7 01:08:03.031711 containerd[1511]: time="2026-03-07T01:08:03.031672891Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:03.031711 containerd[1511]: time="2026-03-07T01:08:03.031740314Z" level=info msg="RemovePodSandbox \"58b20a1cc5b6d94384fb6f07be129f097d0a725996fb7c0cf5880f66552d6a05\" returns successfully" Mar 7 01:08:03.032326 containerd[1511]: time="2026-03-07T01:08:03.032290888Z" level=info msg="StopPodSandbox for \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\"" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.064 [WARNING][5797] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"ffcd3157-37de-4e8c-903c-bcf1e17ebdb3", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714", Pod:"calico-apiserver-6f7869694b-75lrk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali239e11d89ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.064 [INFO][5797] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.064 [INFO][5797] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" iface="eth0" netns="" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.064 [INFO][5797] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.064 [INFO][5797] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.080 [INFO][5805] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.081 [INFO][5805] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.081 [INFO][5805] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.086 [WARNING][5805] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.086 [INFO][5805] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.088 [INFO][5805] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:03.092304 containerd[1511]: 2026-03-07 01:08:03.090 [INFO][5797] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.093321 containerd[1511]: time="2026-03-07T01:08:03.092343867Z" level=info msg="TearDown network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\" successfully" Mar 7 01:08:03.093321 containerd[1511]: time="2026-03-07T01:08:03.092365931Z" level=info msg="StopPodSandbox for \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\" returns successfully" Mar 7 01:08:03.093321 containerd[1511]: time="2026-03-07T01:08:03.092826506Z" level=info msg="RemovePodSandbox for \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\"" Mar 7 01:08:03.093321 containerd[1511]: time="2026-03-07T01:08:03.092853087Z" level=info msg="Forcibly stopping sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\"" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.124 [WARNING][5820] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0", GenerateName:"calico-apiserver-6f7869694b-", Namespace:"calico-system", SelfLink:"", UID:"ffcd3157-37de-4e8c-903c-bcf1e17ebdb3", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 7, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f7869694b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-593f1c83d2", ContainerID:"d2cacd340d23a69f0dba09881b317332f7598fd150444f90cf3400779ca01714", Pod:"calico-apiserver-6f7869694b-75lrk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali239e11d89ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.125 [INFO][5820] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.125 [INFO][5820] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with 
no netns name, ignoring. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" iface="eth0" netns="" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.125 [INFO][5820] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.125 [INFO][5820] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.145 [INFO][5828] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.145 [INFO][5828] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.145 [INFO][5828] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.150 [WARNING][5828] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.150 [INFO][5828] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" HandleID="k8s-pod-network.a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Workload="ci--4081--3--6--n--593f1c83d2-k8s-calico--apiserver--6f7869694b--75lrk-eth0" Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.151 [INFO][5828] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:08:03.155780 containerd[1511]: 2026-03-07 01:08:03.153 [INFO][5820] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7" Mar 7 01:08:03.156372 containerd[1511]: time="2026-03-07T01:08:03.155865433Z" level=info msg="TearDown network for sandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\" successfully" Mar 7 01:08:03.160408 containerd[1511]: time="2026-03-07T01:08:03.159773816Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:08:03.160408 containerd[1511]: time="2026-03-07T01:08:03.159829762Z" level=info msg="RemovePodSandbox \"a9a6e3a69ba801170c55f263a8b33b036528912605b9c28bfc5a229f3b6e8db7\" returns successfully" Mar 7 01:08:33.858372 kubelet[2572]: I0307 01:08:33.858037 2572 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:08:34.810810 systemd[1]: Started sshd@7-204.168.152.184:22-4.153.228.146:38760.service - OpenSSH per-connection server daemon (4.153.228.146:38760). Mar 7 01:08:35.597735 sshd[5964]: Accepted publickey for core from 4.153.228.146 port 38760 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:35.601446 sshd[5964]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:35.612306 systemd-logind[1489]: New session 8 of user core. Mar 7 01:08:35.621419 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 7 01:08:36.242824 sshd[5964]: pam_unix(sshd:session): session closed for user core Mar 7 01:08:36.248960 systemd[1]: sshd@7-204.168.152.184:22-4.153.228.146:38760.service: Deactivated successfully. Mar 7 01:08:36.252143 systemd[1]: session-8.scope: Deactivated successfully. Mar 7 01:08:36.255273 systemd-logind[1489]: Session 8 logged out. Waiting for processes to exit. Mar 7 01:08:36.257093 systemd-logind[1489]: Removed session 8. Mar 7 01:08:41.380664 systemd[1]: Started sshd@8-204.168.152.184:22-4.153.228.146:49724.service - OpenSSH per-connection server daemon (4.153.228.146:49724). Mar 7 01:08:42.159080 sshd[5981]: Accepted publickey for core from 4.153.228.146 port 49724 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:42.162093 sshd[5981]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:42.169913 systemd-logind[1489]: New session 9 of user core. Mar 7 01:08:42.177422 systemd[1]: Started session-9.scope - Session 9 of User core. 
Mar 7 01:08:42.771947 sshd[5981]: pam_unix(sshd:session): session closed for user core Mar 7 01:08:42.779150 systemd-logind[1489]: Session 9 logged out. Waiting for processes to exit. Mar 7 01:08:42.780529 systemd[1]: sshd@8-204.168.152.184:22-4.153.228.146:49724.service: Deactivated successfully. Mar 7 01:08:42.785091 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 01:08:42.787529 systemd-logind[1489]: Removed session 9. Mar 7 01:08:44.418353 systemd[1]: run-containerd-runc-k8s.io-cfd8e81fce8e6503b2fbfcc561a6aa0ad9f0f00dac806ca7278eefc371d1f7ff-runc.TUlqjF.mount: Deactivated successfully. Mar 7 01:08:47.910579 systemd[1]: Started sshd@9-204.168.152.184:22-4.153.228.146:49738.service - OpenSSH per-connection server daemon (4.153.228.146:49738). Mar 7 01:08:48.666495 sshd[6034]: Accepted publickey for core from 4.153.228.146 port 49738 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:48.667024 sshd[6034]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:48.674448 systemd-logind[1489]: New session 10 of user core. Mar 7 01:08:48.678312 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 01:08:49.258736 sshd[6034]: pam_unix(sshd:session): session closed for user core Mar 7 01:08:49.265626 systemd[1]: sshd@9-204.168.152.184:22-4.153.228.146:49738.service: Deactivated successfully. Mar 7 01:08:49.269497 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 01:08:49.271304 systemd-logind[1489]: Session 10 logged out. Waiting for processes to exit. Mar 7 01:08:49.273360 systemd-logind[1489]: Removed session 10. Mar 7 01:08:49.398085 systemd[1]: Started sshd@10-204.168.152.184:22-4.153.228.146:50628.service - OpenSSH per-connection server daemon (4.153.228.146:50628). 
Mar 7 01:08:50.149167 sshd[6048]: Accepted publickey for core from 4.153.228.146 port 50628 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:50.152469 sshd[6048]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:50.161063 systemd-logind[1489]: New session 11 of user core. Mar 7 01:08:50.170410 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 01:08:50.764902 sshd[6048]: pam_unix(sshd:session): session closed for user core Mar 7 01:08:50.768581 systemd-logind[1489]: Session 11 logged out. Waiting for processes to exit. Mar 7 01:08:50.769574 systemd[1]: sshd@10-204.168.152.184:22-4.153.228.146:50628.service: Deactivated successfully. Mar 7 01:08:50.771736 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 01:08:50.773318 systemd-logind[1489]: Removed session 11. Mar 7 01:08:50.900099 systemd[1]: Started sshd@11-204.168.152.184:22-4.153.228.146:50644.service - OpenSSH per-connection server daemon (4.153.228.146:50644). Mar 7 01:08:51.651884 sshd[6075]: Accepted publickey for core from 4.153.228.146 port 50644 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:51.656923 sshd[6075]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:51.662589 systemd-logind[1489]: New session 12 of user core. Mar 7 01:08:51.669324 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 7 01:08:52.226579 sshd[6075]: pam_unix(sshd:session): session closed for user core Mar 7 01:08:52.232309 systemd[1]: sshd@11-204.168.152.184:22-4.153.228.146:50644.service: Deactivated successfully. Mar 7 01:08:52.234669 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 01:08:52.235744 systemd-logind[1489]: Session 12 logged out. Waiting for processes to exit. Mar 7 01:08:52.236665 systemd-logind[1489]: Removed session 12. 
Mar 7 01:08:57.365608 systemd[1]: Started sshd@12-204.168.152.184:22-4.153.228.146:50654.service - OpenSSH per-connection server daemon (4.153.228.146:50654). Mar 7 01:08:58.116250 sshd[6108]: Accepted publickey for core from 4.153.228.146 port 50654 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:58.118277 sshd[6108]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:58.126778 systemd-logind[1489]: New session 13 of user core. Mar 7 01:08:58.132465 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 7 01:08:58.690551 sshd[6108]: pam_unix(sshd:session): session closed for user core Mar 7 01:08:58.696165 systemd[1]: sshd@12-204.168.152.184:22-4.153.228.146:50654.service: Deactivated successfully. Mar 7 01:08:58.700132 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 01:08:58.701428 systemd-logind[1489]: Session 13 logged out. Waiting for processes to exit. Mar 7 01:08:58.704180 systemd-logind[1489]: Removed session 13. Mar 7 01:08:58.828325 systemd[1]: Started sshd@13-204.168.152.184:22-4.153.228.146:50658.service - OpenSSH per-connection server daemon (4.153.228.146:50658). Mar 7 01:08:59.580243 sshd[6140]: Accepted publickey for core from 4.153.228.146 port 50658 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:08:59.580946 sshd[6140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:08:59.589318 systemd-logind[1489]: New session 14 of user core. Mar 7 01:08:59.597441 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 01:09:00.423996 sshd[6140]: pam_unix(sshd:session): session closed for user core Mar 7 01:09:00.431089 systemd[1]: sshd@13-204.168.152.184:22-4.153.228.146:50658.service: Deactivated successfully. Mar 7 01:09:00.435085 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 01:09:00.436441 systemd-logind[1489]: Session 14 logged out. Waiting for processes to exit. 
Mar 7 01:09:00.438670 systemd-logind[1489]: Removed session 14.
Mar 7 01:09:00.558622 systemd[1]: Started sshd@14-204.168.152.184:22-4.153.228.146:56912.service - OpenSSH per-connection server daemon (4.153.228.146:56912).
Mar 7 01:09:01.318757 sshd[6151]: Accepted publickey for core from 4.153.228.146 port 56912 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:09:01.319306 sshd[6151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:09:01.323649 systemd-logind[1489]: New session 15 of user core.
Mar 7 01:09:01.326303 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 7 01:09:02.535591 sshd[6151]: pam_unix(sshd:session): session closed for user core
Mar 7 01:09:02.538880 systemd-logind[1489]: Session 15 logged out. Waiting for processes to exit.
Mar 7 01:09:02.539257 systemd[1]: sshd@14-204.168.152.184:22-4.153.228.146:56912.service: Deactivated successfully.
Mar 7 01:09:02.541162 systemd[1]: session-15.scope: Deactivated successfully.
Mar 7 01:09:02.541949 systemd-logind[1489]: Removed session 15.
Mar 7 01:09:02.675647 systemd[1]: Started sshd@15-204.168.152.184:22-4.153.228.146:56918.service - OpenSSH per-connection server daemon (4.153.228.146:56918).
Mar 7 01:09:03.435569 sshd[6181]: Accepted publickey for core from 4.153.228.146 port 56918 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:09:03.438768 sshd[6181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:09:03.449532 systemd-logind[1489]: New session 16 of user core.
Mar 7 01:09:03.458486 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 7 01:09:04.125124 sshd[6181]: pam_unix(sshd:session): session closed for user core
Mar 7 01:09:04.130376 systemd[1]: sshd@15-204.168.152.184:22-4.153.228.146:56918.service: Deactivated successfully.
Mar 7 01:09:04.134812 systemd[1]: session-16.scope: Deactivated successfully.
Mar 7 01:09:04.137622 systemd-logind[1489]: Session 16 logged out. Waiting for processes to exit.
Mar 7 01:09:04.140785 systemd-logind[1489]: Removed session 16.
Mar 7 01:09:04.261042 systemd[1]: Started sshd@16-204.168.152.184:22-4.153.228.146:56934.service - OpenSSH per-connection server daemon (4.153.228.146:56934).
Mar 7 01:09:05.012150 sshd[6194]: Accepted publickey for core from 4.153.228.146 port 56934 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:09:05.013603 sshd[6194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:09:05.021688 systemd-logind[1489]: New session 17 of user core.
Mar 7 01:09:05.031495 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 7 01:09:05.612141 sshd[6194]: pam_unix(sshd:session): session closed for user core
Mar 7 01:09:05.619679 systemd[1]: sshd@16-204.168.152.184:22-4.153.228.146:56934.service: Deactivated successfully.
Mar 7 01:09:05.623535 systemd[1]: session-17.scope: Deactivated successfully.
Mar 7 01:09:05.625875 systemd-logind[1489]: Session 17 logged out. Waiting for processes to exit.
Mar 7 01:09:05.627940 systemd-logind[1489]: Removed session 17.
Mar 7 01:09:10.748634 systemd[1]: Started sshd@17-204.168.152.184:22-4.153.228.146:40450.service - OpenSSH per-connection server daemon (4.153.228.146:40450).
Mar 7 01:09:11.485859 sshd[6222]: Accepted publickey for core from 4.153.228.146 port 40450 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:09:11.487182 sshd[6222]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:09:11.493183 systemd-logind[1489]: New session 18 of user core.
Mar 7 01:09:11.495400 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 7 01:09:12.071315 sshd[6222]: pam_unix(sshd:session): session closed for user core
Mar 7 01:09:12.076183 systemd[1]: sshd@17-204.168.152.184:22-4.153.228.146:40450.service: Deactivated successfully.
Mar 7 01:09:12.080101 systemd[1]: session-18.scope: Deactivated successfully.
Mar 7 01:09:12.081604 systemd-logind[1489]: Session 18 logged out. Waiting for processes to exit.
Mar 7 01:09:12.083554 systemd-logind[1489]: Removed session 18.
Mar 7 01:09:17.214628 systemd[1]: Started sshd@18-204.168.152.184:22-4.153.228.146:40462.service - OpenSSH per-connection server daemon (4.153.228.146:40462).
Mar 7 01:09:17.979793 sshd[6281]: Accepted publickey for core from 4.153.228.146 port 40462 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:09:17.982677 sshd[6281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:09:17.990691 systemd-logind[1489]: New session 19 of user core.
Mar 7 01:09:17.999435 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 7 01:09:18.599243 sshd[6281]: pam_unix(sshd:session): session closed for user core
Mar 7 01:09:18.605419 systemd[1]: sshd@18-204.168.152.184:22-4.153.228.146:40462.service: Deactivated successfully.
Mar 7 01:09:18.609549 systemd[1]: session-19.scope: Deactivated successfully.
Mar 7 01:09:18.612949 systemd-logind[1489]: Session 19 logged out. Waiting for processes to exit.
Mar 7 01:09:18.615187 systemd-logind[1489]: Removed session 19.
Mar 7 01:09:35.785532 systemd[1]: cri-containerd-5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800.scope: Deactivated successfully.
Mar 7 01:09:35.787368 systemd[1]: cri-containerd-5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800.scope: Consumed 3.340s CPU time, 18.0M memory peak, 0B memory swap peak.
Mar 7 01:09:35.838226 containerd[1511]: time="2026-03-07T01:09:35.835503968Z" level=info msg="shim disconnected" id=5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800 namespace=k8s.io
Mar 7 01:09:35.838226 containerd[1511]: time="2026-03-07T01:09:35.835568625Z" level=warning msg="cleaning up after shim disconnected" id=5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800 namespace=k8s.io
Mar 7 01:09:35.838226 containerd[1511]: time="2026-03-07T01:09:35.835583187Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:09:35.842020 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800-rootfs.mount: Deactivated successfully.
Mar 7 01:09:36.019230 kubelet[2572]: E0307 01:09:36.019088 2572 controller.go:251] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47990->10.0.0.2:2379: read: connection timed out"
Mar 7 01:09:36.302383 systemd[1]: cri-containerd-bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565.scope: Deactivated successfully.
Mar 7 01:09:36.303501 systemd[1]: cri-containerd-bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565.scope: Consumed 7.921s CPU time.
Mar 7 01:09:36.340785 containerd[1511]: time="2026-03-07T01:09:36.340682992Z" level=info msg="shim disconnected" id=bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565 namespace=k8s.io
Mar 7 01:09:36.340785 containerd[1511]: time="2026-03-07T01:09:36.340768201Z" level=warning msg="cleaning up after shim disconnected" id=bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565 namespace=k8s.io
Mar 7 01:09:36.340785 containerd[1511]: time="2026-03-07T01:09:36.340784165Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:09:36.345877 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565-rootfs.mount: Deactivated successfully.
Mar 7 01:09:36.749102 kubelet[2572]: I0307 01:09:36.748693 2572 scope.go:122] "RemoveContainer" containerID="5a52644178bd6b7280bf0af6e209ff785d77a3d7cdafea2f9d0c3431bdb9e800"
Mar 7 01:09:36.752116 containerd[1511]: time="2026-03-07T01:09:36.752051293Z" level=info msg="CreateContainer within sandbox \"4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 7 01:09:36.752794 kubelet[2572]: I0307 01:09:36.752745 2572 scope.go:122] "RemoveContainer" containerID="bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565"
Mar 7 01:09:36.756009 containerd[1511]: time="2026-03-07T01:09:36.755878323Z" level=info msg="CreateContainer within sandbox \"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 7 01:09:36.774090 containerd[1511]: time="2026-03-07T01:09:36.771850831Z" level=info msg="CreateContainer within sandbox \"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb\""
Mar 7 01:09:36.774421 containerd[1511]: time="2026-03-07T01:09:36.774388735Z" level=info msg="StartContainer for \"c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb\""
Mar 7 01:09:36.778916 containerd[1511]: time="2026-03-07T01:09:36.778832472Z" level=info msg="CreateContainer within sandbox \"4846f1466b6c3d751b9c00067bebf32d9ec4b53e5a103f0adffdc801f97c0d15\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bec981811a9e270ef4e4d80666da4bd26f8ab6227d9a85c86d5509e7fe0638fa\""
Mar 7 01:09:36.779792 containerd[1511]: time="2026-03-07T01:09:36.779738943Z" level=info msg="StartContainer for \"bec981811a9e270ef4e4d80666da4bd26f8ab6227d9a85c86d5509e7fe0638fa\""
Mar 7 01:09:36.823415 systemd[1]: Started cri-containerd-c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb.scope - libcontainer container c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb.
Mar 7 01:09:36.826896 systemd[1]: Started cri-containerd-bec981811a9e270ef4e4d80666da4bd26f8ab6227d9a85c86d5509e7fe0638fa.scope - libcontainer container bec981811a9e270ef4e4d80666da4bd26f8ab6227d9a85c86d5509e7fe0638fa.
Mar 7 01:09:36.865485 containerd[1511]: time="2026-03-07T01:09:36.865413971Z" level=info msg="StartContainer for \"c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb\" returns successfully"
Mar 7 01:09:36.878404 containerd[1511]: time="2026-03-07T01:09:36.877715604Z" level=info msg="StartContainer for \"bec981811a9e270ef4e4d80666da4bd26f8ab6227d9a85c86d5509e7fe0638fa\" returns successfully"
Mar 7 01:09:38.569924 systemd[1]: cri-containerd-c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb.scope: Deactivated successfully.
Mar 7 01:09:38.616551 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb-rootfs.mount: Deactivated successfully.
Mar 7 01:09:38.625248 containerd[1511]: time="2026-03-07T01:09:38.625132581Z" level=info msg="shim disconnected" id=c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb namespace=k8s.io
Mar 7 01:09:38.625248 containerd[1511]: time="2026-03-07T01:09:38.625239423Z" level=warning msg="cleaning up after shim disconnected" id=c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb namespace=k8s.io
Mar 7 01:09:38.626425 containerd[1511]: time="2026-03-07T01:09:38.625256168Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:09:38.767447 kubelet[2572]: I0307 01:09:38.767162 2572 scope.go:122] "RemoveContainer" containerID="bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565"
Mar 7 01:09:38.768252 kubelet[2572]: I0307 01:09:38.767648 2572 scope.go:122] "RemoveContainer" containerID="c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb"
Mar 7 01:09:38.768252 kubelet[2572]: E0307 01:09:38.767827 2572 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6cf4cccc57-kpvhj_tigera-operator(8c6c21c7-f042-4f30-b2bc-57048b6b4a4c)\"" pod="tigera-operator/tigera-operator-6cf4cccc57-kpvhj" podUID="8c6c21c7-f042-4f30-b2bc-57048b6b4a4c"
Mar 7 01:09:38.769827 containerd[1511]: time="2026-03-07T01:09:38.769704980Z" level=info msg="RemoveContainer for \"bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565\""
Mar 7 01:09:38.776063 containerd[1511]: time="2026-03-07T01:09:38.775977418Z" level=info msg="RemoveContainer for \"bebcf63f8b2070de6306897606c5ba9d599db7890dc710512037db3c5e171565\" returns successfully"
Mar 7 01:09:40.115407 kubelet[2572]: E0307 01:09:40.112499 2572 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47628->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-6-n-593f1c83d2.189a69d5a5f9b092 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-6-n-593f1c83d2,UID:881450a00fe9f34600feb3266356da22,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-593f1c83d2,},FirstTimestamp:2026-03-07 01:09:29.657266322 +0000 UTC m=+147.616993102,LastTimestamp:2026-03-07 01:09:29.657266322 +0000 UTC m=+147.616993102,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-593f1c83d2,}"
Mar 7 01:09:40.666951 systemd[1]: cri-containerd-3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7.scope: Deactivated successfully.
Mar 7 01:09:40.667465 systemd[1]: cri-containerd-3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7.scope: Consumed 1.535s CPU time, 16.3M memory peak, 0B memory swap peak.
Mar 7 01:09:40.709418 containerd[1511]: time="2026-03-07T01:09:40.707289663Z" level=info msg="shim disconnected" id=3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7 namespace=k8s.io
Mar 7 01:09:40.709418 containerd[1511]: time="2026-03-07T01:09:40.709411593Z" level=warning msg="cleaning up after shim disconnected" id=3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7 namespace=k8s.io
Mar 7 01:09:40.709418 containerd[1511]: time="2026-03-07T01:09:40.709432034Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 7 01:09:40.714156 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7-rootfs.mount: Deactivated successfully.
Mar 7 01:09:40.777706 kubelet[2572]: I0307 01:09:40.777414 2572 scope.go:122] "RemoveContainer" containerID="3f547db23e5f58392e323f5d30c1bcf8fbdf64d75858d49c895405e1af795fb7"
Mar 7 01:09:40.781469 containerd[1511]: time="2026-03-07T01:09:40.781083936Z" level=info msg="CreateContainer within sandbox \"4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 7 01:09:40.798508 containerd[1511]: time="2026-03-07T01:09:40.798459431Z" level=info msg="CreateContainer within sandbox \"4676abf7b4c631d2ba0e1fa8c0b42279e1d68479ca0be94f85e3ce7db2bbd9a1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"8a76604fc7b4c62072eaf6d5b56cca3df6ef4486fefc590c7c88e7f15da81ed4\""
Mar 7 01:09:40.798914 containerd[1511]: time="2026-03-07T01:09:40.798883489Z" level=info msg="StartContainer for \"8a76604fc7b4c62072eaf6d5b56cca3df6ef4486fefc590c7c88e7f15da81ed4\""
Mar 7 01:09:40.840319 systemd[1]: Started cri-containerd-8a76604fc7b4c62072eaf6d5b56cca3df6ef4486fefc590c7c88e7f15da81ed4.scope - libcontainer container 8a76604fc7b4c62072eaf6d5b56cca3df6ef4486fefc590c7c88e7f15da81ed4.
Mar 7 01:09:40.873288 containerd[1511]: time="2026-03-07T01:09:40.873253688Z" level=info msg="StartContainer for \"8a76604fc7b4c62072eaf6d5b56cca3df6ef4486fefc590c7c88e7f15da81ed4\" returns successfully"
Mar 7 01:09:41.710036 systemd[1]: run-containerd-runc-k8s.io-8a76604fc7b4c62072eaf6d5b56cca3df6ef4486fefc590c7c88e7f15da81ed4-runc.H2t4pN.mount: Deactivated successfully.
Mar 7 01:09:46.021270 kubelet[2572]: E0307 01:09:46.020982 2572 request.go:1196] "Unexpected error when reading response body" err="net/http: request canceled (Client.Timeout or context cancellation while reading body)"
Mar 7 01:09:46.021270 kubelet[2572]: E0307 01:09:46.021079 2572 controller.go:251] "Failed to update lease" err="unexpected error when reading response body. Please retry. Original error: net/http: request canceled (Client.Timeout or context cancellation while reading body)"
Mar 7 01:09:51.138524 kubelet[2572]: I0307 01:09:51.138464 2572 scope.go:122] "RemoveContainer" containerID="c1dec35042a3487d15144c6f6b54735f82d9e4896a6c16266c968b1f3be5c8fb"
Mar 7 01:09:51.141776 containerd[1511]: time="2026-03-07T01:09:51.141706876Z" level=info msg="CreateContainer within sandbox \"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}"
Mar 7 01:09:51.161599 containerd[1511]: time="2026-03-07T01:09:51.161532540Z" level=info msg="CreateContainer within sandbox \"0676fc6ff0ebbb61e89a03b796948feeff8ca5b9e4fdb374bf65556033487581\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"119df8b7ee4b7f804e4bac5943f07b5551bf8be946b97b2ca35b22a2691e7741\""
Mar 7 01:09:51.165783 containerd[1511]: time="2026-03-07T01:09:51.164772699Z" level=info msg="StartContainer for \"119df8b7ee4b7f804e4bac5943f07b5551bf8be946b97b2ca35b22a2691e7741\""
Mar 7 01:09:51.230316 systemd[1]: Started cri-containerd-119df8b7ee4b7f804e4bac5943f07b5551bf8be946b97b2ca35b22a2691e7741.scope - libcontainer container 119df8b7ee4b7f804e4bac5943f07b5551bf8be946b97b2ca35b22a2691e7741.
Mar 7 01:09:51.265828 containerd[1511]: time="2026-03-07T01:09:51.265791518Z" level=info msg="StartContainer for \"119df8b7ee4b7f804e4bac5943f07b5551bf8be946b97b2ca35b22a2691e7741\" returns successfully"
Mar 7 01:09:56.021744 kubelet[2572]: E0307 01:09:56.021404 2572 controller.go:251] "Failed to update lease" err="Put \"https://204.168.152.184:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-593f1c83d2?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"