Mar 6 01:41:27.182943 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 5 23:31:42 -00 2026
Mar 6 01:41:27.182974 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=a6bcd99e714cc2f1b95dc0d61d9d762252de26a434f12074c16f59200c97ba9c
Mar 6 01:41:27.182986 kernel: BIOS-provided physical RAM map:
Mar 6 01:41:27.182993 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 6 01:41:27.182998 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 6 01:41:27.183004 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 6 01:41:27.183011 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Mar 6 01:41:27.183016 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Mar 6 01:41:27.183022 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 6 01:41:27.183031 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 6 01:41:27.183037 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 6 01:41:27.183043 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 6 01:41:27.183082 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 6 01:41:27.183088 kernel: NX (Execute Disable) protection: active
Mar 6 01:41:27.183095 kernel: APIC: Static calls initialized
Mar 6 01:41:27.183172 kernel: SMBIOS 2.8 present.
Mar 6 01:41:27.183180 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Mar 6 01:41:27.183186 kernel: Hypervisor detected: KVM
Mar 6 01:41:27.183193 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 6 01:41:27.183199 kernel: kvm-clock: using sched offset of 7087957257 cycles
Mar 6 01:41:27.183205 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 6 01:41:27.183212 kernel: tsc: Detected 2445.424 MHz processor
Mar 6 01:41:27.183218 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 6 01:41:27.183224 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 6 01:41:27.183235 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Mar 6 01:41:27.183241 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 6 01:41:27.183248 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 6 01:41:27.183254 kernel: Using GB pages for direct mapping
Mar 6 01:41:27.183260 kernel: ACPI: Early table checksum verification disabled
Mar 6 01:41:27.183266 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Mar 6 01:41:27.183273 kernel: ACPI: RSDT 0x000000009CFE241A 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183279 kernel: ACPI: FACP 0x000000009CFE21FA 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183285 kernel: ACPI: DSDT 0x000000009CFE0040 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183295 kernel: ACPI: FACS 0x000000009CFE0000 000040
Mar 6 01:41:27.183301 kernel: ACPI: APIC 0x000000009CFE22EE 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183307 kernel: ACPI: HPET 0x000000009CFE237E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183313 kernel: ACPI: MCFG 0x000000009CFE23B6 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183319 kernel: ACPI: WAET 0x000000009CFE23F2 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 6 01:41:27.183326 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21fa-0x9cfe22ed]
Mar 6 01:41:27.183332 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21f9]
Mar 6 01:41:27.183343 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Mar 6 01:41:27.183352 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22ee-0x9cfe237d]
Mar 6 01:41:27.183359 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe237e-0x9cfe23b5]
Mar 6 01:41:27.183365 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23b6-0x9cfe23f1]
Mar 6 01:41:27.183372 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23f2-0x9cfe2419]
Mar 6 01:41:27.183378 kernel: No NUMA configuration found
Mar 6 01:41:27.183384 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Mar 6 01:41:27.183391 kernel: NODE_DATA(0) allocated [mem 0x9cfd6000-0x9cfdbfff]
Mar 6 01:41:27.183400 kernel: Zone ranges:
Mar 6 01:41:27.183407 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 6 01:41:27.183413 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Mar 6 01:41:27.183420 kernel: Normal empty
Mar 6 01:41:27.183427 kernel: Movable zone start for each node
Mar 6 01:41:27.183433 kernel: Early memory node ranges
Mar 6 01:41:27.183439 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 6 01:41:27.183446 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Mar 6 01:41:27.183452 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Mar 6 01:41:27.183462 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 6 01:41:27.183491 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 6 01:41:27.183498 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Mar 6 01:41:27.183504 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 6 01:41:27.183511 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 6 01:41:27.183518 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 6 01:41:27.183524 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 6 01:41:27.183530 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 6 01:41:27.183537 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 6 01:41:27.183547 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 6 01:41:27.183554 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 6 01:41:27.183560 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 6 01:41:27.183566 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 6 01:41:27.183573 kernel: TSC deadline timer available
Mar 6 01:41:27.183579 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs
Mar 6 01:41:27.183586 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 6 01:41:27.183592 kernel: kvm-guest: KVM setup pv remote TLB flush
Mar 6 01:41:27.183618 kernel: kvm-guest: setup PV sched yield
Mar 6 01:41:27.183628 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 6 01:41:27.183635 kernel: Booting paravirtualized kernel on KVM
Mar 6 01:41:27.183641 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 6 01:41:27.183648 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Mar 6 01:41:27.183654 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u524288
Mar 6 01:41:27.183661 kernel: pcpu-alloc: s196328 r8192 d28952 u524288 alloc=1*2097152
Mar 6 01:41:27.183667 kernel: pcpu-alloc: [0] 0 1 2 3
Mar 6 01:41:27.183674 kernel: kvm-guest: PV spinlocks enabled
Mar 6 01:41:27.183680 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 6 01:41:27.183690 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=a6bcd99e714cc2f1b95dc0d61d9d762252de26a434f12074c16f59200c97ba9c
Mar 6 01:41:27.183697 kernel: random: crng init done
Mar 6 01:41:27.183704 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 6 01:41:27.183710 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 6 01:41:27.183716 kernel: Fallback order for Node 0: 0
Mar 6 01:41:27.183762 kernel: Built 1 zonelists, mobility grouping on. Total pages: 632732
Mar 6 01:41:27.183773 kernel: Policy zone: DMA32
Mar 6 01:41:27.183781 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 6 01:41:27.183792 kernel: Memory: 2434608K/2571752K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 136884K reserved, 0K cma-reserved)
Mar 6 01:41:27.183798 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 6 01:41:27.183805 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 6 01:41:27.183811 kernel: ftrace: allocated 149 pages with 4 groups
Mar 6 01:41:27.183818 kernel: Dynamic Preempt: voluntary
Mar 6 01:41:27.183824 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 6 01:41:27.183831 kernel: rcu: RCU event tracing is enabled.
Mar 6 01:41:27.183838 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 6 01:41:27.183845 kernel: Trampoline variant of Tasks RCU enabled.
Mar 6 01:41:27.183855 kernel: Rude variant of Tasks RCU enabled.
Mar 6 01:41:27.183862 kernel: Tracing variant of Tasks RCU enabled.
Mar 6 01:41:27.183872 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 6 01:41:27.183884 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 6 01:41:27.183928 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Mar 6 01:41:27.183943 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 6 01:41:27.183954 kernel: Console: colour VGA+ 80x25
Mar 6 01:41:27.183966 kernel: printk: console [ttyS0] enabled
Mar 6 01:41:27.183977 kernel: ACPI: Core revision 20230628
Mar 6 01:41:27.183989 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 6 01:41:27.184003 kernel: APIC: Switch to symmetric I/O mode setup
Mar 6 01:41:27.184010 kernel: x2apic enabled
Mar 6 01:41:27.184016 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 6 01:41:27.184023 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Mar 6 01:41:27.184030 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Mar 6 01:41:27.184036 kernel: kvm-guest: setup PV IPIs
Mar 6 01:41:27.184043 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 6 01:41:27.184061 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 6 01:41:27.184068 kernel: Calibrating delay loop (skipped) preset value.. 4890.84 BogoMIPS (lpj=2445424)
Mar 6 01:41:27.184075 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 6 01:41:27.184082 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 6 01:41:27.184091 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 6 01:41:27.184098 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 6 01:41:27.184105 kernel: Spectre V2 : Mitigation: Retpolines
Mar 6 01:41:27.184112 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 6 01:41:27.184118 kernel: Speculative Store Bypass: Vulnerable
Mar 6 01:41:27.184188 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Mar 6 01:41:27.184218 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Mar 6 01:41:27.184225 kernel: active return thunk: srso_alias_return_thunk
Mar 6 01:41:27.184232 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Mar 6 01:41:27.184239 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 6 01:41:27.184246 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 6 01:41:27.184252 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 6 01:41:27.184259 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 6 01:41:27.184266 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 6 01:41:27.184277 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 6 01:41:27.184284 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 6 01:41:27.184290 kernel: Freeing SMP alternatives memory: 32K
Mar 6 01:41:27.184297 kernel: pid_max: default: 32768 minimum: 301
Mar 6 01:41:27.184304 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 6 01:41:27.184311 kernel: landlock: Up and running.
Mar 6 01:41:27.184317 kernel: SELinux: Initializing.
Mar 6 01:41:27.184324 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 6 01:41:27.184331 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 6 01:41:27.184341 kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Mar 6 01:41:27.184347 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 6 01:41:27.184354 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 6 01:41:27.184361 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 6 01:41:27.184368 kernel: Performance Events: PMU not available due to virtualization, using software events only.
Mar 6 01:41:27.184375 kernel: signal: max sigframe size: 1776
Mar 6 01:41:27.184402 kernel: rcu: Hierarchical SRCU implementation.
Mar 6 01:41:27.184409 kernel: rcu: Max phase no-delay instances is 400.
Mar 6 01:41:27.184419 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 6 01:41:27.184426 kernel: smp: Bringing up secondary CPUs ...
Mar 6 01:41:27.184432 kernel: smpboot: x86: Booting SMP configuration:
Mar 6 01:41:27.184439 kernel: .... node #0, CPUs: #1 #2 #3
Mar 6 01:41:27.184446 kernel: smp: Brought up 1 node, 4 CPUs
Mar 6 01:41:27.184453 kernel: smpboot: Max logical packages: 1
Mar 6 01:41:27.184459 kernel: smpboot: Total of 4 processors activated (19563.39 BogoMIPS)
Mar 6 01:41:27.184466 kernel: devtmpfs: initialized
Mar 6 01:41:27.184473 kernel: x86/mm: Memory block size: 128MB
Mar 6 01:41:27.184480 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 6 01:41:27.184490 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 6 01:41:27.184496 kernel: pinctrl core: initialized pinctrl subsystem
Mar 6 01:41:27.184503 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 6 01:41:27.184510 kernel: audit: initializing netlink subsys (disabled)
Mar 6 01:41:27.184517 kernel: audit: type=2000 audit(1772761284.717:1): state=initialized audit_enabled=0 res=1
Mar 6 01:41:27.184523 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 6 01:41:27.184530 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 6 01:41:27.184537 kernel: cpuidle: using governor menu
Mar 6 01:41:27.184546 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 6 01:41:27.184553 kernel: dca service started, version 1.12.1
Mar 6 01:41:27.184560 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 6 01:41:27.184567 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 6 01:41:27.184574 kernel: PCI: Using configuration type 1 for base access
Mar 6 01:41:27.184580 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 6 01:41:27.184587 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 6 01:41:27.184594 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 6 01:41:27.184601 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 6 01:41:27.184611 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 6 01:41:27.184617 kernel: ACPI: Added _OSI(Module Device)
Mar 6 01:41:27.184624 kernel: ACPI: Added _OSI(Processor Device)
Mar 6 01:41:27.184631 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 6 01:41:27.184637 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 6 01:41:27.184644 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 6 01:41:27.184651 kernel: ACPI: Interpreter enabled
Mar 6 01:41:27.184658 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 6 01:41:27.184686 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 6 01:41:27.184696 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 6 01:41:27.184703 kernel: PCI: Using E820 reservations for host bridge windows
Mar 6 01:41:27.184709 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 6 01:41:27.184716 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 6 01:41:27.185196 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 6 01:41:27.185370 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 6 01:41:27.185521 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 6 01:41:27.185531 kernel: PCI host bridge to bus 0000:00
Mar 6 01:41:27.185798 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 6 01:41:27.185980 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 6 01:41:27.186123 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 6 01:41:27.186331 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Mar 6 01:41:27.186466 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 6 01:41:27.186600 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Mar 6 01:41:27.186795 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 6 01:41:27.187174 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 6 01:41:27.187390 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000
Mar 6 01:41:27.187573 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfd000000-0xfdffffff pref]
Mar 6 01:41:27.187834 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfebd0000-0xfebd0fff]
Mar 6 01:41:27.188096 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfebc0000-0xfebcffff pref]
Mar 6 01:41:27.188312 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 6 01:41:27.188581 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00
Mar 6 01:41:27.188785 kernel: pci 0000:00:02.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 6 01:41:27.188977 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xfebd1000-0xfebd1fff]
Mar 6 01:41:27.189193 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfe000000-0xfe003fff 64bit pref]
Mar 6 01:41:27.189481 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000
Mar 6 01:41:27.189633 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc000-0xc07f]
Mar 6 01:41:27.189855 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebd2000-0xfebd2fff]
Mar 6 01:41:27.190055 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe004000-0xfe007fff 64bit pref]
Mar 6 01:41:27.190333 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
Mar 6 01:41:27.190487 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc0e0-0xc0ff]
Mar 6 01:41:27.190633 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebd3000-0xfebd3fff]
Mar 6 01:41:27.190841 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe008000-0xfe00bfff 64bit pref]
Mar 6 01:41:27.191029 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfeb80000-0xfebbffff pref]
Mar 6 01:41:27.191337 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 6 01:41:27.191499 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 6 01:41:27.191977 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 6 01:41:27.192239 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc100-0xc11f]
Mar 6 01:41:27.192420 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfebd4000-0xfebd4fff]
Mar 6 01:41:27.192673 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 6 01:41:27.192884 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 6 01:41:27.192911 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 6 01:41:27.192925 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 6 01:41:27.192937 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 6 01:41:27.192950 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 6 01:41:27.192963 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 6 01:41:27.192970 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 6 01:41:27.192977 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 6 01:41:27.193012 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 6 01:41:27.193020 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 6 01:41:27.193032 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 6 01:41:27.193038 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 6 01:41:27.193045 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 6 01:41:27.193052 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 6 01:41:27.193059 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 6 01:41:27.193065 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 6 01:41:27.193072 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 6 01:41:27.193079 kernel: iommu: Default domain type: Translated
Mar 6 01:41:27.193086 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 6 01:41:27.193095 kernel: PCI: Using ACPI for IRQ routing
Mar 6 01:41:27.193102 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 6 01:41:27.193109 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 6 01:41:27.193116 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Mar 6 01:41:27.193325 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 6 01:41:27.193474 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 6 01:41:27.193620 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 6 01:41:27.193629 kernel: vgaarb: loaded
Mar 6 01:41:27.193641 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 6 01:41:27.193648 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 6 01:41:27.193655 kernel: clocksource: Switched to clocksource kvm-clock
Mar 6 01:41:27.193662 kernel: VFS: Disk quotas dquot_6.6.0
Mar 6 01:41:27.193669 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 6 01:41:27.193676 kernel: pnp: PnP ACPI init
Mar 6 01:41:27.194087 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 6 01:41:27.194101 kernel: pnp: PnP ACPI: found 6 devices
Mar 6 01:41:27.194109 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 6 01:41:27.194122 kernel: NET: Registered PF_INET protocol family
Mar 6 01:41:27.194180 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 6 01:41:27.194188 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 6 01:41:27.194195 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 6 01:41:27.194202 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 6 01:41:27.194209 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 6 01:41:27.194216 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 6 01:41:27.194222 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 6 01:41:27.194233 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 6 01:41:27.194240 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 6 01:41:27.194247 kernel: NET: Registered PF_XDP protocol family
Mar 6 01:41:27.194395 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 6 01:41:27.194531 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 6 01:41:27.194666 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 6 01:41:27.194855 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Mar 6 01:41:27.195031 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 6 01:41:27.195237 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Mar 6 01:41:27.195254 kernel: PCI: CLS 0 bytes, default 64
Mar 6 01:41:27.195262 kernel: Initialise system trusted keyrings
Mar 6 01:41:27.195269 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 6 01:41:27.195276 kernel: Key type asymmetric registered
Mar 6 01:41:27.195283 kernel: Asymmetric key parser 'x509' registered
Mar 6 01:41:27.195289 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 6 01:41:27.195296 kernel: io scheduler mq-deadline registered
Mar 6 01:41:27.195303 kernel: io scheduler kyber registered
Mar 6 01:41:27.195310 kernel: io scheduler bfq registered
Mar 6 01:41:27.195320 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 6 01:41:27.195327 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 6 01:41:27.195334 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Mar 6 01:41:27.195341 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Mar 6 01:41:27.195348 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 6 01:41:27.195354 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 6 01:41:27.195361 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 6 01:41:27.195368 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 6 01:41:27.195400 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 6 01:41:27.195822 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 6 01:41:27.195837 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 6 01:41:27.196017 kernel: rtc_cmos 00:04: registered as rtc0
Mar 6 01:41:27.196229 kernel: rtc_cmos 00:04: setting system clock to 2026-03-06T01:41:26 UTC (1772761286)
Mar 6 01:41:27.196372 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Mar 6 01:41:27.196382 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 6 01:41:27.196389 kernel: NET: Registered PF_INET6 protocol family
Mar 6 01:41:27.196396 kernel: Segment Routing with IPv6
Mar 6 01:41:27.196409 kernel: In-situ OAM (IOAM) with IPv6
Mar 6 01:41:27.196415 kernel: NET: Registered PF_PACKET protocol family
Mar 6 01:41:27.196422 kernel: Key type dns_resolver registered
Mar 6 01:41:27.196429 kernel: IPI shorthand broadcast: enabled
Mar 6 01:41:27.196436 kernel: sched_clock: Marking stable (2050020688, 456938693)->(2857301563, -350342182)
Mar 6 01:41:27.196443 kernel: registered taskstats version 1
Mar 6 01:41:27.196450 kernel: Loading compiled-in X.509 certificates
Mar 6 01:41:27.196457 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 6d88f6264570591a57b3c9c1e1c99fca6c68b8ca'
Mar 6 01:41:27.196463 kernel: Key type .fscrypt registered
Mar 6 01:41:27.196473 kernel: Key type fscrypt-provisioning registered
Mar 6 01:41:27.196480 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 6 01:41:27.196487 kernel: ima: Allocated hash algorithm: sha1
Mar 6 01:41:27.196494 kernel: ima: No architecture policies found
Mar 6 01:41:27.196500 kernel: clk: Disabling unused clocks
Mar 6 01:41:27.196507 kernel: Freeing unused kernel image (initmem) memory: 42892K
Mar 6 01:41:27.196514 kernel: Write protecting the kernel read-only data: 36864k
Mar 6 01:41:27.196521 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K
Mar 6 01:41:27.196531 kernel: Run /init as init process
Mar 6 01:41:27.196538 kernel: with arguments:
Mar 6 01:41:27.196544 kernel: /init
Mar 6 01:41:27.196551 kernel: with environment:
Mar 6 01:41:27.196558 kernel: HOME=/
Mar 6 01:41:27.196564 kernel: TERM=linux
Mar 6 01:41:27.196573 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 6 01:41:27.196581 systemd[1]: Detected virtualization kvm.
Mar 6 01:41:27.196592 systemd[1]: Detected architecture x86-64.
Mar 6 01:41:27.196599 systemd[1]: Running in initrd.
Mar 6 01:41:27.196605 systemd[1]: No hostname configured, using default hostname.
Mar 6 01:41:27.196612 systemd[1]: Hostname set to .
Mar 6 01:41:27.196620 systemd[1]: Initializing machine ID from VM UUID.
Mar 6 01:41:27.196627 systemd[1]: Queued start job for default target initrd.target.
Mar 6 01:41:27.196634 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 01:41:27.196641 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 01:41:27.196652 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 6 01:41:27.196659 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 01:41:27.196666 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 6 01:41:27.196674 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 6 01:41:27.196682 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 6 01:41:27.196690 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 6 01:41:27.196697 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 01:41:27.196707 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 01:41:27.196714 systemd[1]: Reached target paths.target - Path Units.
Mar 6 01:41:27.196766 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 01:41:27.196778 systemd[1]: Reached target swap.target - Swaps.
Mar 6 01:41:27.196801 systemd[1]: Reached target timers.target - Timer Units.
Mar 6 01:41:27.196812 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 01:41:27.196822 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 01:41:27.196830 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 6 01:41:27.196837 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 6 01:41:27.196845 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 01:41:27.196852 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 01:41:27.196860 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 01:41:27.196871 systemd[1]: Reached target sockets.target - Socket Units.
Mar 6 01:41:27.196884 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 6 01:41:27.196898 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 6 01:41:27.196917 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 6 01:41:27.196930 systemd[1]: Starting systemd-fsck-usr.service... Mar 6 01:41:27.196943 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 6 01:41:27.196957 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 6 01:41:27.196964 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 01:41:27.196997 systemd-journald[195]: Collecting audit messages is disabled. Mar 6 01:41:27.197018 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 6 01:41:27.197026 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 6 01:41:27.197034 systemd-journald[195]: Journal started Mar 6 01:41:27.197052 systemd-journald[195]: Runtime Journal (/run/log/journal/657639812852424480c50bacafc6a5bf) is 6.0M, max 48.4M, 42.3M free. Mar 6 01:41:27.204233 systemd[1]: Started systemd-journald.service - Journal Service. Mar 6 01:41:27.208694 systemd[1]: Finished systemd-fsck-usr.service. Mar 6 01:41:27.208871 systemd-modules-load[196]: Inserted module 'overlay' Mar 6 01:41:27.217393 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 6 01:41:27.220238 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 6 01:41:27.232640 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 6 01:41:27.234189 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 6 01:41:27.264446 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 6 01:41:27.269224 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. 
Update your scripts to load br_netfilter if you need this. Mar 6 01:41:27.269025 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 6 01:41:27.287508 systemd-modules-load[196]: Inserted module 'br_netfilter' Mar 6 01:41:27.492407 kernel: Bridge firewalling registered Mar 6 01:41:27.288989 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 6 01:41:27.513504 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 6 01:41:27.514108 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 01:41:27.525222 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 6 01:41:27.557348 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 6 01:41:27.565577 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 6 01:41:27.576548 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 6 01:41:27.603380 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 6 01:41:27.617997 systemd-resolved[228]: Positive Trust Anchors: Mar 6 01:41:27.618042 systemd-resolved[228]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 6 01:41:27.618070 systemd-resolved[228]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 6 01:41:27.621088 systemd-resolved[228]: Defaulting to hostname 'linux'. 
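The kernel warning above ("filtering via arp/ip/ip6tables is no longer available by default... load br_netfilter") resolves itself moments later when systemd-modules-load inserts `br_netfilter` and the kernel reports "Bridge firewalling registered". On systems where nothing loads the module, the standard mechanism the message is asking for is a modules-load.d drop-in, e.g.:

```
# /etc/modules-load.d/br_netfilter.conf
# Read by systemd-modules-load at boot; one module name per line.
br_netfilter
```

With that in place, bridged traffic is again visible to iptables/ip6tables/arptables rules (controlled by the `net.bridge.bridge-nf-call-*` sysctls).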
Mar 6 01:41:27.623242 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 6 01:41:27.625937 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 6 01:41:27.688550 dracut-cmdline[232]: dracut-dracut-053
Mar 6 01:41:27.688550 dracut-cmdline[232]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=a6bcd99e714cc2f1b95dc0d61d9d762252de26a434f12074c16f59200c97ba9c
Mar 6 01:41:27.913260 kernel: SCSI subsystem initialized
Mar 6 01:41:27.927245 kernel: Loading iSCSI transport class v2.0-870.
Mar 6 01:41:27.942211 kernel: iscsi: registered transport (tcp)
Mar 6 01:41:27.967622 kernel: iscsi: registered transport (qla4xxx)
Mar 6 01:41:27.967718 kernel: QLogic iSCSI HBA Driver
Mar 6 01:41:28.031920 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 6 01:41:28.043484 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 6 01:41:28.077180 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 6 01:41:28.077232 kernel: device-mapper: uevent: version 1.0.3
Mar 6 01:41:28.080658 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 6 01:41:28.129266 kernel: raid6: avx2x4 gen() 28568 MB/s
Mar 6 01:41:28.147251 kernel: raid6: avx2x2 gen() 26352 MB/s
Mar 6 01:41:28.167028 kernel: raid6: avx2x1 gen() 22032 MB/s
Mar 6 01:41:28.167110 kernel: raid6: using algorithm avx2x4 gen() 28568 MB/s
Mar 6 01:41:28.187275 kernel: raid6: .... xor() 4204 MB/s, rmw enabled
Mar 6 01:41:28.187389 kernel: raid6: using avx2x2 recovery algorithm
Mar 6 01:41:28.211264 kernel: xor: automatically using best checksumming function avx
Mar 6 01:41:28.390238 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 6 01:41:28.406486 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 01:41:28.420455 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 01:41:28.437551 systemd-udevd[414]: Using default interface naming scheme 'v255'.
Mar 6 01:41:28.443544 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 01:41:28.461068 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 6 01:41:28.481636 dracut-pre-trigger[417]: rd.md=0: removing MD RAID activation
Mar 6 01:41:28.529265 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 6 01:41:28.547442 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 01:41:28.664899 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 01:41:28.689333 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 6 01:41:28.707692 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Mar 6 01:41:28.708009 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 6 01:41:28.729838 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Mar 6 01:41:28.715645 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 6 01:41:28.743892 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 6 01:41:28.743926 kernel: GPT:9289727 != 19775487
Mar 6 01:41:28.743946 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 6 01:41:28.743960 kernel: GPT:9289727 != 19775487
Mar 6 01:41:28.743975 kernel: GPT: Use GNU Parted to correct GPT errors.
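The "GPT:9289727 != 19775487" complaint above is pure arithmetic: GPT keeps its backup header in the very last LBA of the disk, and the virtio disk has 19775488 sectors, but the disk image's GPT was written for a smaller disk whose last LBA was 9289727. A minimal sketch using only the numbers from the log lines above:

```python
def gpt_backup_header_lba(total_sectors: int) -> int:
    # GPT places the primary header at LBA 1 and the backup header
    # in the last LBA of the disk (total_sectors - 1).
    return total_sectors - 1

disk_sectors = 19775488     # "[vda] 19775488 512-byte logical blocks"
image_backup_lba = 9289727  # where the image's backup header actually sits

expected = gpt_backup_header_lba(disk_sectors)
assert expected == 19775487
assert image_backup_lba != expected  # hence "GPT:9289727 != 19775487"
```

The mismatch is harmless on first boot of a resized image: tools such as GNU Parted or `sgdisk -e` move the backup structures to the end of the disk, and the `disk-uuid` step later in this log rewrites the headers ("Primary Header is updated... Secondary Header is updated").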
Mar 6 01:41:28.735304 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 6 01:41:28.757566 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 6 01:41:28.757082 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 6 01:41:28.777309 kernel: cryptd: max_cpu_qlen set to 1000 Mar 6 01:41:28.782195 kernel: libata version 3.00 loaded. Mar 6 01:41:28.783459 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 6 01:41:28.799197 kernel: ahci 0000:00:1f.2: version 3.0 Mar 6 01:41:28.799440 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 6 01:41:28.808351 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 6 01:41:28.823912 kernel: AVX2 version of gcm_enc/dec engaged. Mar 6 01:41:28.823947 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 6 01:41:28.824262 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 6 01:41:28.824470 kernel: BTRFS: device fsid eccec0b1-0068-4620-ab61-f332f16460fa devid 1 transid 35 /dev/vda3 scanned by (udev-worker) (475) Mar 6 01:41:28.824488 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (461) Mar 6 01:41:28.808497 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 6 01:41:28.833219 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 6 01:41:28.845096 kernel: scsi host0: ahci Mar 6 01:41:28.862205 kernel: scsi host1: ahci Mar 6 01:41:28.862498 kernel: scsi host2: ahci Mar 6 01:41:28.862717 kernel: scsi host3: ahci Mar 6 01:41:28.862938 kernel: scsi host4: ahci Mar 6 01:41:28.863112 kernel: scsi host5: ahci Mar 6 01:41:28.863352 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 31 Mar 6 01:41:28.863365 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 31 Mar 6 01:41:28.863376 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 31 Mar 6 01:41:28.861625 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 6 01:41:28.883366 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 31 Mar 6 01:41:28.883402 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 31 Mar 6 01:41:28.883450 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 31 Mar 6 01:41:28.861872 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 01:41:28.879528 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 01:41:28.900232 kernel: AES CTR mode by8 optimization enabled Mar 6 01:41:28.905684 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 01:41:28.913435 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 6 01:41:28.928820 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 6 01:41:28.957580 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 6 01:41:29.124855 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 6 01:41:29.135924 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. 
Mar 6 01:41:29.146870 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 6 01:41:29.169692 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 6 01:41:29.178633 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 01:41:29.214010 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Mar 6 01:41:29.214068 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 6 01:41:29.214089 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 6 01:41:29.214107 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 6 01:41:29.214122 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Mar 6 01:41:29.214239 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 6 01:41:29.214261 kernel: ata3.00: applying bridge limits Mar 6 01:41:29.214282 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 6 01:41:29.214301 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 6 01:41:29.214331 disk-uuid[553]: Primary Header is updated. Mar 6 01:41:29.214331 disk-uuid[553]: Secondary Entries is updated. Mar 6 01:41:29.214331 disk-uuid[553]: Secondary Header is updated. Mar 6 01:41:29.245939 kernel: ata3.00: configured for UDMA/100 Mar 6 01:41:29.245962 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 6 01:41:29.246001 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 6 01:41:29.246011 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 6 01:41:29.216459 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 6 01:41:29.258700 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 6 01:41:29.305866 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Mar 6 01:41:29.306327 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 6 01:41:29.323263 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Mar 6 01:41:30.243211 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 6 01:41:30.243992 disk-uuid[554]: The operation has completed successfully. Mar 6 01:41:30.283958 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 6 01:41:30.284357 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 6 01:41:30.306492 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 6 01:41:30.317884 sh[594]: Success Mar 6 01:41:30.339193 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" Mar 6 01:41:30.391083 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 6 01:41:30.412029 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 6 01:41:30.415831 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 6 01:41:30.456536 kernel: BTRFS info (device dm-0): first mount of filesystem eccec0b1-0068-4620-ab61-f332f16460fa Mar 6 01:41:30.456589 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 6 01:41:30.456615 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 6 01:41:30.464641 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 6 01:41:30.464678 kernel: BTRFS info (device dm-0): using free space tree Mar 6 01:41:30.477855 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 6 01:41:30.478676 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 6 01:41:30.508433 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
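verity-setup.service above assembles `/dev/mapper/usr`: dm-verity serves the read-only `/usr` partition and checks every block it reads against a precomputed hash tree whose root is pinned by the `verity.usrhash=` kernel argument ("verity: sha256 using implementation sha256-ni" is the kernel picking the SHA-NI accelerated digest). The real check is a salted Merkle tree inside the kernel; the toy sketch below illustrates only the leaf-level comparison, with the empty-block digest used purely so the example is self-contained:

```python
import hashlib

def verify_block(block: bytes, expected_hex: str) -> bool:
    # dm-verity fails any read whose block digest does not match the hash
    # tree; this sketch compares a single block against its expected sha256.
    return hashlib.sha256(block).hexdigest() == expected_hex

# sha256 of an empty input, used here only as a demonstration value:
EMPTY = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
assert verify_block(b"", EMPTY)
assert not verify_block(b"tampered", EMPTY)
```

Because the root hash arrives on the (measured) kernel command line, any offline modification of the `/usr` image changes some leaf digest and is rejected at read time.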
Mar 6 01:41:30.518585 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 6 01:41:30.537701 kernel: BTRFS info (device vda6): first mount of filesystem dcd455b6-671f-4d9f-a5ce-de07977c88a5 Mar 6 01:41:30.537775 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 6 01:41:30.537788 kernel: BTRFS info (device vda6): using free space tree Mar 6 01:41:30.546185 kernel: BTRFS info (device vda6): auto enabling async discard Mar 6 01:41:30.561335 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 6 01:41:30.568868 kernel: BTRFS info (device vda6): last unmount of filesystem dcd455b6-671f-4d9f-a5ce-de07977c88a5 Mar 6 01:41:30.576030 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 6 01:41:30.588376 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 6 01:41:30.671613 ignition[678]: Ignition 2.19.0 Mar 6 01:41:30.672242 ignition[678]: Stage: fetch-offline Mar 6 01:41:30.672310 ignition[678]: no configs at "/usr/lib/ignition/base.d" Mar 6 01:41:30.672330 ignition[678]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 6 01:41:30.672513 ignition[678]: parsed url from cmdline: "" Mar 6 01:41:30.672521 ignition[678]: no config URL provided Mar 6 01:41:30.672532 ignition[678]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 01:41:30.672548 ignition[678]: no config at "/usr/lib/ignition/user.ign" Mar 6 01:41:30.672591 ignition[678]: op(1): [started] loading QEMU firmware config module Mar 6 01:41:30.672601 ignition[678]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 6 01:41:30.683000 ignition[678]: op(1): [finished] loading QEMU firmware config module Mar 6 01:41:30.751831 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 6 01:41:30.769452 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
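The fetch-offline stage above shows Ignition's config search order on QEMU: no URL on the command line, no `/usr/lib/ignition/user.ign`, then the `qemu_fw_cfg` module to pull the config from the hypervisor. The later "adding ssh keys to user core" lines imply that config carried a `passwd` section. A minimal Ignition (spec 3.x) config of that shape might look like the following; the spec version and the key material are placeholders for illustration, not values recovered from this log:

```json
{
  "ignition": { "version": "3.3.0" },
  "passwd": {
    "users": [
      {
        "name": "core",
        "sshAuthorizedKeys": ["ssh-ed25519 AAAA...placeholder user@example"]
      }
    ]
  }
}
```

With QEMU, such a config is typically passed via `-fw_cfg name=opt/com.coreos/config,file=config.ign`, which is exactly what the `qemu_fw_cfg` probe above reads.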
Mar 6 01:41:30.799033 systemd-networkd[783]: lo: Link UP Mar 6 01:41:30.799072 systemd-networkd[783]: lo: Gained carrier Mar 6 01:41:30.801456 systemd-networkd[783]: Enumeration completed Mar 6 01:41:30.803937 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 01:41:30.803942 systemd-networkd[783]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 6 01:41:30.804313 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 6 01:41:30.808249 systemd-networkd[783]: eth0: Link UP Mar 6 01:41:30.808256 systemd-networkd[783]: eth0: Gained carrier Mar 6 01:41:30.808267 systemd-networkd[783]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 01:41:30.809030 systemd[1]: Reached target network.target - Network. Mar 6 01:41:30.873229 systemd-networkd[783]: eth0: DHCPv4 address 10.0.0.120/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 6 01:41:30.973027 ignition[678]: parsing config with SHA512: bddf0546266bba3bec3cffb1f7f0e98762a2a0fc6ba246014f4d48244a26603c0aa1791406fcbad33653cf5718abacd9bfe9dbae546f899b364207e9c9c02dc0 Mar 6 01:41:30.977617 unknown[678]: fetched base config from "system" Mar 6 01:41:30.978061 unknown[678]: fetched user config from "qemu" Mar 6 01:41:30.978549 ignition[678]: fetch-offline: fetch-offline passed Mar 6 01:41:30.982812 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 6 01:41:30.978648 ignition[678]: Ignition finished successfully Mar 6 01:41:30.991301 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 6 01:41:31.017428 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
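The lease line above ("DHCPv4 address 10.0.0.120/16, gateway 10.0.0.1 acquired from 10.0.0.1") pairs an address with a /16 prefix that places the gateway on-link. The stdlib `ipaddress` module makes the relationship easy to check:

```python
import ipaddress

# Values taken from the systemd-networkd lease line in the log.
iface = ipaddress.ip_interface("10.0.0.120/16")
gateway = ipaddress.ip_address("10.0.0.1")

# The /16 prefix yields the 10.0.0.0/16 network, and the gateway
# falls inside it, so the default route needs no extra on-link hint.
assert str(iface.network) == "10.0.0.0/16"
assert gateway in iface.network
```

Had the server handed out a narrower prefix that excluded the gateway, networkd would have needed an on-link route before installing the default route via 10.0.0.1.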
Mar 6 01:41:31.037578 ignition[787]: Ignition 2.19.0 Mar 6 01:41:31.037609 ignition[787]: Stage: kargs Mar 6 01:41:31.041227 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 6 01:41:31.037810 ignition[787]: no configs at "/usr/lib/ignition/base.d" Mar 6 01:41:31.037823 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 6 01:41:31.038643 ignition[787]: kargs: kargs passed Mar 6 01:41:31.060417 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 6 01:41:31.038688 ignition[787]: Ignition finished successfully Mar 6 01:41:31.075410 ignition[795]: Ignition 2.19.0 Mar 6 01:41:31.077988 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 6 01:41:31.075420 ignition[795]: Stage: disks Mar 6 01:41:31.083794 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 6 01:41:31.075624 ignition[795]: no configs at "/usr/lib/ignition/base.d" Mar 6 01:41:31.087534 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 6 01:41:31.075637 ignition[795]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 6 01:41:31.093405 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 6 01:41:31.076537 ignition[795]: disks: disks passed Mar 6 01:41:31.096619 systemd[1]: Reached target sysinit.target - System Initialization. Mar 6 01:41:31.076585 ignition[795]: Ignition finished successfully Mar 6 01:41:31.099938 systemd[1]: Reached target basic.target - Basic System. Mar 6 01:41:31.152939 systemd-fsck[805]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 6 01:41:31.122645 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 6 01:41:31.146929 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 6 01:41:31.162441 systemd[1]: Mounting sysroot.mount - /sysroot... 
Mar 6 01:41:31.280237 kernel: EXT4-fs (vda9): mounted filesystem 6fb83788-0471-4e89-b45f-3a7586a627a9 r/w with ordered data mode. Quota mode: none. Mar 6 01:41:31.281686 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 6 01:41:31.282649 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 6 01:41:31.322123 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (813) Mar 6 01:41:31.322202 kernel: BTRFS info (device vda6): first mount of filesystem dcd455b6-671f-4d9f-a5ce-de07977c88a5 Mar 6 01:41:31.322215 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 6 01:41:31.322226 kernel: BTRFS info (device vda6): using free space tree Mar 6 01:41:31.297302 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 6 01:41:31.333532 kernel: BTRFS info (device vda6): auto enabling async discard Mar 6 01:41:31.301857 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 6 01:41:31.322445 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 6 01:41:31.322504 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 6 01:41:31.322536 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 01:41:31.335009 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 6 01:41:31.340446 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 6 01:41:31.370383 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Mar 6 01:41:31.420465 initrd-setup-root[837]: cut: /sysroot/etc/passwd: No such file or directory Mar 6 01:41:31.426222 initrd-setup-root[844]: cut: /sysroot/etc/group: No such file or directory Mar 6 01:41:31.435085 initrd-setup-root[851]: cut: /sysroot/etc/shadow: No such file or directory Mar 6 01:41:31.441259 initrd-setup-root[858]: cut: /sysroot/etc/gshadow: No such file or directory Mar 6 01:41:31.575840 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 6 01:41:31.602525 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 6 01:41:31.611372 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 6 01:41:31.625217 kernel: BTRFS info (device vda6): last unmount of filesystem dcd455b6-671f-4d9f-a5ce-de07977c88a5 Mar 6 01:41:31.617810 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 6 01:41:31.653073 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 6 01:41:31.665466 ignition[925]: INFO : Ignition 2.19.0 Mar 6 01:41:31.665466 ignition[925]: INFO : Stage: mount Mar 6 01:41:31.670644 ignition[925]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 01:41:31.670644 ignition[925]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 6 01:41:31.670644 ignition[925]: INFO : mount: mount passed Mar 6 01:41:31.670644 ignition[925]: INFO : Ignition finished successfully Mar 6 01:41:31.669849 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 6 01:41:31.693309 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 6 01:41:31.702451 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Mar 6 01:41:31.726247 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (940)
Mar 6 01:41:31.726286 kernel: BTRFS info (device vda6): first mount of filesystem dcd455b6-671f-4d9f-a5ce-de07977c88a5
Mar 6 01:41:31.733555 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 6 01:41:31.733579 kernel: BTRFS info (device vda6): using free space tree
Mar 6 01:41:31.743234 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 6 01:41:31.746396 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 6 01:41:31.782985 ignition[957]: INFO : Ignition 2.19.0
Mar 6 01:41:31.782985 ignition[957]: INFO : Stage: files
Mar 6 01:41:31.789511 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 6 01:41:31.789511 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 6 01:41:31.789511 ignition[957]: DEBUG : files: compiled without relabeling support, skipping
Mar 6 01:41:31.789511 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 6 01:41:31.789511 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 6 01:41:31.812885 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 6 01:41:31.812885 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 6 01:41:31.812885 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 6 01:41:31.812885 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 6 01:41:31.812885 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 6 01:41:31.791978 unknown[957]: wrote ssh authorized keys file for user: core
Mar 6 01:41:31.881424 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 6 01:41:31.990555 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 6 01:41:32.004995 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1
Mar 6 01:41:32.020348 systemd-networkd[783]: eth0: Gained IPv6LL
Mar 6 01:41:32.294032 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 6 01:41:32.886782 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 6 01:41:32.886782 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 6 01:41:32.901395 ignition[957]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 6 01:41:32.950937 ignition[957]: INFO : files: files passed
Mar 6 01:41:32.950937 ignition[957]: INFO : Ignition finished successfully
Mar 6 01:41:32.938765 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 6 01:41:33.009515 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 6 01:41:33.017557 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 6 01:41:33.024636 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 6 01:41:33.024843 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
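The preset operations in the files stage above ("setting preset to enabled", "removing enablement symlink(s)") boil down to symlink management under the target root: enabling a unit creates a symlink in a target's `.wants` directory, disabling removes it. Ignition's actual implementation may differ in detail; this sketch shows the conventional systemd layout, using the `multi-user.target.wants` path as an assumption for illustration:

```python
import os
import tempfile

def set_preset(unit: str, enabled: bool, root: str) -> None:
    # Enablement symlinks live in <root>/etc/systemd/system/<target>.wants/
    # and point at the unit file shipped under /usr/lib/systemd/system.
    wants = os.path.join(root, "etc/systemd/system/multi-user.target.wants")
    os.makedirs(wants, exist_ok=True)
    link = os.path.join(wants, unit)
    if enabled:
        if not os.path.islink(link):
            os.symlink(os.path.join("/usr/lib/systemd/system", unit), link)
    elif os.path.islink(link):
        os.unlink(link)

# Demo against a throwaway staging root, mirroring the two units in the log:
staging = tempfile.mkdtemp(prefix="preset-demo-")
set_preset("prepare-helm.service", True, staging)     # "preset to enabled"
set_preset("coreos-metadata.service", False, staging) # "preset to disabled"
wants_dir = os.path.join(staging, "etc/systemd/system/multi-user.target.wants")
assert os.path.islink(os.path.join(wants_dir, "prepare-helm.service"))
assert not os.path.lexists(os.path.join(wants_dir, "coreos-metadata.service"))
```

On the real system, operating against `/sysroot` here is what makes `prepare-helm.service` start automatically once the machine pivots out of the initrd.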
Mar 6 01:41:33.040466 initrd-setup-root-after-ignition[985]: grep: /sysroot/oem/oem-release: No such file or directory Mar 6 01:41:33.045382 initrd-setup-root-after-ignition[988]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 01:41:33.045382 initrd-setup-root-after-ignition[988]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 6 01:41:33.040759 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 6 01:41:33.069046 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 01:41:33.050335 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 6 01:41:33.079593 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 6 01:41:33.117621 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 6 01:41:33.117941 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 6 01:41:33.127897 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 6 01:41:33.135428 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 6 01:41:33.142796 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 6 01:41:33.160534 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 6 01:41:33.177806 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 6 01:41:33.201440 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 6 01:41:33.217064 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 6 01:41:33.224900 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 6 01:41:33.232969 systemd[1]: Stopped target timers.target - Timer Units. 
Mar 6 01:41:33.239250 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 6 01:41:33.242481 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 6 01:41:33.250926 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 6 01:41:33.258022 systemd[1]: Stopped target basic.target - Basic System. Mar 6 01:41:33.264287 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 6 01:41:33.271775 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 01:41:33.279445 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 6 01:41:33.286951 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 6 01:41:33.293927 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 6 01:41:33.302314 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 6 01:41:33.309338 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 6 01:41:33.316250 systemd[1]: Stopped target swap.target - Swaps. Mar 6 01:41:33.323251 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 6 01:41:33.327391 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 6 01:41:33.334936 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 6 01:41:33.342340 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 6 01:41:33.350286 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 6 01:41:33.353422 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 6 01:41:33.362242 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 6 01:41:33.365650 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 6 01:41:33.372992 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. 
Mar 6 01:41:33.376547 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 6 01:41:33.384770 systemd[1]: Stopped target paths.target - Path Units.
Mar 6 01:41:33.391035 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 6 01:41:33.395256 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 01:41:33.404462 systemd[1]: Stopped target slices.target - Slice Units.
Mar 6 01:41:33.412461 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 6 01:41:33.420351 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 6 01:41:33.423428 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 01:41:33.430246 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 6 01:41:33.433252 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 01:41:33.440335 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 6 01:41:33.444287 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 6 01:41:33.453222 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 6 01:41:33.456411 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 6 01:41:33.476382 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 6 01:41:33.483559 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 6 01:41:33.489684 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 6 01:41:33.490493 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 01:41:33.504908 ignition[1013]: INFO : Ignition 2.19.0
Mar 6 01:41:33.504908 ignition[1013]: INFO : Stage: umount
Mar 6 01:41:33.504908 ignition[1013]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 6 01:41:33.504908 ignition[1013]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 6 01:41:33.504908 ignition[1013]: INFO : umount: umount passed
Mar 6 01:41:33.504908 ignition[1013]: INFO : Ignition finished successfully
Mar 6 01:41:33.497580 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 6 01:41:33.497881 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 6 01:41:33.509719 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 6 01:41:33.509991 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 6 01:41:33.516293 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 6 01:41:33.518090 systemd[1]: Stopped target network.target - Network.
Mar 6 01:41:33.522832 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 6 01:41:33.522908 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 6 01:41:33.531708 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 6 01:41:33.541334 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 6 01:41:33.551924 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 6 01:41:33.558199 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 6 01:41:33.582390 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 6 01:41:33.582509 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 6 01:41:33.593077 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 6 01:41:33.600449 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 6 01:41:33.608263 systemd-networkd[783]: eth0: DHCPv6 lease lost
Mar 6 01:41:33.611961 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 6 01:41:33.615268 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 6 01:41:33.624341 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 6 01:41:33.627629 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 6 01:41:33.635390 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 6 01:41:33.638602 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 6 01:41:33.647378 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 6 01:41:33.650510 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 6 01:41:33.660539 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 6 01:41:33.660632 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 01:41:33.670914 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 6 01:41:33.671007 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 6 01:41:33.698429 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 6 01:41:33.702910 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 6 01:41:33.702997 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 6 01:41:33.713256 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 6 01:41:33.713320 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 6 01:41:33.720703 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 6 01:41:33.720799 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 6 01:41:33.726477 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 6 01:41:33.726549 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 01:41:33.735637 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 01:41:33.777337 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 6 01:41:33.777559 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 6 01:41:33.792424 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 6 01:41:33.792718 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 01:41:33.803842 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 6 01:41:33.803963 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 6 01:41:33.810578 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 6 01:41:33.810639 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 01:41:33.814063 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 6 01:41:33.814217 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 01:41:33.826983 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 6 01:41:33.827063 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 6 01:41:33.839392 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 6 01:41:33.839472 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 6 01:41:33.857411 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 6 01:41:33.861254 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 6 01:41:33.861331 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 01:41:33.869354 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 6 01:41:33.869414 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 01:41:33.877230 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 6 01:41:33.877286 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 01:41:33.881471 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 01:41:33.881527 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 01:41:33.889837 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 6 01:41:33.889988 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 6 01:41:33.897850 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 6 01:41:33.925440 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 6 01:41:33.935566 systemd[1]: Switching root.
Mar 6 01:41:33.971642 systemd-journald[195]: Journal stopped
Mar 6 01:41:35.483547 systemd-journald[195]: Received SIGTERM from PID 1 (systemd).
Mar 6 01:41:35.483616 kernel: SELinux: policy capability network_peer_controls=1
Mar 6 01:41:35.483636 kernel: SELinux: policy capability open_perms=1
Mar 6 01:41:35.483648 kernel: SELinux: policy capability extended_socket_class=1
Mar 6 01:41:35.483672 kernel: SELinux: policy capability always_check_network=0
Mar 6 01:41:35.483684 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 6 01:41:35.483695 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 6 01:41:35.483706 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 6 01:41:35.483717 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 6 01:41:35.483778 kernel: audit: type=1403 audit(1772761294.157:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 6 01:41:35.483794 systemd[1]: Successfully loaded SELinux policy in 53.866ms.
Mar 6 01:41:35.483820 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 14.471ms.
Mar 6 01:41:35.483834 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 6 01:41:35.483849 systemd[1]: Detected virtualization kvm.
Mar 6 01:41:35.483861 systemd[1]: Detected architecture x86-64.
Mar 6 01:41:35.483873 systemd[1]: Detected first boot.
Mar 6 01:41:35.483885 systemd[1]: Initializing machine ID from VM UUID.
Mar 6 01:41:35.483896 zram_generator::config[1056]: No configuration found.
Mar 6 01:41:35.483914 systemd[1]: Populated /etc with preset unit settings.
Mar 6 01:41:35.483926 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 6 01:41:35.483938 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 6 01:41:35.483955 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 6 01:41:35.483967 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 6 01:41:35.483979 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 6 01:41:35.483990 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 6 01:41:35.484002 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 6 01:41:35.484014 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 6 01:41:35.484026 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 6 01:41:35.484037 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 6 01:41:35.484052 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 6 01:41:35.484064 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 01:41:35.484076 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 01:41:35.484087 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 6 01:41:35.484099 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 6 01:41:35.484111 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 6 01:41:35.484123 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 01:41:35.484191 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 6 01:41:35.484204 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 01:41:35.484220 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 6 01:41:35.484231 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 6 01:41:35.484243 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 6 01:41:35.484255 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 6 01:41:35.484268 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 6 01:41:35.484280 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 6 01:41:35.484292 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 01:41:35.484304 systemd[1]: Reached target swap.target - Swaps.
Mar 6 01:41:35.484319 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 6 01:41:35.484331 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 6 01:41:35.484343 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 01:41:35.484355 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 01:41:35.484367 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 01:41:35.484378 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 6 01:41:35.484390 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 6 01:41:35.484402 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 6 01:41:35.484413 systemd[1]: Mounting media.mount - External Media Directory...
Mar 6 01:41:35.484428 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 6 01:41:35.484440 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 6 01:41:35.484451 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 6 01:41:35.484463 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 6 01:41:35.484474 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 6 01:41:35.484486 systemd[1]: Reached target machines.target - Containers.
Mar 6 01:41:35.484497 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 6 01:41:35.484509 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 01:41:35.484525 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 6 01:41:35.484536 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 6 01:41:35.484548 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 01:41:35.484560 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 6 01:41:35.484571 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 01:41:35.484583 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 6 01:41:35.484622 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 01:41:35.484634 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 6 01:41:35.484646 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 6 01:41:35.484661 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 6 01:41:35.484673 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 6 01:41:35.484685 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 6 01:41:35.484718 kernel: loop: module loaded
Mar 6 01:41:35.484769 kernel: fuse: init (API version 7.39)
Mar 6 01:41:35.484782 kernel: ACPI: bus type drm_connector registered
Mar 6 01:41:35.484793 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 6 01:41:35.484805 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 6 01:41:35.484817 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 6 01:41:35.484878 systemd-journald[1140]: Collecting audit messages is disabled.
Mar 6 01:41:35.484902 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 6 01:41:35.484914 systemd-journald[1140]: Journal started
Mar 6 01:41:35.484936 systemd-journald[1140]: Runtime Journal (/run/log/journal/657639812852424480c50bacafc6a5bf) is 6.0M, max 48.4M, 42.3M free.
Mar 6 01:41:34.931237 systemd[1]: Queued start job for default target multi-user.target.
Mar 6 01:41:34.952406 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 6 01:41:34.953402 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 6 01:41:34.953927 systemd[1]: systemd-journald.service: Consumed 1.457s CPU time.
Mar 6 01:41:35.501198 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 01:41:35.508996 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 6 01:41:35.509029 systemd[1]: Stopped verity-setup.service.
Mar 6 01:41:35.519197 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 6 01:41:35.529033 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 6 01:41:35.530274 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 6 01:41:35.533835 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 6 01:41:35.537574 systemd[1]: Mounted media.mount - External Media Directory.
Mar 6 01:41:35.541005 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 6 01:41:35.544775 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 6 01:41:35.548498 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 6 01:41:35.552081 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 6 01:41:35.556467 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 01:41:35.560931 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 6 01:41:35.561221 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 6 01:41:35.565615 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 01:41:35.565911 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 01:41:35.570058 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 6 01:41:35.570366 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 6 01:41:35.574285 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 01:41:35.574514 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 01:41:35.578868 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 6 01:41:35.579101 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 6 01:41:35.583030 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 01:41:35.583365 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 01:41:35.587299 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 6 01:41:35.591315 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 01:41:35.595773 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 6 01:41:35.613439 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 6 01:41:35.626286 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 6 01:41:35.631466 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 6 01:41:35.635023 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 6 01:41:35.635084 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 6 01:41:35.639591 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 6 01:41:35.645101 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 6 01:41:35.650343 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 6 01:41:35.653958 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 01:41:35.656376 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 6 01:41:35.661550 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 6 01:41:35.665554 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 6 01:41:35.668590 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 6 01:41:35.672439 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 6 01:41:35.675536 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 6 01:41:35.680884 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 6 01:41:35.689582 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 6 01:41:35.697853 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 01:41:35.704563 systemd-journald[1140]: Time spent on flushing to /var/log/journal/657639812852424480c50bacafc6a5bf is 12.578ms for 945 entries.
Mar 6 01:41:35.704563 systemd-journald[1140]: System Journal (/var/log/journal/657639812852424480c50bacafc6a5bf) is 8.0M, max 195.6M, 187.6M free.
Mar 6 01:41:35.726472 kernel: loop0: detected capacity change from 0 to 140768
Mar 6 01:41:35.726497 systemd-journald[1140]: Received client request to flush runtime journal.
Mar 6 01:41:35.710358 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 6 01:41:35.716249 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 6 01:41:35.723509 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 6 01:41:35.729642 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 6 01:41:35.735609 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 6 01:41:35.745773 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 6 01:41:35.752426 systemd-tmpfiles[1171]: ACLs are not supported, ignoring.
Mar 6 01:41:35.752444 systemd-tmpfiles[1171]: ACLs are not supported, ignoring.
Mar 6 01:41:35.759646 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 6 01:41:35.769294 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 6 01:41:35.774057 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 6 01:41:35.777455 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 6 01:41:35.782859 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 6 01:41:35.799479 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 6 01:41:35.804653 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 6 01:41:35.805826 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 6 01:41:35.819287 udevadm[1185]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 6 01:41:35.824385 kernel: loop1: detected capacity change from 0 to 219192
Mar 6 01:41:35.840477 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 6 01:41:35.851474 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 6 01:41:35.875685 systemd-tmpfiles[1193]: ACLs are not supported, ignoring.
Mar 6 01:41:35.875766 systemd-tmpfiles[1193]: ACLs are not supported, ignoring.
Mar 6 01:41:35.878318 kernel: loop2: detected capacity change from 0 to 142488
Mar 6 01:41:35.882205 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 01:41:35.942412 kernel: loop3: detected capacity change from 0 to 140768
Mar 6 01:41:35.969209 kernel: loop4: detected capacity change from 0 to 219192
Mar 6 01:41:35.988498 kernel: loop5: detected capacity change from 0 to 142488
Mar 6 01:41:36.013337 (sd-merge)[1197]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Mar 6 01:41:36.015054 (sd-merge)[1197]: Merged extensions into '/usr'.
Mar 6 01:41:36.020035 systemd[1]: Reloading requested from client PID 1170 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 6 01:41:36.020304 systemd[1]: Reloading...
Mar 6 01:41:36.098116 zram_generator::config[1226]: No configuration found.
Mar 6 01:41:36.120914 ldconfig[1165]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 6 01:41:36.264045 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 6 01:41:36.332500 systemd[1]: Reloading finished in 311 ms.
Mar 6 01:41:36.376470 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 6 01:41:36.381295 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 6 01:41:36.386265 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 6 01:41:36.413522 systemd[1]: Starting ensure-sysext.service...
Mar 6 01:41:36.417596 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 6 01:41:36.423289 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 01:41:36.429007 systemd[1]: Reloading requested from client PID 1261 ('systemctl') (unit ensure-sysext.service)...
Mar 6 01:41:36.429019 systemd[1]: Reloading...
Mar 6 01:41:36.450676 systemd-tmpfiles[1262]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 6 01:41:36.451568 systemd-tmpfiles[1262]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 6 01:41:36.452856 systemd-tmpfiles[1262]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 6 01:41:36.453314 systemd-tmpfiles[1262]: ACLs are not supported, ignoring.
Mar 6 01:41:36.453460 systemd-tmpfiles[1262]: ACLs are not supported, ignoring.
Mar 6 01:41:36.457724 systemd-tmpfiles[1262]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 01:41:36.457873 systemd-tmpfiles[1262]: Skipping /boot
Mar 6 01:41:36.465607 systemd-udevd[1263]: Using default interface naming scheme 'v255'.
Mar 6 01:41:36.482468 systemd-tmpfiles[1262]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 01:41:36.483265 systemd-tmpfiles[1262]: Skipping /boot
Mar 6 01:41:36.500240 zram_generator::config[1288]: No configuration found.
Mar 6 01:41:36.572316 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 35 scanned by (udev-worker) (1300)
Mar 6 01:41:36.631194 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Mar 6 01:41:36.637263 kernel: ACPI: button: Power Button [PWRF]
Mar 6 01:41:36.654399 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 6 01:41:36.654894 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 6 01:41:36.655315 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 6 01:41:36.689244 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Mar 6 01:41:36.698063 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 6 01:41:36.723257 kernel: mousedev: PS/2 mouse device common for all mice
Mar 6 01:41:36.847301 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 6 01:41:36.852053 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 6 01:41:36.852417 systemd[1]: Reloading finished in 422 ms.
Mar 6 01:41:36.869839 kernel: kvm_amd: TSC scaling supported
Mar 6 01:41:36.869916 kernel: kvm_amd: Nested Virtualization enabled
Mar 6 01:41:36.869936 kernel: kvm_amd: Nested Paging enabled
Mar 6 01:41:36.873790 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Mar 6 01:41:36.873839 kernel: kvm_amd: PMU virtualization is disabled
Mar 6 01:41:36.928013 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 01:41:36.947536 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 01:41:36.965189 kernel: EDAC MC: Ver: 3.0.0
Mar 6 01:41:36.975393 systemd[1]: Finished ensure-sysext.service.
Mar 6 01:41:36.991383 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 6 01:41:37.008876 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 6 01:41:37.020586 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 6 01:41:37.027016 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 6 01:41:37.033193 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 01:41:37.036362 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 6 01:41:37.046491 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 01:41:37.052415 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 6 01:41:37.060562 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 01:41:37.068009 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 01:41:37.073666 lvm[1368]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 6 01:41:37.072497 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 01:41:37.074121 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 6 01:41:37.080655 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 6 01:41:37.091939 augenrules[1383]: No rules
Mar 6 01:41:37.092386 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 6 01:41:37.103214 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 6 01:41:37.119394 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 6 01:41:37.125115 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 6 01:41:37.131464 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 6 01:41:37.137696 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 6 01:41:37.139591 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 6 01:41:37.145877 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 6 01:41:37.152915 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 01:41:37.153428 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 01:41:37.159880 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 6 01:41:37.165876 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 6 01:41:37.166249 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 6 01:41:37.170635 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 01:41:37.171022 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 01:41:37.175694 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 01:41:37.175992 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 01:41:37.180451 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 6 01:41:37.185296 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 6 01:41:37.212367 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 01:41:37.227422 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 6 01:41:37.227573 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 6 01:41:37.227650 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 6 01:41:37.229890 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 6 01:41:37.234635 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 6 01:41:37.235945 lvm[1405]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 6 01:41:37.238369 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 6 01:41:37.239347 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 6 01:41:37.257386 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 6 01:41:37.287594 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 6 01:41:37.292673 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 6 01:41:37.381657 systemd-networkd[1388]: lo: Link UP
Mar 6 01:41:37.382076 systemd-networkd[1388]: lo: Gained carrier
Mar 6 01:41:37.384688 systemd-networkd[1388]: Enumeration completed
Mar 6 01:41:37.385970 systemd-networkd[1388]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 6 01:41:37.386031 systemd-networkd[1388]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 6 01:41:37.387429 systemd-networkd[1388]: eth0: Link UP
Mar 6 01:41:37.387491 systemd-networkd[1388]: eth0: Gained carrier
Mar 6 01:41:37.387539 systemd-networkd[1388]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 6 01:41:37.393373 systemd-resolved[1389]: Positive Trust Anchors:
Mar 6 01:41:37.393417 systemd-resolved[1389]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 6 01:41:37.393445 systemd-resolved[1389]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 6 01:41:37.398977 systemd-resolved[1389]: Defaulting to hostname 'linux'.
Mar 6 01:41:37.403212 systemd-networkd[1388]: eth0: DHCPv4 address 10.0.0.120/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 6 01:41:37.404049 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Mar 6 01:41:38.148320 systemd-timesyncd[1391]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Mar 6 01:41:38.148424 systemd-timesyncd[1391]: Initial clock synchronization to Fri 2026-03-06 01:41:38.148127 UTC.
Mar 6 01:41:38.148460 systemd-resolved[1389]: Clock change detected. Flushing caches.
Mar 6 01:41:38.227459 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 6 01:41:38.233463 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 6 01:41:38.238470 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 6 01:41:38.244535 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 01:41:38.251692 systemd[1]: Reached target network.target - Network.
Mar 6 01:41:38.256950 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 6 01:41:38.263208 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 6 01:41:38.269680 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 6 01:41:38.274774 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 6 01:41:38.279623 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 6 01:41:38.284574 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 6 01:41:38.284640 systemd[1]: Reached target paths.target - Path Units.
Mar 6 01:41:38.288298 systemd[1]: Reached target time-set.target - System Time Set.
Mar 6 01:41:38.292497 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 6 01:41:38.297196 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 6 01:41:38.302934 systemd[1]: Reached target timers.target - Timer Units.
Mar 6 01:41:38.307525 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 6 01:41:38.313731 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 6 01:41:38.330394 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 6 01:41:38.336365 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 6 01:41:38.341556 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 6 01:41:38.346999 systemd[1]: Reached target sockets.target - Socket Units.
Mar 6 01:41:38.351515 systemd[1]: Reached target basic.target - Basic System.
Mar 6 01:41:38.355363 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 6 01:41:38.355420 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 6 01:41:38.357096 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 6 01:41:38.363147 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 6 01:41:38.369154 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 6 01:41:38.375461 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 6 01:41:38.380086 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 6 01:41:38.382091 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 6 01:41:38.384568 jq[1428]: false
Mar 6 01:41:38.386139 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 6 01:41:38.396537 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 6 01:41:38.404547 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 6 01:41:38.405969 dbus-daemon[1427]: [system] SELinux support is enabled
Mar 6 01:41:38.412329 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 6 01:41:38.417593 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 6 01:41:38.418377 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 6 01:41:38.421477 systemd[1]: Starting update-engine.service - Update Engine...
Mar 6 01:41:38.426990 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 6 01:41:38.429804 extend-filesystems[1429]: Found loop3
Mar 6 01:41:38.429804 extend-filesystems[1429]: Found loop4
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found loop5
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found sr0
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda1
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda2
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda3
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found usr
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda4
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda6
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda7
Mar 6 01:41:38.444458 extend-filesystems[1429]: Found vda9
Mar 6 01:41:38.444458 extend-filesystems[1429]: Checking size of /dev/vda9
Mar 6 01:41:38.487991 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Mar 6 01:41:38.488099 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 35 scanned by (udev-worker) (1300)
Mar 6 01:41:38.436966 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 6 01:41:38.489241 extend-filesystems[1429]: Resized partition /dev/vda9
Mar 6 01:41:38.460777 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 6 01:41:38.493168 extend-filesystems[1451]: resize2fs 1.47.1 (20-May-2024)
Mar 6 01:41:38.497398 jq[1444]: true
Mar 6 01:41:38.461162 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 6 01:41:38.461679 systemd[1]: motdgen.service: Deactivated successfully.
Mar 6 01:41:38.461946 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 6 01:41:38.508844 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 6 01:41:38.509135 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 6 01:41:38.533952 update_engine[1441]: I20260306 01:41:38.533525 1441 main.cc:92] Flatcar Update Engine starting
Mar 6 01:41:38.541930 (ntainerd)[1455]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 6 01:41:38.543911 update_engine[1441]: I20260306 01:41:38.543416 1441 update_check_scheduler.cc:74] Next update check in 6m33s
Mar 6 01:41:38.547653 tar[1452]: linux-amd64/LICENSE
Mar 6 01:41:38.548061 tar[1452]: linux-amd64/helm
Mar 6 01:41:38.563887 jq[1454]: true
Mar 6 01:41:38.569201 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Mar 6 01:41:38.587759 systemd[1]: Started update-engine.service - Update Engine.
Mar 6 01:41:38.593025 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 6 01:41:38.593064 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 6 01:41:38.598917 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 6 01:41:38.598950 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 6 01:41:38.602192 systemd-logind[1440]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 6 01:41:38.602236 systemd-logind[1440]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 6 01:41:38.604674 extend-filesystems[1451]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 6 01:41:38.604674 extend-filesystems[1451]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 6 01:41:38.604674 extend-filesystems[1451]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Mar 6 01:41:38.625955 extend-filesystems[1429]: Resized filesystem in /dev/vda9
Mar 6 01:41:38.605824 systemd-logind[1440]: New seat seat0.
Mar 6 01:41:38.630391 sshd_keygen[1446]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 6 01:41:38.640598 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 6 01:41:38.647761 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 6 01:41:38.656537 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 6 01:41:38.656922 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 6 01:41:38.675692 bash[1486]: Updated "/home/core/.ssh/authorized_keys"
Mar 6 01:41:38.677000 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 6 01:41:38.689905 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 6 01:41:38.713777 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 6 01:41:38.720721 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 6 01:41:38.723358 locksmithd[1472]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 6 01:41:38.731925 systemd[1]: issuegen.service: Deactivated successfully.
Mar 6 01:41:38.732406 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 6 01:41:38.744726 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 6 01:41:38.763495 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 6 01:41:38.778011 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 6 01:41:38.784633 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 6 01:41:38.789492 containerd[1455]: time="2026-03-06T01:41:38.785764621Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Mar 6 01:41:38.791025 systemd[1]: Reached target getty.target - Login Prompts.
Mar 6 01:41:38.807415 containerd[1455]: time="2026-03-06T01:41:38.807337131Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810200 containerd[1455]: time="2026-03-06T01:41:38.810126520Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810200 containerd[1455]: time="2026-03-06T01:41:38.810176183Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 6 01:41:38.810200 containerd[1455]: time="2026-03-06T01:41:38.810192132Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 6 01:41:38.810490 containerd[1455]: time="2026-03-06T01:41:38.810428704Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 6 01:41:38.810490 containerd[1455]: time="2026-03-06T01:41:38.810475101Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810589 containerd[1455]: time="2026-03-06T01:41:38.810547125Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810589 containerd[1455]: time="2026-03-06T01:41:38.810584976Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810905 containerd[1455]: time="2026-03-06T01:41:38.810793766Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810905 containerd[1455]: time="2026-03-06T01:41:38.810835965Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810905 containerd[1455]: time="2026-03-06T01:41:38.810882902Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 6 01:41:38.810905 containerd[1455]: time="2026-03-06T01:41:38.810894183Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.811041 containerd[1455]: time="2026-03-06T01:41:38.810998468Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.811396 containerd[1455]: time="2026-03-06T01:41:38.811358470Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 6 01:41:38.811569 containerd[1455]: time="2026-03-06T01:41:38.811528838Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 6 01:41:38.811569 containerd[1455]: time="2026-03-06T01:41:38.811565527Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 6 01:41:38.811706 containerd[1455]: time="2026-03-06T01:41:38.811665594Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 6 01:41:38.811793 containerd[1455]: time="2026-03-06T01:41:38.811755501Z" level=info msg="metadata content store policy set" policy=shared
Mar 6 01:41:38.817109 containerd[1455]: time="2026-03-06T01:41:38.817053825Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 6 01:41:38.817149 containerd[1455]: time="2026-03-06T01:41:38.817136830Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 6 01:41:38.817183 containerd[1455]: time="2026-03-06T01:41:38.817154143Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 6 01:41:38.817327 containerd[1455]: time="2026-03-06T01:41:38.817213183Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 6 01:41:38.817398 containerd[1455]: time="2026-03-06T01:41:38.817356862Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 6 01:41:38.817569 containerd[1455]: time="2026-03-06T01:41:38.817497113Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 6 01:41:38.817790 containerd[1455]: time="2026-03-06T01:41:38.817734106Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 6 01:41:38.818004 containerd[1455]: time="2026-03-06T01:41:38.817951271Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 6 01:41:38.818004 containerd[1455]: time="2026-03-06T01:41:38.817996696Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 6 01:41:38.818049 containerd[1455]: time="2026-03-06T01:41:38.818009931Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 6 01:41:38.818049 containerd[1455]: time="2026-03-06T01:41:38.818022585Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818049 containerd[1455]: time="2026-03-06T01:41:38.818034466Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818049 containerd[1455]: time="2026-03-06T01:41:38.818045096Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818126 containerd[1455]: time="2026-03-06T01:41:38.818057590Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818126 containerd[1455]: time="2026-03-06T01:41:38.818076705Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818126 containerd[1455]: time="2026-03-06T01:41:38.818088457Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818126 containerd[1455]: time="2026-03-06T01:41:38.818099668Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818126 containerd[1455]: time="2026-03-06T01:41:38.818110609Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818127921Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818139773Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818150724Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818161284Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818171492Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818182363Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818192331Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818202821Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818213792Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818227036Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818310 containerd[1455]: time="2026-03-06T01:41:38.818303579Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818325740Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818337703Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818351519Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818368340Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818378759Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818387856Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818457466Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818473817Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818483435Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 6 01:41:38.818498 containerd[1455]: time="2026-03-06T01:41:38.818493574Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 6 01:41:38.818646 containerd[1455]: time="2026-03-06T01:41:38.818502561Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818646 containerd[1455]: time="2026-03-06T01:41:38.818550580Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 6 01:41:38.818646 containerd[1455]: time="2026-03-06T01:41:38.818561080Z" level=info msg="NRI interface is disabled by configuration."
Mar 6 01:41:38.818646 containerd[1455]: time="2026-03-06T01:41:38.818570197Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 6 01:41:38.818897 containerd[1455]: time="2026-03-06T01:41:38.818780610Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 6 01:41:38.819071 containerd[1455]: time="2026-03-06T01:41:38.818903219Z" level=info msg="Connect containerd service"
Mar 6 01:41:38.819071 containerd[1455]: time="2026-03-06T01:41:38.818936972Z" level=info msg="using legacy CRI server"
Mar 6 01:41:38.819071 containerd[1455]: time="2026-03-06T01:41:38.818944797Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 6 01:41:38.819071 containerd[1455]: time="2026-03-06T01:41:38.819010319Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 6 01:41:38.819696 containerd[1455]: time="2026-03-06T01:41:38.819651717Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 6 01:41:38.820126 containerd[1455]: time="2026-03-06T01:41:38.820047366Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 6 01:41:38.820126 containerd[1455]: time="2026-03-06T01:41:38.820063490Z" level=info msg="Start subscribing containerd event"
Mar 6 01:41:38.820126 containerd[1455]: time="2026-03-06T01:41:38.820164284Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 6 01:41:38.820298 containerd[1455]: time="2026-03-06T01:41:38.820172483Z" level=info msg="Start recovering state"
Mar 6 01:41:38.820352 containerd[1455]: time="2026-03-06T01:41:38.820334165Z" level=info msg="Start event monitor"
Mar 6 01:41:38.821480 containerd[1455]: time="2026-03-06T01:41:38.821450319Z" level=info msg="Start snapshots syncer"
Mar 6 01:41:38.821596 containerd[1455]: time="2026-03-06T01:41:38.821527353Z" level=info msg="Start cni network conf syncer for default"
Mar 6 01:41:38.821647 containerd[1455]: time="2026-03-06T01:41:38.821634162Z" level=info msg="Start streaming server"
Mar 6 01:41:38.822196 containerd[1455]: time="2026-03-06T01:41:38.822128556Z" level=info msg="containerd successfully booted in 0.037406s"
Mar 6 01:41:38.822307 systemd[1]: Started containerd.service - containerd container runtime.
Mar 6 01:41:39.116621 tar[1452]: linux-amd64/README.md
Mar 6 01:41:39.133209 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 6 01:41:39.930622 systemd-networkd[1388]: eth0: Gained IPv6LL
Mar 6 01:41:39.934896 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 6 01:41:39.940006 systemd[1]: Reached target network-online.target - Network is Online.
Mar 6 01:41:39.959608 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Mar 6 01:41:39.965829 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:41:39.972605 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 6 01:41:40.001606 systemd[1]: coreos-metadata.service: Deactivated successfully.
Mar 6 01:41:40.002102 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Mar 6 01:41:40.007572 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 6 01:41:40.011574 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 6 01:41:40.870628 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:41:40.877056 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 6 01:41:40.877574 (kubelet)[1540]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 01:41:40.884684 systemd[1]: Startup finished in 2.229s (kernel) + 7.360s (initrd) + 6.036s (userspace) = 15.627s.
Mar 6 01:41:41.393548 kubelet[1540]: E0306 01:41:41.393436 1540 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 01:41:41.398159 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 01:41:41.398642 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 01:41:41.399646 systemd[1]: kubelet.service: Consumed 1.088s CPU time.
Mar 6 01:41:41.703143 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 6 01:41:41.704931 systemd[1]: Started sshd@0-10.0.0.120:22-10.0.0.1:50930.service - OpenSSH per-connection server daemon (10.0.0.1:50930).
Mar 6 01:41:41.778605 sshd[1554]: Accepted publickey for core from 10.0.0.1 port 50930 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:41.781508 sshd[1554]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:41.809689 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 6 01:41:41.880564 kernel: hrtimer: interrupt took 9059918 ns
Mar 6 01:41:41.880645 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 6 01:41:41.885044 systemd-logind[1440]: New session 1 of user core.
Mar 6 01:41:42.161482 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 6 01:41:42.184964 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 6 01:41:42.202153 (systemd)[1558]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 6 01:41:42.430501 systemd[1558]: Queued start job for default target default.target.
Mar 6 01:41:42.450432 systemd[1558]: Created slice app.slice - User Application Slice.
Mar 6 01:41:42.450567 systemd[1558]: Reached target paths.target - Paths.
Mar 6 01:41:42.450590 systemd[1558]: Reached target timers.target - Timers.
Mar 6 01:41:42.461429 systemd[1558]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 6 01:41:42.485239 systemd[1558]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 6 01:41:42.485548 systemd[1558]: Reached target sockets.target - Sockets.
Mar 6 01:41:42.485600 systemd[1558]: Reached target basic.target - Basic System.
Mar 6 01:41:42.485647 systemd[1558]: Reached target default.target - Main User Target.
Mar 6 01:41:42.485689 systemd[1558]: Startup finished in 239ms.
Mar 6 01:41:42.487007 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 6 01:41:42.498841 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 6 01:41:42.575942 systemd[1]: Started sshd@1-10.0.0.120:22-10.0.0.1:48454.service - OpenSSH per-connection server daemon (10.0.0.1:48454).
Mar 6 01:41:42.647716 sshd[1569]: Accepted publickey for core from 10.0.0.1 port 48454 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:42.650831 sshd[1569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:42.661811 systemd-logind[1440]: New session 2 of user core.
Mar 6 01:41:42.677563 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 6 01:41:42.744212 sshd[1569]: pam_unix(sshd:session): session closed for user core
Mar 6 01:41:42.778788 systemd[1]: sshd@1-10.0.0.120:22-10.0.0.1:48454.service: Deactivated successfully.
Mar 6 01:41:42.784361 systemd[1]: session-2.scope: Deactivated successfully.
Mar 6 01:41:42.786735 systemd-logind[1440]: Session 2 logged out. Waiting for processes to exit.
Mar 6 01:41:42.806714 systemd[1]: Started sshd@2-10.0.0.120:22-10.0.0.1:48466.service - OpenSSH per-connection server daemon (10.0.0.1:48466).
Mar 6 01:41:42.812201 systemd-logind[1440]: Removed session 2.
Mar 6 01:41:42.886513 sshd[1576]: Accepted publickey for core from 10.0.0.1 port 48466 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:42.889414 sshd[1576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:42.903240 systemd-logind[1440]: New session 3 of user core.
Mar 6 01:41:42.919706 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 6 01:41:42.990688 sshd[1576]: pam_unix(sshd:session): session closed for user core
Mar 6 01:41:43.021775 systemd[1]: sshd@2-10.0.0.120:22-10.0.0.1:48466.service: Deactivated successfully.
Mar 6 01:41:43.025028 systemd[1]: session-3.scope: Deactivated successfully.
Mar 6 01:41:43.027434 systemd-logind[1440]: Session 3 logged out. Waiting for processes to exit.
Mar 6 01:41:43.066213 systemd[1]: Started sshd@3-10.0.0.120:22-10.0.0.1:48474.service - OpenSSH per-connection server daemon (10.0.0.1:48474).
Mar 6 01:41:43.068638 systemd-logind[1440]: Removed session 3.
Mar 6 01:41:43.125013 sshd[1583]: Accepted publickey for core from 10.0.0.1 port 48474 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:43.127815 sshd[1583]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:43.149031 systemd-logind[1440]: New session 4 of user core.
Mar 6 01:41:43.173099 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 6 01:41:43.241677 sshd[1583]: pam_unix(sshd:session): session closed for user core
Mar 6 01:41:43.267463 systemd[1]: sshd@3-10.0.0.120:22-10.0.0.1:48474.service: Deactivated successfully.
Mar 6 01:41:43.275572 systemd[1]: session-4.scope: Deactivated successfully.
Mar 6 01:41:43.278691 systemd-logind[1440]: Session 4 logged out. Waiting for processes to exit.
Mar 6 01:41:43.292972 systemd[1]: Started sshd@4-10.0.0.120:22-10.0.0.1:48490.service - OpenSSH per-connection server daemon (10.0.0.1:48490).
Mar 6 01:41:43.294838 systemd-logind[1440]: Removed session 4.
Mar 6 01:41:43.341065 sshd[1590]: Accepted publickey for core from 10.0.0.1 port 48490 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:43.343477 sshd[1590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:43.353680 systemd-logind[1440]: New session 5 of user core.
Mar 6 01:41:43.374576 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 6 01:41:43.528822 sudo[1593]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 6 01:41:43.529406 sudo[1593]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 01:41:43.571175 sudo[1593]: pam_unix(sudo:session): session closed for user root
Mar 6 01:41:43.574789 sshd[1590]: pam_unix(sshd:session): session closed for user core
Mar 6 01:41:43.608809 systemd[1]: sshd@4-10.0.0.120:22-10.0.0.1:48490.service: Deactivated successfully.
Mar 6 01:41:43.611237 systemd[1]: session-5.scope: Deactivated successfully.
Mar 6 01:41:43.612776 systemd-logind[1440]: Session 5 logged out. Waiting for processes to exit.
Mar 6 01:41:43.622805 systemd[1]: Started sshd@5-10.0.0.120:22-10.0.0.1:48506.service - OpenSSH per-connection server daemon (10.0.0.1:48506).
Mar 6 01:41:43.624513 systemd-logind[1440]: Removed session 5.
Mar 6 01:41:43.710932 sshd[1598]: Accepted publickey for core from 10.0.0.1 port 48506 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:43.713477 sshd[1598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:43.720605 systemd-logind[1440]: New session 6 of user core.
Mar 6 01:41:43.734507 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 6 01:41:43.799561 sudo[1602]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 6 01:41:43.800093 sudo[1602]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 01:41:43.806828 sudo[1602]: pam_unix(sudo:session): session closed for user root
Mar 6 01:41:43.815712 sudo[1601]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 6 01:41:43.816229 sudo[1601]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 01:41:43.843634 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 6 01:41:43.866669 auditctl[1605]: No rules
Mar 6 01:41:43.869230 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 6 01:41:43.869646 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 6 01:41:43.872908 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 6 01:41:43.926208 augenrules[1623]: No rules
Mar 6 01:41:43.928420 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 6 01:41:43.929776 sudo[1601]: pam_unix(sudo:session): session closed for user root
Mar 6 01:41:43.932204 sshd[1598]: pam_unix(sshd:session): session closed for user core
Mar 6 01:41:43.975984 systemd[1]: sshd@5-10.0.0.120:22-10.0.0.1:48506.service: Deactivated successfully.
Mar 6 01:41:43.979016 systemd[1]: session-6.scope: Deactivated successfully.
Mar 6 01:41:43.981948 systemd-logind[1440]: Session 6 logged out. Waiting for processes to exit.
Mar 6 01:41:43.992728 systemd[1]: Started sshd@6-10.0.0.120:22-10.0.0.1:48520.service - OpenSSH per-connection server daemon (10.0.0.1:48520).
Mar 6 01:41:43.994524 systemd-logind[1440]: Removed session 6.
Mar 6 01:41:44.039003 sshd[1631]: Accepted publickey for core from 10.0.0.1 port 48520 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:41:44.041455 sshd[1631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:41:44.049046 systemd-logind[1440]: New session 7 of user core.
Mar 6 01:41:44.066392 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 6 01:41:44.141584 sudo[1634]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 6 01:41:44.142626 sudo[1634]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 6 01:41:44.595738 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 6 01:41:44.595992 (dockerd)[1652]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 6 01:41:45.504174 dockerd[1652]: time="2026-03-06T01:41:45.502930876Z" level=info msg="Starting up"
Mar 6 01:41:45.845097 dockerd[1652]: time="2026-03-06T01:41:45.844739980Z" level=info msg="Loading containers: start."
Mar 6 01:41:46.433506 kernel: Initializing XFRM netlink socket
Mar 6 01:41:46.592068 systemd-networkd[1388]: docker0: Link UP
Mar 6 01:41:46.633018 dockerd[1652]: time="2026-03-06T01:41:46.632910639Z" level=info msg="Loading containers: done."
Mar 6 01:41:46.671410 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3360711365-merged.mount: Deactivated successfully.
Mar 6 01:41:46.672540 dockerd[1652]: time="2026-03-06T01:41:46.671396268Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 6 01:41:46.672540 dockerd[1652]: time="2026-03-06T01:41:46.671582495Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 6 01:41:46.672540 dockerd[1652]: time="2026-03-06T01:41:46.671709903Z" level=info msg="Daemon has completed initialization"
Mar 6 01:41:46.749144 dockerd[1652]: time="2026-03-06T01:41:46.748433095Z" level=info msg="API listen on /run/docker.sock"
Mar 6 01:41:46.749212 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 6 01:41:47.536208 containerd[1455]: time="2026-03-06T01:41:47.535953832Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\""
Mar 6 01:41:48.094345 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1827158435.mount: Deactivated successfully.
Mar 6 01:41:49.544847 containerd[1455]: time="2026-03-06T01:41:49.544752200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:49.548226 containerd[1455]: time="2026-03-06T01:41:49.545616142Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=27074497"
Mar 6 01:41:49.549334 containerd[1455]: time="2026-03-06T01:41:49.549226765Z" level=info msg="ImageCreate event name:\"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:49.553218 containerd[1455]: time="2026-03-06T01:41:49.553066834Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:49.554187 containerd[1455]: time="2026-03-06T01:41:49.554145097Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"27071096\" in 2.018133778s"
Mar 6 01:41:49.554187 containerd[1455]: time="2026-03-06T01:41:49.554178330Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\""
Mar 6 01:41:49.555295 containerd[1455]: time="2026-03-06T01:41:49.555221267Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 6 01:41:50.833411 containerd[1455]: time="2026-03-06T01:41:50.833224137Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:50.834623 containerd[1455]: time="2026-03-06T01:41:50.834552368Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=21165823"
Mar 6 01:41:50.835752 containerd[1455]: time="2026-03-06T01:41:50.835664048Z" level=info msg="ImageCreate event name:\"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:50.839749 containerd[1455]: time="2026-03-06T01:41:50.839631166Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:50.840718 containerd[1455]: time="2026-03-06T01:41:50.840606527Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"22822771\" in 1.285358139s"
Mar 6 01:41:50.840718 containerd[1455]: time="2026-03-06T01:41:50.840663092Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\""
Mar 6 01:41:50.841227 containerd[1455]: time="2026-03-06T01:41:50.841208711Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 6 01:41:51.648850 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 6 01:41:51.661006 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:41:51.839451 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:41:51.849714 (kubelet)[1873]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 6 01:41:51.934340 kubelet[1873]: E0306 01:41:51.933978 1873 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 6 01:41:51.940468 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 6 01:41:51.940795 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 6 01:41:51.955755 containerd[1455]: time="2026-03-06T01:41:51.955644167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:51.957396 containerd[1455]: time="2026-03-06T01:41:51.957235539Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=15729824"
Mar 6 01:41:51.958747 containerd[1455]: time="2026-03-06T01:41:51.958650583Z" level=info msg="ImageCreate event name:\"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:51.963083 containerd[1455]: time="2026-03-06T01:41:51.962990183Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:51.964752 containerd[1455]: time="2026-03-06T01:41:51.964678587Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"17386790\" in 1.123440861s"
Mar 6 01:41:51.964810 containerd[1455]: time="2026-03-06T01:41:51.964750942Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\""
Mar 6 01:41:51.965835 containerd[1455]: time="2026-03-06T01:41:51.965781233Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 6 01:41:53.387193 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3247641587.mount: Deactivated successfully.
Mar 6 01:41:53.789546 containerd[1455]: time="2026-03-06T01:41:53.788777733Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:53.790513 containerd[1455]: time="2026-03-06T01:41:53.790426472Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=25861770"
Mar 6 01:41:53.791743 containerd[1455]: time="2026-03-06T01:41:53.791662901Z" level=info msg="ImageCreate event name:\"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:53.794709 containerd[1455]: time="2026-03-06T01:41:53.794603657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:53.796633 containerd[1455]: time="2026-03-06T01:41:53.796530294Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"25860789\" in 1.830685893s"
Mar 6 01:41:53.796633 containerd[1455]: time="2026-03-06T01:41:53.796619251Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\""
Mar 6 01:41:53.797716 containerd[1455]: time="2026-03-06T01:41:53.797553911Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 6 01:41:54.288209 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1387460794.mount: Deactivated successfully.
Mar 6 01:41:55.615313 containerd[1455]: time="2026-03-06T01:41:55.615174808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:55.616660 containerd[1455]: time="2026-03-06T01:41:55.616581927Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388007"
Mar 6 01:41:55.617981 containerd[1455]: time="2026-03-06T01:41:55.617814370Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:55.623383 containerd[1455]: time="2026-03-06T01:41:55.623352809Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:55.624491 containerd[1455]: time="2026-03-06T01:41:55.624412166Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.826801108s"
Mar 6 01:41:55.624491 containerd[1455]: time="2026-03-06T01:41:55.624445468Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\""
Mar 6 01:41:55.625347 containerd[1455]: time="2026-03-06T01:41:55.625179569Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 6 01:41:56.021825 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2098929677.mount: Deactivated successfully.
Mar 6 01:41:56.032443 containerd[1455]: time="2026-03-06T01:41:56.032204801Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:56.033830 containerd[1455]: time="2026-03-06T01:41:56.033627698Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321218"
Mar 6 01:41:56.035480 containerd[1455]: time="2026-03-06T01:41:56.035231272Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:56.039029 containerd[1455]: time="2026-03-06T01:41:56.038990352Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:56.040209 containerd[1455]: time="2026-03-06T01:41:56.040133908Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 414.927478ms"
Mar 6 01:41:56.040209 containerd[1455]: time="2026-03-06T01:41:56.040160488Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\""
Mar 6 01:41:56.041013 containerd[1455]: time="2026-03-06T01:41:56.040956019Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 6 01:41:56.488421 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3360124524.mount: Deactivated successfully.
Mar 6 01:41:57.504723 containerd[1455]: time="2026-03-06T01:41:57.504554825Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:57.506036 containerd[1455]: time="2026-03-06T01:41:57.505968494Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22860674"
Mar 6 01:41:57.507432 containerd[1455]: time="2026-03-06T01:41:57.507359148Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:57.512464 containerd[1455]: time="2026-03-06T01:41:57.512366339Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:41:57.514137 containerd[1455]: time="2026-03-06T01:41:57.514068840Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 1.473042019s"
Mar 6 01:41:57.514207 containerd[1455]: time="2026-03-06T01:41:57.514134884Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\""
Mar 6 01:42:00.759713 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:42:00.769738 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:42:00.811503 systemd[1]: Reloading requested from client PID 2039 ('systemctl') (unit session-7.scope)...
Mar 6 01:42:00.811564 systemd[1]: Reloading...
Mar 6 01:42:00.935357 zram_generator::config[2081]: No configuration found.
Mar 6 01:42:01.085730 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 6 01:42:01.177105 systemd[1]: Reloading finished in 364 ms.
Mar 6 01:42:01.278684 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:42:01.285000 systemd[1]: kubelet.service: Deactivated successfully.
Mar 6 01:42:01.285509 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:42:01.296017 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:42:01.486012 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:42:01.507174 (kubelet)[2128]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 6 01:42:01.587903 kubelet[2128]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 6 01:42:01.587903 kubelet[2128]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 6 01:42:01.588422 kubelet[2128]: I0306 01:42:01.587950 2128 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 6 01:42:01.869621 kubelet[2128]: I0306 01:42:01.869419 2128 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 6 01:42:01.869621 kubelet[2128]: I0306 01:42:01.869507 2128 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 6 01:42:01.869621 kubelet[2128]: I0306 01:42:01.869558 2128 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 6 01:42:01.869621 kubelet[2128]: I0306 01:42:01.869578 2128 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 6 01:42:01.870082 kubelet[2128]: I0306 01:42:01.870005 2128 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 6 01:42:01.909552 kubelet[2128]: E0306 01:42:01.909490 2128 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.120:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 6 01:42:01.909708 kubelet[2128]: I0306 01:42:01.909601 2128 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 01:42:01.917362 kubelet[2128]: E0306 01:42:01.917222 2128 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 6 01:42:01.917535 kubelet[2128]: I0306 01:42:01.917455 2128 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 6 01:42:01.930109 kubelet[2128]: I0306 01:42:01.929971 2128 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 01:42:01.931143 kubelet[2128]: I0306 01:42:01.930990 2128 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 01:42:01.931202 kubelet[2128]: I0306 01:42:01.931048 2128 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 01:42:01.931477 kubelet[2128]: I0306 01:42:01.931205 2128 topology_manager.go:138] "Creating topology manager with none policy"
Mar 6 01:42:01.931477 kubelet[2128]: I0306 01:42:01.931213 2128 container_manager_linux.go:306] "Creating device plugin manager"
Mar 6 01:42:01.931477 kubelet[2128]: I0306 01:42:01.931454 2128 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 01:42:01.934811 kubelet[2128]: I0306 01:42:01.934699 2128 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 01:42:01.935480 kubelet[2128]: I0306 01:42:01.935364 2128 kubelet.go:475] "Attempting to sync node with API server"
Mar 6 01:42:01.935480 kubelet[2128]: I0306 01:42:01.935408 2128 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 01:42:01.935480 kubelet[2128]: I0306 01:42:01.935442 2128 kubelet.go:387] "Adding apiserver pod source"
Mar 6 01:42:01.935480 kubelet[2128]: I0306 01:42:01.935468 2128 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 01:42:01.937623 kubelet[2128]: E0306 01:42:01.937209 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.120:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 6 01:42:01.940469 kubelet[2128]: E0306 01:42:01.940328 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.120:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 6 01:42:01.941440 kubelet[2128]: I0306 01:42:01.941230 2128 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21"
apiVersion="v1" Mar 6 01:42:01.944215 kubelet[2128]: I0306 01:42:01.944187 2128 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 6 01:42:01.944404 kubelet[2128]: I0306 01:42:01.944390 2128 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 6 01:42:01.944531 kubelet[2128]: W0306 01:42:01.944510 2128 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 6 01:42:01.950759 kubelet[2128]: I0306 01:42:01.950416 2128 server.go:1262] "Started kubelet" Mar 6 01:42:01.954023 kubelet[2128]: I0306 01:42:01.953816 2128 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 6 01:42:01.955419 kubelet[2128]: I0306 01:42:01.955369 2128 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 6 01:42:01.958671 kubelet[2128]: I0306 01:42:01.958339 2128 server.go:310] "Adding debug handlers to kubelet server" Mar 6 01:42:01.961351 kubelet[2128]: I0306 01:42:01.958633 2128 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 6 01:42:01.961476 kubelet[2128]: I0306 01:42:01.958846 2128 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 6 01:42:01.961621 kubelet[2128]: I0306 01:42:01.961436 2128 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 6 01:42:01.962441 kubelet[2128]: E0306 01:42:01.962330 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 6 01:42:01.962569 kubelet[2128]: E0306 01:42:01.957584 2128 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.120:6443/api/v1/namespaces/default/events\": dial 
tcp 10.0.0.120:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.189a1d07a267619a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2026-03-06 01:42:01.95037225 +0000 UTC m=+0.434784609,LastTimestamp:2026-03-06 01:42:01.95037225 +0000 UTC m=+0.434784609,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 6 01:42:01.963678 kubelet[2128]: I0306 01:42:01.963659 2128 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 6 01:42:01.963845 kubelet[2128]: I0306 01:42:01.961556 2128 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 6 01:42:01.965056 kubelet[2128]: E0306 01:42:01.964781 2128 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.120:6443: connect: connection refused" interval="200ms" Mar 6 01:42:01.965196 kubelet[2128]: I0306 01:42:01.965180 2128 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 6 01:42:01.965500 kubelet[2128]: I0306 01:42:01.965483 2128 reconciler.go:29] "Reconciler: start to sync state" Mar 6 01:42:01.965798 kubelet[2128]: I0306 01:42:01.965716 2128 factory.go:223] Registration of the systemd container factory successfully Mar 6 01:42:01.966052 kubelet[2128]: I0306 01:42:01.965968 2128 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 6 01:42:01.966700 kubelet[2128]: E0306 
01:42:01.966674 2128 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 6 01:42:01.966827 kubelet[2128]: E0306 01:42:01.966540 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.120:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 6 01:42:01.973568 kubelet[2128]: I0306 01:42:01.973519 2128 factory.go:223] Registration of the containerd container factory successfully Mar 6 01:42:02.006836 kubelet[2128]: I0306 01:42:02.006577 2128 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 6 01:42:02.006836 kubelet[2128]: I0306 01:42:02.006597 2128 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 6 01:42:02.006836 kubelet[2128]: I0306 01:42:02.006671 2128 state_mem.go:36] "Initialized new in-memory state store" Mar 6 01:42:02.007347 kubelet[2128]: I0306 01:42:02.007325 2128 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 6 01:42:02.011536 kubelet[2128]: I0306 01:42:02.009789 2128 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 6 01:42:02.011536 kubelet[2128]: I0306 01:42:02.009809 2128 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 6 01:42:02.011536 kubelet[2128]: I0306 01:42:02.009832 2128 kubelet.go:2428] "Starting kubelet main sync loop" Mar 6 01:42:02.011536 kubelet[2128]: E0306 01:42:02.009916 2128 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 6 01:42:02.012229 kubelet[2128]: I0306 01:42:02.012150 2128 policy_none.go:49] "None policy: Start" Mar 6 01:42:02.012441 kubelet[2128]: I0306 01:42:02.012241 2128 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 6 01:42:02.012661 kubelet[2128]: I0306 01:42:02.012542 2128 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 6 01:42:02.013011 kubelet[2128]: E0306 01:42:02.012989 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.120:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 6 01:42:02.016679 kubelet[2128]: I0306 01:42:02.016598 2128 policy_none.go:47] "Start" Mar 6 01:42:02.031842 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 6 01:42:02.050414 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 6 01:42:02.063349 kubelet[2128]: E0306 01:42:02.062997 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 6 01:42:02.068523 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 6 01:42:02.072176 kubelet[2128]: E0306 01:42:02.071112 2128 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 6 01:42:02.072176 kubelet[2128]: I0306 01:42:02.071680 2128 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 6 01:42:02.072176 kubelet[2128]: I0306 01:42:02.071698 2128 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 6 01:42:02.072441 kubelet[2128]: I0306 01:42:02.072227 2128 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 6 01:42:02.074289 kubelet[2128]: E0306 01:42:02.074064 2128 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 6 01:42:02.074289 kubelet[2128]: E0306 01:42:02.074124 2128 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 6 01:42:02.129595 systemd[1]: Created slice kubepods-burstable-pod08fa4cad9539997788b2467194cbc971.slice - libcontainer container kubepods-burstable-pod08fa4cad9539997788b2467194cbc971.slice. Mar 6 01:42:02.153846 kubelet[2128]: E0306 01:42:02.153751 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 6 01:42:02.159402 systemd[1]: Created slice kubepods-burstable-poddb0989cdb653dfec284dd4f35625e9e7.slice - libcontainer container kubepods-burstable-poddb0989cdb653dfec284dd4f35625e9e7.slice. 
Mar 6 01:42:02.166571 kubelet[2128]: E0306 01:42:02.166112 2128 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.120:6443: connect: connection refused" interval="400ms" Mar 6 01:42:02.166571 kubelet[2128]: I0306 01:42:02.166213 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08fa4cad9539997788b2467194cbc971-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"08fa4cad9539997788b2467194cbc971\") " pod="kube-system/kube-apiserver-localhost" Mar 6 01:42:02.166571 kubelet[2128]: I0306 01:42:02.166328 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08fa4cad9539997788b2467194cbc971-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"08fa4cad9539997788b2467194cbc971\") " pod="kube-system/kube-apiserver-localhost" Mar 6 01:42:02.166571 kubelet[2128]: I0306 01:42:02.166355 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08fa4cad9539997788b2467194cbc971-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"08fa4cad9539997788b2467194cbc971\") " pod="kube-system/kube-apiserver-localhost" Mar 6 01:42:02.166571 kubelet[2128]: I0306 01:42:02.166370 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 6 01:42:02.166808 kubelet[2128]: I0306 01:42:02.166384 2128 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 6 01:42:02.166808 kubelet[2128]: I0306 01:42:02.166398 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 6 01:42:02.166808 kubelet[2128]: I0306 01:42:02.166411 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/89efda49e166906783d8d868d41ebb86-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"89efda49e166906783d8d868d41ebb86\") " pod="kube-system/kube-scheduler-localhost" Mar 6 01:42:02.166808 kubelet[2128]: I0306 01:42:02.166423 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 6 01:42:02.166808 kubelet[2128]: I0306 01:42:02.166438 2128 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 6 01:42:02.167964 kubelet[2128]: E0306 01:42:02.167793 2128 
kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 6 01:42:02.173608 kubelet[2128]: I0306 01:42:02.173584 2128 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 6 01:42:02.173611 systemd[1]: Created slice kubepods-burstable-pod89efda49e166906783d8d868d41ebb86.slice - libcontainer container kubepods-burstable-pod89efda49e166906783d8d868d41ebb86.slice. Mar 6 01:42:02.174368 kubelet[2128]: E0306 01:42:02.174234 2128 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.120:6443/api/v1/nodes\": dial tcp 10.0.0.120:6443: connect: connection refused" node="localhost" Mar 6 01:42:02.176488 kubelet[2128]: E0306 01:42:02.176406 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 6 01:42:02.377356 kubelet[2128]: I0306 01:42:02.377109 2128 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 6 01:42:02.377929 kubelet[2128]: E0306 01:42:02.377740 2128 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.120:6443/api/v1/nodes\": dial tcp 10.0.0.120:6443: connect: connection refused" node="localhost" Mar 6 01:42:02.509045 kubelet[2128]: E0306 01:42:02.508633 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:02.509824 containerd[1455]: time="2026-03-06T01:42:02.509728785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:08fa4cad9539997788b2467194cbc971,Namespace:kube-system,Attempt:0,}" Mar 6 01:42:02.512637 kubelet[2128]: E0306 01:42:02.512497 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been 
omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:02.513454 containerd[1455]: time="2026-03-06T01:42:02.513384744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:db0989cdb653dfec284dd4f35625e9e7,Namespace:kube-system,Attempt:0,}" Mar 6 01:42:02.515862 kubelet[2128]: E0306 01:42:02.515786 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:02.516460 containerd[1455]: time="2026-03-06T01:42:02.516385956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:89efda49e166906783d8d868d41ebb86,Namespace:kube-system,Attempt:0,}" Mar 6 01:42:02.568134 kubelet[2128]: E0306 01:42:02.568085 2128 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.120:6443: connect: connection refused" interval="800ms" Mar 6 01:42:02.780863 kubelet[2128]: I0306 01:42:02.780705 2128 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 6 01:42:02.781513 kubelet[2128]: E0306 01:42:02.781025 2128 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.120:6443/api/v1/nodes\": dial tcp 10.0.0.120:6443: connect: connection refused" node="localhost" Mar 6 01:42:02.799990 kubelet[2128]: E0306 01:42:02.799742 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.120:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 6 01:42:02.832717 kubelet[2128]: E0306 01:42:02.832632 2128 reflector.go:205] "Failed to watch" 
err="failed to list *v1.CSIDriver: Get \"https://10.0.0.120:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 6 01:42:02.936371 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2328133743.mount: Deactivated successfully. Mar 6 01:42:02.945382 containerd[1455]: time="2026-03-06T01:42:02.945129935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 01:42:02.951111 containerd[1455]: time="2026-03-06T01:42:02.950769218Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Mar 6 01:42:02.952389 containerd[1455]: time="2026-03-06T01:42:02.952168985Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 01:42:02.953484 containerd[1455]: time="2026-03-06T01:42:02.953439314Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 6 01:42:02.955057 containerd[1455]: time="2026-03-06T01:42:02.954832338Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 01:42:02.956573 containerd[1455]: time="2026-03-06T01:42:02.956406658Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 01:42:02.958019 containerd[1455]: time="2026-03-06T01:42:02.957927996Z" level=info msg="stop pulling image 
registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 6 01:42:02.960712 containerd[1455]: time="2026-03-06T01:42:02.960560622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 01:42:02.963856 containerd[1455]: time="2026-03-06T01:42:02.963770827Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 450.291887ms" Mar 6 01:42:02.967360 containerd[1455]: time="2026-03-06T01:42:02.967179719Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 450.708514ms" Mar 6 01:42:02.968744 containerd[1455]: time="2026-03-06T01:42:02.968583731Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 458.736415ms" Mar 6 01:42:03.159223 containerd[1455]: time="2026-03-06T01:42:03.158688004Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:03.159223 containerd[1455]: time="2026-03-06T01:42:03.158753286Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:03.159223 containerd[1455]: time="2026-03-06T01:42:03.158768635Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:03.159223 containerd[1455]: time="2026-03-06T01:42:03.158866538Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:03.199763 containerd[1455]: time="2026-03-06T01:42:03.182358441Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:03.199763 containerd[1455]: time="2026-03-06T01:42:03.182430274Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:03.199763 containerd[1455]: time="2026-03-06T01:42:03.182444321Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:03.199763 containerd[1455]: time="2026-03-06T01:42:03.182540621Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:03.275398 containerd[1455]: time="2026-03-06T01:42:03.274995022Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:03.275398 containerd[1455]: time="2026-03-06T01:42:03.275059182Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:03.275398 containerd[1455]: time="2026-03-06T01:42:03.275070282Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:03.275398 containerd[1455]: time="2026-03-06T01:42:03.275160451Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:03.300705 systemd[1]: Started cri-containerd-d05e50a42d4d000a5991440d02edca5ce5b65c8108e060d8bab198b918d2a2f7.scope - libcontainer container d05e50a42d4d000a5991440d02edca5ce5b65c8108e060d8bab198b918d2a2f7. Mar 6 01:42:03.309476 systemd[1]: Started cri-containerd-1dcc6169ee4150707e73bbd34b605fc0085834589c236dc9aa20ec202305c9cd.scope - libcontainer container 1dcc6169ee4150707e73bbd34b605fc0085834589c236dc9aa20ec202305c9cd. Mar 6 01:42:03.313658 systemd[1]: Started cri-containerd-f482ec2ce1ed4a61a8bea7a38433392cdfdddc8bd443810354651364a1ffce00.scope - libcontainer container f482ec2ce1ed4a61a8bea7a38433392cdfdddc8bd443810354651364a1ffce00. Mar 6 01:42:03.372221 kubelet[2128]: E0306 01:42:03.372094 2128 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.120:6443: connect: connection refused" interval="1.6s" Mar 6 01:42:03.382598 containerd[1455]: time="2026-03-06T01:42:03.382440811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:db0989cdb653dfec284dd4f35625e9e7,Namespace:kube-system,Attempt:0,} returns sandbox id \"d05e50a42d4d000a5991440d02edca5ce5b65c8108e060d8bab198b918d2a2f7\"" Mar 6 01:42:03.385687 kubelet[2128]: E0306 01:42:03.385230 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:03.401755 containerd[1455]: time="2026-03-06T01:42:03.401673382Z" level=info msg="CreateContainer within sandbox \"d05e50a42d4d000a5991440d02edca5ce5b65c8108e060d8bab198b918d2a2f7\" 
for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 6 01:42:03.409021 containerd[1455]: time="2026-03-06T01:42:03.408865856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:89efda49e166906783d8d868d41ebb86,Namespace:kube-system,Attempt:0,} returns sandbox id \"1dcc6169ee4150707e73bbd34b605fc0085834589c236dc9aa20ec202305c9cd\"" Mar 6 01:42:03.411064 containerd[1455]: time="2026-03-06T01:42:03.409484852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:08fa4cad9539997788b2467194cbc971,Namespace:kube-system,Attempt:0,} returns sandbox id \"f482ec2ce1ed4a61a8bea7a38433392cdfdddc8bd443810354651364a1ffce00\"" Mar 6 01:42:03.411715 kubelet[2128]: E0306 01:42:03.411544 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:03.412100 kubelet[2128]: E0306 01:42:03.411861 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:03.420516 containerd[1455]: time="2026-03-06T01:42:03.420475896Z" level=info msg="CreateContainer within sandbox \"1dcc6169ee4150707e73bbd34b605fc0085834589c236dc9aa20ec202305c9cd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 6 01:42:03.424175 containerd[1455]: time="2026-03-06T01:42:03.424078172Z" level=info msg="CreateContainer within sandbox \"f482ec2ce1ed4a61a8bea7a38433392cdfdddc8bd443810354651364a1ffce00\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 6 01:42:03.427162 kubelet[2128]: E0306 01:42:03.427014 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.120:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 6 01:42:03.427162 kubelet[2128]: E0306 01:42:03.427083 2128 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.120:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.120:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 6 01:42:03.456652 containerd[1455]: time="2026-03-06T01:42:03.456480692Z" level=info msg="CreateContainer within sandbox \"d05e50a42d4d000a5991440d02edca5ce5b65c8108e060d8bab198b918d2a2f7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"cc40b5d93044dfbc40573ef99c435f93e7d2fa499f4a2d66e3bb9ff18e191628\""
Mar 6 01:42:03.458487 containerd[1455]: time="2026-03-06T01:42:03.458182777Z" level=info msg="StartContainer for \"cc40b5d93044dfbc40573ef99c435f93e7d2fa499f4a2d66e3bb9ff18e191628\""
Mar 6 01:42:03.460184 containerd[1455]: time="2026-03-06T01:42:03.460093528Z" level=info msg="CreateContainer within sandbox \"1dcc6169ee4150707e73bbd34b605fc0085834589c236dc9aa20ec202305c9cd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d0be748edecf8e3ec9c6d24044d25029bf0cbfc2b7f4bc5dbf0dc56aff14fcd7\""
Mar 6 01:42:03.461025 containerd[1455]: time="2026-03-06T01:42:03.460933886Z" level=info msg="StartContainer for \"d0be748edecf8e3ec9c6d24044d25029bf0cbfc2b7f4bc5dbf0dc56aff14fcd7\""
Mar 6 01:42:03.471622 containerd[1455]: time="2026-03-06T01:42:03.471538034Z" level=info msg="CreateContainer within sandbox \"f482ec2ce1ed4a61a8bea7a38433392cdfdddc8bd443810354651364a1ffce00\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d8cf6f00594218316bf3dcea397f53724bc2f68237bb4e7fd33a4ee14a8d040f\""
Mar 6 01:42:03.472445 containerd[1455]: time="2026-03-06T01:42:03.472403215Z" level=info msg="StartContainer for \"d8cf6f00594218316bf3dcea397f53724bc2f68237bb4e7fd33a4ee14a8d040f\""
Mar 6 01:42:03.512670 systemd[1]: Started cri-containerd-cc40b5d93044dfbc40573ef99c435f93e7d2fa499f4a2d66e3bb9ff18e191628.scope - libcontainer container cc40b5d93044dfbc40573ef99c435f93e7d2fa499f4a2d66e3bb9ff18e191628.
Mar 6 01:42:03.517221 systemd[1]: Started cri-containerd-d0be748edecf8e3ec9c6d24044d25029bf0cbfc2b7f4bc5dbf0dc56aff14fcd7.scope - libcontainer container d0be748edecf8e3ec9c6d24044d25029bf0cbfc2b7f4bc5dbf0dc56aff14fcd7.
Mar 6 01:42:03.527569 systemd[1]: Started cri-containerd-d8cf6f00594218316bf3dcea397f53724bc2f68237bb4e7fd33a4ee14a8d040f.scope - libcontainer container d8cf6f00594218316bf3dcea397f53724bc2f68237bb4e7fd33a4ee14a8d040f.
Mar 6 01:42:03.583961 kubelet[2128]: I0306 01:42:03.583800 2128 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Mar 6 01:42:03.587302 kubelet[2128]: E0306 01:42:03.587014 2128 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.120:6443/api/v1/nodes\": dial tcp 10.0.0.120:6443: connect: connection refused" node="localhost"
Mar 6 01:42:03.606390 containerd[1455]: time="2026-03-06T01:42:03.605994654Z" level=info msg="StartContainer for \"d0be748edecf8e3ec9c6d24044d25029bf0cbfc2b7f4bc5dbf0dc56aff14fcd7\" returns successfully"
Mar 6 01:42:03.606390 containerd[1455]: time="2026-03-06T01:42:03.606096784Z" level=info msg="StartContainer for \"cc40b5d93044dfbc40573ef99c435f93e7d2fa499f4a2d66e3bb9ff18e191628\" returns successfully"
Mar 6 01:42:03.615376 containerd[1455]: time="2026-03-06T01:42:03.615233657Z" level=info msg="StartContainer for \"d8cf6f00594218316bf3dcea397f53724bc2f68237bb4e7fd33a4ee14a8d040f\" returns successfully"
Mar 6 01:42:04.035360 kubelet[2128]: E0306 01:42:04.035160 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Mar 6 01:42:04.036046 kubelet[2128]: E0306 01:42:04.035856 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:04.041970 kubelet[2128]: E0306 01:42:04.041823 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Mar 6 01:42:04.042089 kubelet[2128]: E0306 01:42:04.042063 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:04.045176 kubelet[2128]: E0306 01:42:04.045099 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Mar 6 01:42:04.045458 kubelet[2128]: E0306 01:42:04.045404 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:05.054333 kubelet[2128]: E0306 01:42:05.054184 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Mar 6 01:42:05.054785 kubelet[2128]: E0306 01:42:05.054476 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:05.060645 kubelet[2128]: E0306 01:42:05.060571 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Mar 6 01:42:05.060822 kubelet[2128]: E0306 01:42:05.060748 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:05.061081 kubelet[2128]: E0306 01:42:05.061009 2128 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
Mar 6 01:42:05.061347 kubelet[2128]: E0306 01:42:05.061215 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:05.114116 kubelet[2128]: E0306 01:42:05.114022 2128 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Mar 6 01:42:05.189780 kubelet[2128]: I0306 01:42:05.189736 2128 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Mar 6 01:42:05.302072 kubelet[2128]: I0306 01:42:05.301978 2128 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Mar 6 01:42:05.302072 kubelet[2128]: E0306 01:42:05.302043 2128 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found"
Mar 6 01:42:05.312676 kubelet[2128]: E0306 01:42:05.312549 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:05.413983 kubelet[2128]: E0306 01:42:05.413821 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:05.514348 kubelet[2128]: E0306 01:42:05.514080 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:05.615714 kubelet[2128]: E0306 01:42:05.615414 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:05.717083 kubelet[2128]: E0306 01:42:05.716690 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:05.818205 kubelet[2128]: E0306 01:42:05.817946 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:05.918831 kubelet[2128]: E0306 01:42:05.918150 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.018445 kubelet[2128]: E0306 01:42:06.018340 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.119036 kubelet[2128]: E0306 01:42:06.118987 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.220533 kubelet[2128]: E0306 01:42:06.220212 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.320468 kubelet[2128]: E0306 01:42:06.320409 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.421968 kubelet[2128]: E0306 01:42:06.421236 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.525542 kubelet[2128]: E0306 01:42:06.523923 2128 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:06.666202 kubelet[2128]: I0306 01:42:06.665212 2128 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Mar 6 01:42:06.803385 kubelet[2128]: I0306 01:42:06.802356 2128 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Mar 6 01:42:06.810054 kubelet[2128]: I0306 01:42:06.809972 2128 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:06.942922 kubelet[2128]: I0306 01:42:06.942659 2128 apiserver.go:52] "Watching apiserver"
Mar 6 01:42:06.946513 kubelet[2128]: E0306 01:42:06.945684 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:06.946513 kubelet[2128]: E0306 01:42:06.946073 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:06.946815 kubelet[2128]: E0306 01:42:06.946792 2128 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:06.964375 kubelet[2128]: I0306 01:42:06.964173 2128 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 6 01:42:08.119497 systemd[1]: Reloading requested from client PID 2422 ('systemctl') (unit session-7.scope)...
Mar 6 01:42:08.120196 systemd[1]: Reloading...
Mar 6 01:42:08.321466 zram_generator::config[2467]: No configuration found.
Mar 6 01:42:08.461938 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 6 01:42:08.567553 systemd[1]: Reloading finished in 446 ms.
Mar 6 01:42:08.631154 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:42:08.643511 systemd[1]: kubelet.service: Deactivated successfully.
Mar 6 01:42:08.643827 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:42:08.643955 systemd[1]: kubelet.service: Consumed 1.704s CPU time, 128.1M memory peak, 0B memory swap peak.
Mar 6 01:42:08.659707 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 6 01:42:08.835407 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 6 01:42:08.842742 (kubelet)[2506]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 6 01:42:08.929707 kubelet[2506]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 6 01:42:08.929707 kubelet[2506]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 6 01:42:08.929707 kubelet[2506]: I0306 01:42:08.929734 2506 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 6 01:42:08.941101 kubelet[2506]: I0306 01:42:08.941019 2506 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 6 01:42:08.941101 kubelet[2506]: I0306 01:42:08.941067 2506 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 6 01:42:08.941101 kubelet[2506]: I0306 01:42:08.941106 2506 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 6 01:42:08.941364 kubelet[2506]: I0306 01:42:08.941117 2506 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 6 01:42:08.941512 kubelet[2506]: I0306 01:42:08.941466 2506 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 6 01:42:08.942986 kubelet[2506]: I0306 01:42:08.942868 2506 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 6 01:42:08.947688 kubelet[2506]: I0306 01:42:08.947502 2506 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 01:42:08.951028 kubelet[2506]: E0306 01:42:08.950983 2506 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 6 01:42:08.951028 kubelet[2506]: I0306 01:42:08.951045 2506 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 6 01:42:08.958642 kubelet[2506]: I0306 01:42:08.958589 2506 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 01:42:08.959130 kubelet[2506]: I0306 01:42:08.959037 2506 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 01:42:08.959361 kubelet[2506]: I0306 01:42:08.959098 2506 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 01:42:08.959487 kubelet[2506]: I0306 01:42:08.959385 2506 topology_manager.go:138] "Creating topology manager with none policy"
Mar 6 01:42:08.959487 kubelet[2506]: I0306 01:42:08.959406 2506 container_manager_linux.go:306] "Creating device plugin manager"
Mar 6 01:42:08.959487 kubelet[2506]: I0306 01:42:08.959452 2506 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 01:42:08.959761 kubelet[2506]: I0306 01:42:08.959714 2506 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 01:42:08.960172 kubelet[2506]: I0306 01:42:08.960089 2506 kubelet.go:475] "Attempting to sync node with API server"
Mar 6 01:42:08.961053 kubelet[2506]: I0306 01:42:08.960847 2506 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 01:42:08.961365 kubelet[2506]: I0306 01:42:08.961155 2506 kubelet.go:387] "Adding apiserver pod source"
Mar 6 01:42:08.961648 kubelet[2506]: I0306 01:42:08.961551 2506 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 01:42:08.963738 kubelet[2506]: I0306 01:42:08.963679 2506 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 6 01:42:08.964404 kubelet[2506]: I0306 01:42:08.964331 2506 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 6 01:42:08.964404 kubelet[2506]: I0306 01:42:08.964415 2506 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 6 01:42:08.970489 kubelet[2506]: I0306 01:42:08.970417 2506 server.go:1262] "Started kubelet"
Mar 6 01:42:08.974305 kubelet[2506]: I0306 01:42:08.972539 2506 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 6 01:42:08.974305 kubelet[2506]: I0306 01:42:08.972610 2506 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 6 01:42:08.974305 kubelet[2506]: I0306 01:42:08.972969 2506 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 6 01:42:08.974305 kubelet[2506]: I0306 01:42:08.973661 2506 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 6 01:42:08.975807 kubelet[2506]: I0306 01:42:08.975691 2506 server.go:310] "Adding debug handlers to kubelet server"
Mar 6 01:42:08.980317 kubelet[2506]: I0306 01:42:08.979134 2506 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 6 01:42:08.982678 kubelet[2506]: I0306 01:42:08.981661 2506 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 6 01:42:08.986601 kubelet[2506]: I0306 01:42:08.986480 2506 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 6 01:42:08.986601 kubelet[2506]: E0306 01:42:08.986579 2506 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 6 01:42:08.989685 kubelet[2506]: I0306 01:42:08.989596 2506 factory.go:223] Registration of the systemd container factory successfully
Mar 6 01:42:08.989864 kubelet[2506]: I0306 01:42:08.989751 2506 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 6 01:42:08.991578 kubelet[2506]: I0306 01:42:08.990399 2506 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 6 01:42:08.991578 kubelet[2506]: I0306 01:42:08.990589 2506 reconciler.go:29] "Reconciler: start to sync state"
Mar 6 01:42:08.991578 kubelet[2506]: E0306 01:42:08.990812 2506 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 6 01:42:08.998324 kubelet[2506]: I0306 01:42:08.998199 2506 factory.go:223] Registration of the containerd container factory successfully
Mar 6 01:42:09.021124 kubelet[2506]: I0306 01:42:09.021052 2506 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 6 01:42:09.034706 kubelet[2506]: I0306 01:42:09.034531 2506 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 6 01:42:09.035359 kubelet[2506]: I0306 01:42:09.034692 2506 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 6 01:42:09.037862 kubelet[2506]: I0306 01:42:09.036691 2506 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 6 01:42:09.037862 kubelet[2506]: E0306 01:42:09.037855 2506 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.083688 2506 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084059 2506 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084085 2506 state_mem.go:36] "Initialized new in-memory state store"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084224 2506 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084233 2506 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084340 2506 policy_none.go:49] "None policy: Start"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084351 2506 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084363 2506 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084468 2506 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 6 01:42:09.085795 kubelet[2506]: I0306 01:42:09.084477 2506 policy_none.go:47] "Start"
Mar 6 01:42:09.098567 kubelet[2506]: E0306 01:42:09.098505 2506 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 6 01:42:09.098770 kubelet[2506]: I0306 01:42:09.098697 2506 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 6 01:42:09.098770 kubelet[2506]: I0306 01:42:09.098745 2506 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 6 01:42:09.099497 kubelet[2506]: I0306 01:42:09.099434 2506 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 6 01:42:09.101336 kubelet[2506]: E0306 01:42:09.101236 2506 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 6 01:42:09.140187 kubelet[2506]: I0306 01:42:09.139377 2506 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
Mar 6 01:42:09.140187 kubelet[2506]: I0306 01:42:09.139947 2506 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
Mar 6 01:42:09.142232 kubelet[2506]: I0306 01:42:09.142146 2506 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.152362 kubelet[2506]: E0306 01:42:09.152023 2506 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
Mar 6 01:42:09.152516 kubelet[2506]: E0306 01:42:09.152473 2506 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Mar 6 01:42:09.152635 kubelet[2506]: E0306 01:42:09.152567 2506 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.217060 kubelet[2506]: I0306 01:42:09.215482 2506 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
Mar 6 01:42:09.226179 kubelet[2506]: I0306 01:42:09.226139 2506 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
Mar 6 01:42:09.227128 kubelet[2506]: I0306 01:42:09.226783 2506 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
Mar 6 01:42:09.292118 kubelet[2506]: I0306 01:42:09.291992 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/89efda49e166906783d8d868d41ebb86-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"89efda49e166906783d8d868d41ebb86\") " pod="kube-system/kube-scheduler-localhost"
Mar 6 01:42:09.292118 kubelet[2506]: I0306 01:42:09.292058 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08fa4cad9539997788b2467194cbc971-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"08fa4cad9539997788b2467194cbc971\") " pod="kube-system/kube-apiserver-localhost"
Mar 6 01:42:09.292368 kubelet[2506]: I0306 01:42:09.292079 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08fa4cad9539997788b2467194cbc971-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"08fa4cad9539997788b2467194cbc971\") " pod="kube-system/kube-apiserver-localhost"
Mar 6 01:42:09.292368 kubelet[2506]: I0306 01:42:09.292169 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.292368 kubelet[2506]: I0306 01:42:09.292185 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.292368 kubelet[2506]: I0306 01:42:09.292198 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08fa4cad9539997788b2467194cbc971-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"08fa4cad9539997788b2467194cbc971\") " pod="kube-system/kube-apiserver-localhost"
Mar 6 01:42:09.292368 kubelet[2506]: I0306 01:42:09.292210 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.292480 kubelet[2506]: I0306 01:42:09.292223 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.292480 kubelet[2506]: I0306 01:42:09.292342 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost"
Mar 6 01:42:09.454166 kubelet[2506]: E0306 01:42:09.453932 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:09.454641 kubelet[2506]: E0306 01:42:09.454343 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:09.454641 kubelet[2506]: E0306 01:42:09.454580 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:09.963584 kubelet[2506]: I0306 01:42:09.963406 2506 apiserver.go:52] "Watching apiserver"
Mar 6 01:42:09.991233 kubelet[2506]: I0306 01:42:09.991082 2506 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 6 01:42:10.062922 kubelet[2506]: E0306 01:42:10.062733 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:10.064206 kubelet[2506]: E0306 01:42:10.063741 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:10.064206 kubelet[2506]: E0306 01:42:10.064068 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:10.102471 kubelet[2506]: I0306 01:42:10.102373 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=4.102355099 podStartE2EDuration="4.102355099s" podCreationTimestamp="2026-03-06 01:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 01:42:10.0902182 +0000 UTC m=+1.239793658" watchObservedRunningTime="2026-03-06 01:42:10.102355099 +0000 UTC m=+1.251930577"
Mar 6 01:42:10.102471 kubelet[2506]: I0306 01:42:10.102486 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=4.102479662 podStartE2EDuration="4.102479662s" podCreationTimestamp="2026-03-06 01:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 01:42:10.101934924 +0000 UTC m=+1.251510433" watchObservedRunningTime="2026-03-06 01:42:10.102479662 +0000 UTC m=+1.252055140"
Mar 6 01:42:10.113770 kubelet[2506]: I0306 01:42:10.112868 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=4.112855111 podStartE2EDuration="4.112855111s" podCreationTimestamp="2026-03-06 01:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 01:42:10.112590106 +0000 UTC m=+1.262165574" watchObservedRunningTime="2026-03-06 01:42:10.112855111 +0000 UTC m=+1.262430569"
Mar 6 01:42:11.066002 kubelet[2506]: E0306 01:42:11.065775 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:11.067143 kubelet[2506]: E0306 01:42:11.067070 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:12.553506 kubelet[2506]: I0306 01:42:12.553411 2506 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 6 01:42:12.554145 containerd[1455]: time="2026-03-06T01:42:12.553966084Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 6 01:42:12.554681 kubelet[2506]: I0306 01:42:12.554341 2506 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 6 01:42:13.512355 kubelet[2506]: E0306 01:42:13.512220 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:13.703805 systemd[1]: Created slice kubepods-besteffort-podf9cc6b8a_b9b7_4087_b79d_709672d8d04a.slice - libcontainer container kubepods-besteffort-podf9cc6b8a_b9b7_4087_b79d_709672d8d04a.slice.
Mar 6 01:42:13.730221 kubelet[2506]: I0306 01:42:13.730121 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f9cc6b8a-b9b7-4087-b79d-709672d8d04a-kube-proxy\") pod \"kube-proxy-wgft8\" (UID: \"f9cc6b8a-b9b7-4087-b79d-709672d8d04a\") " pod="kube-system/kube-proxy-wgft8"
Mar 6 01:42:13.730927 kubelet[2506]: I0306 01:42:13.730563 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f9cc6b8a-b9b7-4087-b79d-709672d8d04a-xtables-lock\") pod \"kube-proxy-wgft8\" (UID: \"f9cc6b8a-b9b7-4087-b79d-709672d8d04a\") " pod="kube-system/kube-proxy-wgft8"
Mar 6 01:42:13.730927 kubelet[2506]: I0306 01:42:13.730586 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbm5q\" (UniqueName: \"kubernetes.io/projected/f9cc6b8a-b9b7-4087-b79d-709672d8d04a-kube-api-access-qbm5q\") pod \"kube-proxy-wgft8\" (UID: \"f9cc6b8a-b9b7-4087-b79d-709672d8d04a\") " pod="kube-system/kube-proxy-wgft8"
Mar 6 01:42:13.730927 kubelet[2506]: I0306 01:42:13.730718 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9cc6b8a-b9b7-4087-b79d-709672d8d04a-lib-modules\") pod \"kube-proxy-wgft8\" (UID: \"f9cc6b8a-b9b7-4087-b79d-709672d8d04a\") " pod="kube-system/kube-proxy-wgft8"
Mar 6 01:42:13.748209 systemd[1]: Created slice kubepods-besteffort-poda05a2b9e_b832_4193_9a73_b115f81d3d8c.slice - libcontainer container kubepods-besteffort-poda05a2b9e_b832_4193_9a73_b115f81d3d8c.slice.
Mar 6 01:42:13.831450 kubelet[2506]: I0306 01:42:13.831177 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a05a2b9e-b832-4193-9a73-b115f81d3d8c-var-lib-calico\") pod \"tigera-operator-5588576f44-bd6bf\" (UID: \"a05a2b9e-b832-4193-9a73-b115f81d3d8c\") " pod="tigera-operator/tigera-operator-5588576f44-bd6bf"
Mar 6 01:42:13.831450 kubelet[2506]: I0306 01:42:13.831216 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk8d6\" (UniqueName: \"kubernetes.io/projected/a05a2b9e-b832-4193-9a73-b115f81d3d8c-kube-api-access-xk8d6\") pod \"tigera-operator-5588576f44-bd6bf\" (UID: \"a05a2b9e-b832-4193-9a73-b115f81d3d8c\") " pod="tigera-operator/tigera-operator-5588576f44-bd6bf"
Mar 6 01:42:14.019041 kubelet[2506]: E0306 01:42:14.018900 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:14.019780 containerd[1455]: time="2026-03-06T01:42:14.019634287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wgft8,Uid:f9cc6b8a-b9b7-4087-b79d-709672d8d04a,Namespace:kube-system,Attempt:0,}"
Mar 6 01:42:14.063770 containerd[1455]: time="2026-03-06T01:42:14.063213721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-bd6bf,Uid:a05a2b9e-b832-4193-9a73-b115f81d3d8c,Namespace:tigera-operator,Attempt:0,}"
Mar 6 01:42:14.065191 containerd[1455]: time="2026-03-06T01:42:14.064794230Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 6 01:42:14.065661 containerd[1455]: time="2026-03-06T01:42:14.065373384Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 6 01:42:14.065661 containerd[1455]: time="2026-03-06T01:42:14.065443504Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 6 01:42:14.065661 containerd[1455]: time="2026-03-06T01:42:14.065594675Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 6 01:42:14.077487 kubelet[2506]: E0306 01:42:14.077153 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:14.124591 systemd[1]: Started cri-containerd-6c320057f35b39e7cd4f069bacbe12b4e0fe575153f7f22ce8325978d0a36b96.scope - libcontainer container 6c320057f35b39e7cd4f069bacbe12b4e0fe575153f7f22ce8325978d0a36b96.
Mar 6 01:42:14.146825 containerd[1455]: time="2026-03-06T01:42:14.146582960Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 6 01:42:14.146825 containerd[1455]: time="2026-03-06T01:42:14.146632912Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 6 01:42:14.146825 containerd[1455]: time="2026-03-06T01:42:14.146642810Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 6 01:42:14.146825 containerd[1455]: time="2026-03-06T01:42:14.146741143Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 6 01:42:14.176608 systemd[1]: Started cri-containerd-f0ab33ca6d0c833e2fb21acfa14ec8e392211c62accf7f6c2efaba1ba6bb8bf1.scope - libcontainer container f0ab33ca6d0c833e2fb21acfa14ec8e392211c62accf7f6c2efaba1ba6bb8bf1.
Mar 6 01:42:14.182333 containerd[1455]: time="2026-03-06T01:42:14.182189303Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wgft8,Uid:f9cc6b8a-b9b7-4087-b79d-709672d8d04a,Namespace:kube-system,Attempt:0,} returns sandbox id \"6c320057f35b39e7cd4f069bacbe12b4e0fe575153f7f22ce8325978d0a36b96\"" Mar 6 01:42:14.184968 kubelet[2506]: E0306 01:42:14.184752 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:14.199626 containerd[1455]: time="2026-03-06T01:42:14.199380383Z" level=info msg="CreateContainer within sandbox \"6c320057f35b39e7cd4f069bacbe12b4e0fe575153f7f22ce8325978d0a36b96\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 6 01:42:14.228153 containerd[1455]: time="2026-03-06T01:42:14.228051951Z" level=info msg="CreateContainer within sandbox \"6c320057f35b39e7cd4f069bacbe12b4e0fe575153f7f22ce8325978d0a36b96\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f6c74e56b6297a2375266958f2d5d6d63e7295843b5cf813cbf32de808caf553\"" Mar 6 01:42:14.229535 containerd[1455]: time="2026-03-06T01:42:14.229362812Z" level=info msg="StartContainer for \"f6c74e56b6297a2375266958f2d5d6d63e7295843b5cf813cbf32de808caf553\"" Mar 6 01:42:14.264716 containerd[1455]: time="2026-03-06T01:42:14.264515148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-bd6bf,Uid:a05a2b9e-b832-4193-9a73-b115f81d3d8c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f0ab33ca6d0c833e2fb21acfa14ec8e392211c62accf7f6c2efaba1ba6bb8bf1\"" Mar 6 01:42:14.269840 containerd[1455]: time="2026-03-06T01:42:14.269737252Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 6 01:42:14.290573 systemd[1]: Started cri-containerd-f6c74e56b6297a2375266958f2d5d6d63e7295843b5cf813cbf32de808caf553.scope - libcontainer container 
f6c74e56b6297a2375266958f2d5d6d63e7295843b5cf813cbf32de808caf553. Mar 6 01:42:14.349665 containerd[1455]: time="2026-03-06T01:42:14.349437186Z" level=info msg="StartContainer for \"f6c74e56b6297a2375266958f2d5d6d63e7295843b5cf813cbf32de808caf553\" returns successfully" Mar 6 01:42:15.082373 kubelet[2506]: E0306 01:42:15.082198 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:15.082373 kubelet[2506]: E0306 01:42:15.082474 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:15.095494 kubelet[2506]: I0306 01:42:15.095363 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-wgft8" podStartSLOduration=2.095347765 podStartE2EDuration="2.095347765s" podCreationTimestamp="2026-03-06 01:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 01:42:15.09511343 +0000 UTC m=+6.244688898" watchObservedRunningTime="2026-03-06 01:42:15.095347765 +0000 UTC m=+6.244923223" Mar 6 01:42:15.180932 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2729179730.mount: Deactivated successfully. 
Mar 6 01:42:16.836443 containerd[1455]: time="2026-03-06T01:42:16.836361636Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:16.837640 containerd[1455]: time="2026-03-06T01:42:16.837539096Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 6 01:42:16.838714 containerd[1455]: time="2026-03-06T01:42:16.838620327Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:16.841950 containerd[1455]: time="2026-03-06T01:42:16.841871001Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:16.842644 containerd[1455]: time="2026-03-06T01:42:16.842577586Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.572748284s" Mar 6 01:42:16.842644 containerd[1455]: time="2026-03-06T01:42:16.842638910Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 6 01:42:16.849119 containerd[1455]: time="2026-03-06T01:42:16.849029492Z" level=info msg="CreateContainer within sandbox \"f0ab33ca6d0c833e2fb21acfa14ec8e392211c62accf7f6c2efaba1ba6bb8bf1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 6 01:42:16.863180 containerd[1455]: time="2026-03-06T01:42:16.863095415Z" level=info msg="CreateContainer within sandbox 
\"f0ab33ca6d0c833e2fb21acfa14ec8e392211c62accf7f6c2efaba1ba6bb8bf1\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9024ac6c495eb90762c1e48377d8def249f3df1f8a5f37bb25878efa53e8280b\"" Mar 6 01:42:16.864605 containerd[1455]: time="2026-03-06T01:42:16.863687739Z" level=info msg="StartContainer for \"9024ac6c495eb90762c1e48377d8def249f3df1f8a5f37bb25878efa53e8280b\"" Mar 6 01:42:16.921778 systemd[1]: Started cri-containerd-9024ac6c495eb90762c1e48377d8def249f3df1f8a5f37bb25878efa53e8280b.scope - libcontainer container 9024ac6c495eb90762c1e48377d8def249f3df1f8a5f37bb25878efa53e8280b. Mar 6 01:42:16.965491 containerd[1455]: time="2026-03-06T01:42:16.965422091Z" level=info msg="StartContainer for \"9024ac6c495eb90762c1e48377d8def249f3df1f8a5f37bb25878efa53e8280b\" returns successfully" Mar 6 01:42:17.393102 kubelet[2506]: E0306 01:42:17.392657 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:17.411920 kubelet[2506]: I0306 01:42:17.411344 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-bd6bf" podStartSLOduration=1.83554519 podStartE2EDuration="4.411231298s" podCreationTimestamp="2026-03-06 01:42:13 +0000 UTC" firstStartedPulling="2026-03-06 01:42:14.268738067 +0000 UTC m=+5.418313525" lastFinishedPulling="2026-03-06 01:42:16.844424165 +0000 UTC m=+7.993999633" observedRunningTime="2026-03-06 01:42:17.103104948 +0000 UTC m=+8.252680406" watchObservedRunningTime="2026-03-06 01:42:17.411231298 +0000 UTC m=+8.560806756" Mar 6 01:42:18.093135 kubelet[2506]: E0306 01:42:18.093028 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:18.217331 kubelet[2506]: E0306 01:42:18.215955 2506 dns.go:154] 
"Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:19.101327 kubelet[2506]: E0306 01:42:19.100135 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:19.101327 kubelet[2506]: E0306 01:42:19.100996 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:23.170031 sudo[1634]: pam_unix(sudo:session): session closed for user root Mar 6 01:42:23.187873 sshd[1631]: pam_unix(sshd:session): session closed for user core Mar 6 01:42:23.212546 systemd[1]: sshd@6-10.0.0.120:22-10.0.0.1:48520.service: Deactivated successfully. Mar 6 01:42:23.216643 systemd[1]: session-7.scope: Deactivated successfully. Mar 6 01:42:23.216983 systemd[1]: session-7.scope: Consumed 7.293s CPU time, 161.6M memory peak, 0B memory swap peak. Mar 6 01:42:23.220033 systemd-logind[1440]: Session 7 logged out. Waiting for processes to exit. Mar 6 01:42:23.223958 systemd-logind[1440]: Removed session 7. Mar 6 01:42:24.258468 update_engine[1441]: I20260306 01:42:24.258350 1441 update_attempter.cc:509] Updating boot flags... Mar 6 01:42:24.339683 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 35 scanned by (udev-worker) (2926) Mar 6 01:42:24.445409 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 35 scanned by (udev-worker) (2928) Mar 6 01:42:25.688201 systemd[1]: Created slice kubepods-besteffort-pod8ec3050f_ae9d_41e7_9108_1da419135623.slice - libcontainer container kubepods-besteffort-pod8ec3050f_ae9d_41e7_9108_1da419135623.slice. 
Mar 6 01:42:25.848061 kubelet[2506]: I0306 01:42:25.847736 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ec3050f-ae9d-41e7-9108-1da419135623-tigera-ca-bundle\") pod \"calico-typha-797898f7f4-r44lp\" (UID: \"8ec3050f-ae9d-41e7-9108-1da419135623\") " pod="calico-system/calico-typha-797898f7f4-r44lp" Mar 6 01:42:25.848061 kubelet[2506]: I0306 01:42:25.847777 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9v8l\" (UniqueName: \"kubernetes.io/projected/8ec3050f-ae9d-41e7-9108-1da419135623-kube-api-access-l9v8l\") pod \"calico-typha-797898f7f4-r44lp\" (UID: \"8ec3050f-ae9d-41e7-9108-1da419135623\") " pod="calico-system/calico-typha-797898f7f4-r44lp" Mar 6 01:42:25.848061 kubelet[2506]: I0306 01:42:25.847986 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8ec3050f-ae9d-41e7-9108-1da419135623-typha-certs\") pod \"calico-typha-797898f7f4-r44lp\" (UID: \"8ec3050f-ae9d-41e7-9108-1da419135623\") " pod="calico-system/calico-typha-797898f7f4-r44lp" Mar 6 01:42:25.893237 systemd[1]: Created slice kubepods-besteffort-podaab3a3e6_0e4b_459e_ab06_64efa7de5971.slice - libcontainer container kubepods-besteffort-podaab3a3e6_0e4b_459e_ab06_64efa7de5971.slice. 
Mar 6 01:42:25.907527 kubelet[2506]: E0306 01:42:25.907008 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:26.002909 kubelet[2506]: E0306 01:42:26.002589 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:26.004359 containerd[1455]: time="2026-03-06T01:42:26.003939612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797898f7f4-r44lp,Uid:8ec3050f-ae9d-41e7-9108-1da419135623,Namespace:calico-system,Attempt:0,}" Mar 6 01:42:26.052333 kubelet[2506]: I0306 01:42:26.051423 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-cni-bin-dir\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052333 kubelet[2506]: I0306 01:42:26.051472 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-flexvol-driver-host\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052333 kubelet[2506]: I0306 01:42:26.051496 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aab3a3e6-0e4b-459e-ab06-64efa7de5971-tigera-ca-bundle\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " 
pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052333 kubelet[2506]: I0306 01:42:26.051524 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7e93c3d-576a-474e-b310-bc124fa176c8-kubelet-dir\") pod \"csi-node-driver-p9qmk\" (UID: \"f7e93c3d-576a-474e-b310-bc124fa176c8\") " pod="calico-system/csi-node-driver-p9qmk" Mar 6 01:42:26.052333 kubelet[2506]: I0306 01:42:26.051549 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-policysync\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052537 kubelet[2506]: I0306 01:42:26.051569 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7e93c3d-576a-474e-b310-bc124fa176c8-registration-dir\") pod \"csi-node-driver-p9qmk\" (UID: \"f7e93c3d-576a-474e-b310-bc124fa176c8\") " pod="calico-system/csi-node-driver-p9qmk" Mar 6 01:42:26.052537 kubelet[2506]: I0306 01:42:26.051594 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f7e93c3d-576a-474e-b310-bc124fa176c8-varrun\") pod \"csi-node-driver-p9qmk\" (UID: \"f7e93c3d-576a-474e-b310-bc124fa176c8\") " pod="calico-system/csi-node-driver-p9qmk" Mar 6 01:42:26.052537 kubelet[2506]: I0306 01:42:26.051617 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aab3a3e6-0e4b-459e-ab06-64efa7de5971-node-certs\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052537 kubelet[2506]: I0306 
01:42:26.051654 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-xtables-lock\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052537 kubelet[2506]: I0306 01:42:26.051675 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-cni-log-dir\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052652 kubelet[2506]: I0306 01:42:26.051696 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7e93c3d-576a-474e-b310-bc124fa176c8-socket-dir\") pod \"csi-node-driver-p9qmk\" (UID: \"f7e93c3d-576a-474e-b310-bc124fa176c8\") " pod="calico-system/csi-node-driver-p9qmk" Mar 6 01:42:26.052652 kubelet[2506]: I0306 01:42:26.051719 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-cni-net-dir\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052652 kubelet[2506]: I0306 01:42:26.051743 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-var-lib-calico\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052652 kubelet[2506]: I0306 01:42:26.051764 2506 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-var-run-calico\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052652 kubelet[2506]: I0306 01:42:26.051851 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-sys-fs\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052959 kubelet[2506]: I0306 01:42:26.051878 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-lib-modules\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052959 kubelet[2506]: I0306 01:42:26.051920 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsctd\" (UniqueName: \"kubernetes.io/projected/aab3a3e6-0e4b-459e-ab06-64efa7de5971-kube-api-access-bsctd\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052959 kubelet[2506]: I0306 01:42:26.051942 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9tkr\" (UniqueName: \"kubernetes.io/projected/f7e93c3d-576a-474e-b310-bc124fa176c8-kube-api-access-w9tkr\") pod \"csi-node-driver-p9qmk\" (UID: \"f7e93c3d-576a-474e-b310-bc124fa176c8\") " pod="calico-system/csi-node-driver-p9qmk" Mar 6 01:42:26.052959 kubelet[2506]: I0306 01:42:26.051962 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-bpffs\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.052959 kubelet[2506]: I0306 01:42:26.051982 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/aab3a3e6-0e4b-459e-ab06-64efa7de5971-nodeproc\") pod \"calico-node-dgbbq\" (UID: \"aab3a3e6-0e4b-459e-ab06-64efa7de5971\") " pod="calico-system/calico-node-dgbbq" Mar 6 01:42:26.066335 containerd[1455]: time="2026-03-06T01:42:26.065952815Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:26.066408 containerd[1455]: time="2026-03-06T01:42:26.066350577Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:26.066408 containerd[1455]: time="2026-03-06T01:42:26.066377026Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:26.066766 containerd[1455]: time="2026-03-06T01:42:26.066537705Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:26.121982 systemd[1]: Started cri-containerd-f242b32e3f071f5162f8a2dfe25206a0caad74f09002269e5f62cbaad714b1ff.scope - libcontainer container f242b32e3f071f5162f8a2dfe25206a0caad74f09002269e5f62cbaad714b1ff. 
Mar 6 01:42:26.157567 kubelet[2506]: E0306 01:42:26.157480 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.157567 kubelet[2506]: W0306 01:42:26.157555 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.157717 kubelet[2506]: E0306 01:42:26.157589 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.158850 kubelet[2506]: E0306 01:42:26.158684 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.158850 kubelet[2506]: W0306 01:42:26.158751 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.158850 kubelet[2506]: E0306 01:42:26.158831 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.160709 kubelet[2506]: E0306 01:42:26.160538 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.160979 kubelet[2506]: W0306 01:42:26.160709 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.160979 kubelet[2506]: E0306 01:42:26.160732 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.162597 kubelet[2506]: E0306 01:42:26.162442 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.162689 kubelet[2506]: W0306 01:42:26.162626 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.163097 kubelet[2506]: E0306 01:42:26.162935 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.163971 kubelet[2506]: E0306 01:42:26.163876 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.163971 kubelet[2506]: W0306 01:42:26.163898 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.163971 kubelet[2506]: E0306 01:42:26.163919 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.164546 kubelet[2506]: E0306 01:42:26.164490 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.164597 kubelet[2506]: W0306 01:42:26.164553 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.164597 kubelet[2506]: E0306 01:42:26.164574 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.165194 kubelet[2506]: E0306 01:42:26.165129 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.165194 kubelet[2506]: W0306 01:42:26.165190 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.165382 kubelet[2506]: E0306 01:42:26.165206 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.166019 kubelet[2506]: E0306 01:42:26.165950 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.166019 kubelet[2506]: W0306 01:42:26.166009 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.166119 kubelet[2506]: E0306 01:42:26.166024 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.166620 kubelet[2506]: E0306 01:42:26.166569 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.166659 kubelet[2506]: W0306 01:42:26.166621 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.166659 kubelet[2506]: E0306 01:42:26.166636 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.167533 kubelet[2506]: E0306 01:42:26.167340 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.167533 kubelet[2506]: W0306 01:42:26.167394 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.167533 kubelet[2506]: E0306 01:42:26.167408 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.169591 kubelet[2506]: E0306 01:42:26.169508 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.169591 kubelet[2506]: W0306 01:42:26.169569 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.169591 kubelet[2506]: E0306 01:42:26.169584 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.169975 kubelet[2506]: E0306 01:42:26.169959 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.170211 kubelet[2506]: W0306 01:42:26.170031 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.170211 kubelet[2506]: E0306 01:42:26.170052 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.170989 kubelet[2506]: E0306 01:42:26.170751 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.170989 kubelet[2506]: W0306 01:42:26.170765 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.170989 kubelet[2506]: E0306 01:42:26.170775 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.171593 kubelet[2506]: E0306 01:42:26.171505 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.171593 kubelet[2506]: W0306 01:42:26.171567 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.171593 kubelet[2506]: E0306 01:42:26.171579 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.172133 kubelet[2506]: E0306 01:42:26.172088 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.172133 kubelet[2506]: W0306 01:42:26.172130 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.172463 kubelet[2506]: E0306 01:42:26.172142 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.172898 kubelet[2506]: E0306 01:42:26.172691 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.172898 kubelet[2506]: W0306 01:42:26.172707 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.172898 kubelet[2506]: E0306 01:42:26.172720 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.176084 kubelet[2506]: E0306 01:42:26.175348 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.176084 kubelet[2506]: W0306 01:42:26.175366 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.176084 kubelet[2506]: E0306 01:42:26.175380 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.177039 kubelet[2506]: E0306 01:42:26.176778 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.177039 kubelet[2506]: W0306 01:42:26.176882 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.177039 kubelet[2506]: E0306 01:42:26.176894 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.178240 kubelet[2506]: E0306 01:42:26.178215 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.178240 kubelet[2506]: W0306 01:42:26.178230 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.178240 kubelet[2506]: E0306 01:42:26.178239 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.178861 kubelet[2506]: E0306 01:42:26.178840 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.179069 kubelet[2506]: W0306 01:42:26.178933 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.179069 kubelet[2506]: E0306 01:42:26.178954 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.179974 kubelet[2506]: E0306 01:42:26.179881 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.179974 kubelet[2506]: W0306 01:42:26.179897 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.179974 kubelet[2506]: E0306 01:42:26.179910 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.180861 kubelet[2506]: E0306 01:42:26.180846 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.181111 kubelet[2506]: W0306 01:42:26.180920 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.181111 kubelet[2506]: E0306 01:42:26.180934 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.181483 kubelet[2506]: E0306 01:42:26.181467 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.181568 kubelet[2506]: W0306 01:42:26.181550 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.181637 kubelet[2506]: E0306 01:42:26.181623 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.182526 kubelet[2506]: E0306 01:42:26.182511 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.182778 kubelet[2506]: W0306 01:42:26.182598 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.182778 kubelet[2506]: E0306 01:42:26.182616 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.183162 kubelet[2506]: E0306 01:42:26.183145 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.183236 kubelet[2506]: W0306 01:42:26.183223 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.183879 kubelet[2506]: E0306 01:42:26.183479 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.184469 kubelet[2506]: E0306 01:42:26.184215 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.184469 kubelet[2506]: W0306 01:42:26.184231 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.184469 kubelet[2506]: E0306 01:42:26.184357 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.184897 kubelet[2506]: E0306 01:42:26.184876 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.185342 kubelet[2506]: W0306 01:42:26.184968 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.185342 kubelet[2506]: E0306 01:42:26.184987 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.185749 kubelet[2506]: E0306 01:42:26.185732 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.185984 kubelet[2506]: W0306 01:42:26.185894 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.185984 kubelet[2506]: E0306 01:42:26.185915 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.186599 kubelet[2506]: E0306 01:42:26.186503 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.186599 kubelet[2506]: W0306 01:42:26.186519 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.186599 kubelet[2506]: E0306 01:42:26.186532 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.187506 kubelet[2506]: E0306 01:42:26.187231 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.187506 kubelet[2506]: W0306 01:42:26.187383 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.187506 kubelet[2506]: E0306 01:42:26.187398 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.188515 kubelet[2506]: E0306 01:42:26.188495 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.188604 kubelet[2506]: W0306 01:42:26.188588 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.188683 kubelet[2506]: E0306 01:42:26.188666 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.189755 kubelet[2506]: E0306 01:42:26.189738 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.189911 kubelet[2506]: W0306 01:42:26.189896 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.189974 kubelet[2506]: E0306 01:42:26.189957 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.190916 kubelet[2506]: E0306 01:42:26.190897 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.191030 kubelet[2506]: W0306 01:42:26.190992 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.191030 kubelet[2506]: E0306 01:42:26.191012 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.192111 kubelet[2506]: E0306 01:42:26.191854 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.192111 kubelet[2506]: W0306 01:42:26.191873 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.192111 kubelet[2506]: E0306 01:42:26.191887 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.192681 kubelet[2506]: E0306 01:42:26.192577 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.192681 kubelet[2506]: W0306 01:42:26.192661 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.192681 kubelet[2506]: E0306 01:42:26.192675 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.194030 kubelet[2506]: E0306 01:42:26.193046 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.194030 kubelet[2506]: W0306 01:42:26.193058 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.194030 kubelet[2506]: E0306 01:42:26.193068 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.194030 kubelet[2506]: E0306 01:42:26.193659 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.194030 kubelet[2506]: W0306 01:42:26.193670 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.194030 kubelet[2506]: E0306 01:42:26.193680 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.194489 kubelet[2506]: E0306 01:42:26.194216 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.194489 kubelet[2506]: W0306 01:42:26.194227 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.194489 kubelet[2506]: E0306 01:42:26.194236 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.195055 kubelet[2506]: E0306 01:42:26.194967 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.195055 kubelet[2506]: W0306 01:42:26.195021 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.195055 kubelet[2506]: E0306 01:42:26.195032 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.195721 kubelet[2506]: E0306 01:42:26.195660 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.195721 kubelet[2506]: W0306 01:42:26.195702 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.195721 kubelet[2506]: E0306 01:42:26.195712 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.196481 kubelet[2506]: E0306 01:42:26.196236 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.196481 kubelet[2506]: W0306 01:42:26.196336 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.196481 kubelet[2506]: E0306 01:42:26.196347 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.196928 kubelet[2506]: E0306 01:42:26.196753 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.196928 kubelet[2506]: W0306 01:42:26.196763 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.196928 kubelet[2506]: E0306 01:42:26.196772 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.199546 kubelet[2506]: E0306 01:42:26.197604 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.199546 kubelet[2506]: W0306 01:42:26.197725 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.199546 kubelet[2506]: E0306 01:42:26.197737 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.200187 kubelet[2506]: E0306 01:42:26.199900 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.200187 kubelet[2506]: W0306 01:42:26.199970 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.200187 kubelet[2506]: E0306 01:42:26.199999 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.200979 kubelet[2506]: E0306 01:42:26.200699 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.200979 kubelet[2506]: W0306 01:42:26.200767 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.200979 kubelet[2506]: E0306 01:42:26.200847 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.202205 kubelet[2506]: E0306 01:42:26.201574 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.202205 kubelet[2506]: W0306 01:42:26.201631 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.202205 kubelet[2506]: E0306 01:42:26.201650 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.203072 kubelet[2506]: E0306 01:42:26.202947 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.203072 kubelet[2506]: W0306 01:42:26.203006 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.203072 kubelet[2506]: E0306 01:42:26.203022 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.203909 kubelet[2506]: E0306 01:42:26.203662 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.203909 kubelet[2506]: W0306 01:42:26.203716 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.203909 kubelet[2506]: E0306 01:42:26.203732 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.204473 kubelet[2506]: E0306 01:42:26.204217 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.204473 kubelet[2506]: W0306 01:42:26.204375 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.204473 kubelet[2506]: E0306 01:42:26.204392 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.205110 kubelet[2506]: E0306 01:42:26.204773 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.205110 kubelet[2506]: W0306 01:42:26.204895 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.205110 kubelet[2506]: E0306 01:42:26.204912 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.206011 kubelet[2506]: E0306 01:42:26.205992 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.206094 kubelet[2506]: W0306 01:42:26.206077 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.206437 kubelet[2506]: E0306 01:42:26.206239 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.208061 kubelet[2506]: E0306 01:42:26.207456 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.208061 kubelet[2506]: W0306 01:42:26.207476 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.208061 kubelet[2506]: E0306 01:42:26.207490 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.208746 kubelet[2506]: E0306 01:42:26.208656 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.208746 kubelet[2506]: W0306 01:42:26.208723 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.208746 kubelet[2506]: E0306 01:42:26.208739 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.210403 kubelet[2506]: E0306 01:42:26.210179 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.210403 kubelet[2506]: W0306 01:42:26.210196 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.210403 kubelet[2506]: E0306 01:42:26.210210 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.212118 containerd[1455]: time="2026-03-06T01:42:26.212008153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797898f7f4-r44lp,Uid:8ec3050f-ae9d-41e7-9108-1da419135623,Namespace:calico-system,Attempt:0,} returns sandbox id \"f242b32e3f071f5162f8a2dfe25206a0caad74f09002269e5f62cbaad714b1ff\"" Mar 6 01:42:26.213428 kubelet[2506]: E0306 01:42:26.212003 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.213428 kubelet[2506]: W0306 01:42:26.212095 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.213428 kubelet[2506]: E0306 01:42:26.212110 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.213428 kubelet[2506]: E0306 01:42:26.213007 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:26.215440 kubelet[2506]: E0306 01:42:26.213915 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.215440 kubelet[2506]: W0306 01:42:26.214069 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.215440 kubelet[2506]: E0306 01:42:26.214438 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.215969 containerd[1455]: time="2026-03-06T01:42:26.215874823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 6 01:42:26.216738 kubelet[2506]: E0306 01:42:26.216536 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.216960 kubelet[2506]: W0306 01:42:26.216746 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.216960 kubelet[2506]: E0306 01:42:26.216763 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.219738 kubelet[2506]: E0306 01:42:26.219532 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.219738 kubelet[2506]: W0306 01:42:26.219707 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.219738 kubelet[2506]: E0306 01:42:26.219723 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.234377 kubelet[2506]: E0306 01:42:26.232531 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.234377 kubelet[2506]: W0306 01:42:26.232547 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.234377 kubelet[2506]: E0306 01:42:26.232558 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:26.236651 kubelet[2506]: E0306 01:42:26.236570 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.236651 kubelet[2506]: W0306 01:42:26.236641 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.236713 kubelet[2506]: E0306 01:42:26.236668 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.239601 kubelet[2506]: E0306 01:42:26.239546 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:26.239642 kubelet[2506]: W0306 01:42:26.239606 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:26.239642 kubelet[2506]: E0306 01:42:26.239627 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:26.504873 containerd[1455]: time="2026-03-06T01:42:26.504514379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dgbbq,Uid:aab3a3e6-0e4b-459e-ab06-64efa7de5971,Namespace:calico-system,Attempt:0,}" Mar 6 01:42:26.575880 containerd[1455]: time="2026-03-06T01:42:26.575584623Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:26.576431 containerd[1455]: time="2026-03-06T01:42:26.575740564Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:26.576431 containerd[1455]: time="2026-03-06T01:42:26.576187156Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:26.577173 containerd[1455]: time="2026-03-06T01:42:26.577056825Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:26.618608 systemd[1]: Started cri-containerd-eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848.scope - libcontainer container eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848. Mar 6 01:42:26.676454 containerd[1455]: time="2026-03-06T01:42:26.676414430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dgbbq,Uid:aab3a3e6-0e4b-459e-ab06-64efa7de5971,Namespace:calico-system,Attempt:0,} returns sandbox id \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\"" Mar 6 01:42:27.039122 kubelet[2506]: E0306 01:42:27.038881 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:28.905991 containerd[1455]: time="2026-03-06T01:42:28.905902467Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:28.908078 containerd[1455]: time="2026-03-06T01:42:28.907676533Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 6 01:42:28.909351 containerd[1455]: time="2026-03-06T01:42:28.909161080Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:28.912503 containerd[1455]: time="2026-03-06T01:42:28.912341496Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:28.913720 containerd[1455]: time="2026-03-06T01:42:28.913569312Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 2.697617285s" Mar 6 01:42:28.913720 containerd[1455]: time="2026-03-06T01:42:28.913697961Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 6 01:42:28.915366 containerd[1455]: time="2026-03-06T01:42:28.915054628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 6 01:42:28.942570 containerd[1455]: time="2026-03-06T01:42:28.942370114Z" level=info msg="CreateContainer within sandbox \"f242b32e3f071f5162f8a2dfe25206a0caad74f09002269e5f62cbaad714b1ff\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 6 01:42:28.965587 containerd[1455]: time="2026-03-06T01:42:28.965500838Z" level=info msg="CreateContainer within sandbox \"f242b32e3f071f5162f8a2dfe25206a0caad74f09002269e5f62cbaad714b1ff\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b806643bf76211bdfdae44730ca7a2297302ada0686a1c580df420b253c536f3\"" Mar 6 01:42:28.967660 containerd[1455]: time="2026-03-06T01:42:28.966527628Z" level=info msg="StartContainer for \"b806643bf76211bdfdae44730ca7a2297302ada0686a1c580df420b253c536f3\"" Mar 6 01:42:29.032538 
systemd[1]: Started cri-containerd-b806643bf76211bdfdae44730ca7a2297302ada0686a1c580df420b253c536f3.scope - libcontainer container b806643bf76211bdfdae44730ca7a2297302ada0686a1c580df420b253c536f3. Mar 6 01:42:29.038636 kubelet[2506]: E0306 01:42:29.038490 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:29.114719 containerd[1455]: time="2026-03-06T01:42:29.114349169Z" level=info msg="StartContainer for \"b806643bf76211bdfdae44730ca7a2297302ada0686a1c580df420b253c536f3\" returns successfully" Mar 6 01:42:29.181066 kubelet[2506]: E0306 01:42:29.180515 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:29.190814 kubelet[2506]: E0306 01:42:29.189890 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.190814 kubelet[2506]: W0306 01:42:29.189915 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.190814 kubelet[2506]: E0306 01:42:29.189933 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.190814 kubelet[2506]: E0306 01:42:29.190451 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.190814 kubelet[2506]: W0306 01:42:29.190462 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.190814 kubelet[2506]: E0306 01:42:29.190471 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.191141 kubelet[2506]: E0306 01:42:29.191006 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.191562 kubelet[2506]: W0306 01:42:29.191479 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.191562 kubelet[2506]: E0306 01:42:29.191499 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.192479 kubelet[2506]: E0306 01:42:29.192450 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.192620 kubelet[2506]: W0306 01:42:29.192474 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.192620 kubelet[2506]: E0306 01:42:29.192506 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.194355 kubelet[2506]: E0306 01:42:29.194081 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.194355 kubelet[2506]: W0306 01:42:29.194095 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.194355 kubelet[2506]: E0306 01:42:29.194221 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.196187 kubelet[2506]: E0306 01:42:29.196093 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.196187 kubelet[2506]: W0306 01:42:29.196176 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.196437 kubelet[2506]: E0306 01:42:29.196193 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.199324 kubelet[2506]: E0306 01:42:29.197151 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.199324 kubelet[2506]: W0306 01:42:29.197173 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.199324 kubelet[2506]: E0306 01:42:29.197188 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.199560 kubelet[2506]: E0306 01:42:29.199459 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.199560 kubelet[2506]: W0306 01:42:29.199530 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.199560 kubelet[2506]: E0306 01:42:29.199546 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.200547 kubelet[2506]: E0306 01:42:29.200377 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.200547 kubelet[2506]: W0306 01:42:29.200435 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.200547 kubelet[2506]: E0306 01:42:29.200454 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.201104 kubelet[2506]: I0306 01:42:29.201007 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-797898f7f4-r44lp" podStartSLOduration=1.500356471 podStartE2EDuration="4.200994176s" podCreationTimestamp="2026-03-06 01:42:25 +0000 UTC" firstStartedPulling="2026-03-06 01:42:26.214237516 +0000 UTC m=+17.363812975" lastFinishedPulling="2026-03-06 01:42:28.914875222 +0000 UTC m=+20.064450680" observedRunningTime="2026-03-06 01:42:29.200761582 +0000 UTC m=+20.350337060" watchObservedRunningTime="2026-03-06 01:42:29.200994176 +0000 UTC m=+20.350569644" Mar 6 01:42:29.201445 kubelet[2506]: E0306 01:42:29.201363 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.201501 kubelet[2506]: W0306 01:42:29.201476 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.201501 kubelet[2506]: E0306 01:42:29.201488 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.202380 kubelet[2506]: E0306 01:42:29.202198 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.202500 kubelet[2506]: W0306 01:42:29.202420 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.202500 kubelet[2506]: E0306 01:42:29.202440 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.204117 kubelet[2506]: E0306 01:42:29.204041 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.204117 kubelet[2506]: W0306 01:42:29.204097 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.204117 kubelet[2506]: E0306 01:42:29.204109 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.204684 kubelet[2506]: E0306 01:42:29.204599 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.204684 kubelet[2506]: W0306 01:42:29.204665 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.204684 kubelet[2506]: E0306 01:42:29.204682 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.206518 kubelet[2506]: E0306 01:42:29.206496 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.206518 kubelet[2506]: W0306 01:42:29.206513 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.206518 kubelet[2506]: E0306 01:42:29.206524 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.207437 kubelet[2506]: E0306 01:42:29.207372 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.207437 kubelet[2506]: W0306 01:42:29.207388 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.207437 kubelet[2506]: E0306 01:42:29.207399 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.285421 kubelet[2506]: E0306 01:42:29.283501 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.285421 kubelet[2506]: W0306 01:42:29.283527 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.285421 kubelet[2506]: E0306 01:42:29.283549 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.285421 kubelet[2506]: E0306 01:42:29.284004 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.285421 kubelet[2506]: W0306 01:42:29.284024 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.285421 kubelet[2506]: E0306 01:42:29.284039 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.293402 kubelet[2506]: E0306 01:42:29.292194 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.293402 kubelet[2506]: W0306 01:42:29.292218 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.293402 kubelet[2506]: E0306 01:42:29.292240 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.296175 kubelet[2506]: E0306 01:42:29.296096 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.296175 kubelet[2506]: W0306 01:42:29.296167 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.296458 kubelet[2506]: E0306 01:42:29.296191 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.296940 kubelet[2506]: E0306 01:42:29.296868 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.296940 kubelet[2506]: W0306 01:42:29.296938 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.297004 kubelet[2506]: E0306 01:42:29.296956 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.297748 kubelet[2506]: E0306 01:42:29.297567 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.297849 kubelet[2506]: W0306 01:42:29.297748 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.297849 kubelet[2506]: E0306 01:42:29.297767 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.306665 kubelet[2506]: E0306 01:42:29.306516 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.306749 kubelet[2506]: W0306 01:42:29.306707 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.306749 kubelet[2506]: E0306 01:42:29.306730 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.316550 kubelet[2506]: E0306 01:42:29.316399 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.316550 kubelet[2506]: W0306 01:42:29.316488 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.316550 kubelet[2506]: E0306 01:42:29.316514 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.319146 kubelet[2506]: E0306 01:42:29.319057 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.319146 kubelet[2506]: W0306 01:42:29.319124 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.319232 kubelet[2506]: E0306 01:42:29.319151 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.320089 kubelet[2506]: E0306 01:42:29.319965 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.320089 kubelet[2506]: W0306 01:42:29.320033 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.320089 kubelet[2506]: E0306 01:42:29.320050 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.321762 kubelet[2506]: E0306 01:42:29.321441 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.321762 kubelet[2506]: W0306 01:42:29.321499 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.321762 kubelet[2506]: E0306 01:42:29.321513 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.323630 kubelet[2506]: E0306 01:42:29.323374 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.323630 kubelet[2506]: W0306 01:42:29.323427 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.323630 kubelet[2506]: E0306 01:42:29.323447 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.324942 kubelet[2506]: E0306 01:42:29.324873 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.324942 kubelet[2506]: W0306 01:42:29.324931 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.325028 kubelet[2506]: E0306 01:42:29.324948 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.328697 kubelet[2506]: E0306 01:42:29.328512 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.328697 kubelet[2506]: W0306 01:42:29.328531 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.328697 kubelet[2506]: E0306 01:42:29.328552 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.329656 kubelet[2506]: E0306 01:42:29.329634 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.329656 kubelet[2506]: W0306 01:42:29.329651 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.329730 kubelet[2506]: E0306 01:42:29.329665 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.332486 kubelet[2506]: E0306 01:42:29.332419 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.332486 kubelet[2506]: W0306 01:42:29.332483 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.332562 kubelet[2506]: E0306 01:42:29.332498 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:29.333480 kubelet[2506]: E0306 01:42:29.333426 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.333480 kubelet[2506]: W0306 01:42:29.333478 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.333546 kubelet[2506]: E0306 01:42:29.333494 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:29.334749 kubelet[2506]: E0306 01:42:29.334679 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:29.334749 kubelet[2506]: W0306 01:42:29.334743 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:29.334895 kubelet[2506]: E0306 01:42:29.334760 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.183586 kubelet[2506]: I0306 01:42:30.183461 2506 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 6 01:42:30.184530 kubelet[2506]: E0306 01:42:30.183996 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:30.221719 kubelet[2506]: E0306 01:42:30.221634 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.221719 kubelet[2506]: W0306 01:42:30.221711 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.221986 kubelet[2506]: E0306 01:42:30.221742 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.222489 kubelet[2506]: E0306 01:42:30.222441 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.222572 kubelet[2506]: W0306 01:42:30.222491 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.222572 kubelet[2506]: E0306 01:42:30.222502 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.223070 kubelet[2506]: E0306 01:42:30.223011 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.223070 kubelet[2506]: W0306 01:42:30.223065 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.223184 kubelet[2506]: E0306 01:42:30.223080 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.224335 kubelet[2506]: E0306 01:42:30.224099 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.224335 kubelet[2506]: W0306 01:42:30.224151 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.224335 kubelet[2506]: E0306 01:42:30.224167 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.225548 kubelet[2506]: E0306 01:42:30.225189 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.225548 kubelet[2506]: W0306 01:42:30.225402 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.225548 kubelet[2506]: E0306 01:42:30.225418 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.226485 kubelet[2506]: E0306 01:42:30.226431 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.226485 kubelet[2506]: W0306 01:42:30.226483 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.226593 kubelet[2506]: E0306 01:42:30.226499 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.227537 kubelet[2506]: E0306 01:42:30.227389 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.227814 kubelet[2506]: W0306 01:42:30.227573 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.227865 kubelet[2506]: E0306 01:42:30.227765 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.228484 kubelet[2506]: E0306 01:42:30.228410 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.228484 kubelet[2506]: W0306 01:42:30.228472 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.228582 kubelet[2506]: E0306 01:42:30.228486 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.229064 kubelet[2506]: E0306 01:42:30.228995 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.229064 kubelet[2506]: W0306 01:42:30.229051 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.229064 kubelet[2506]: E0306 01:42:30.229064 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.229711 kubelet[2506]: E0306 01:42:30.229664 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.229835 kubelet[2506]: W0306 01:42:30.229713 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.229835 kubelet[2506]: E0306 01:42:30.229728 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.230604 kubelet[2506]: E0306 01:42:30.230410 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.230604 kubelet[2506]: W0306 01:42:30.230470 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.230604 kubelet[2506]: E0306 01:42:30.230484 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.231138 kubelet[2506]: E0306 01:42:30.231060 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.231138 kubelet[2506]: W0306 01:42:30.231129 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.231361 kubelet[2506]: E0306 01:42:30.231145 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.231863 kubelet[2506]: E0306 01:42:30.231723 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.231863 kubelet[2506]: W0306 01:42:30.231843 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.231863 kubelet[2506]: E0306 01:42:30.231860 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.232455 kubelet[2506]: E0306 01:42:30.232429 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.232455 kubelet[2506]: W0306 01:42:30.232445 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.232546 kubelet[2506]: E0306 01:42:30.232459 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.233102 kubelet[2506]: E0306 01:42:30.233015 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.233102 kubelet[2506]: W0306 01:42:30.233087 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.233102 kubelet[2506]: E0306 01:42:30.233104 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.308036 kubelet[2506]: E0306 01:42:30.307964 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.308190 kubelet[2506]: W0306 01:42:30.308091 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.308190 kubelet[2506]: E0306 01:42:30.308119 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.309098 kubelet[2506]: E0306 01:42:30.308933 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.309098 kubelet[2506]: W0306 01:42:30.308953 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.309098 kubelet[2506]: E0306 01:42:30.308966 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.309762 kubelet[2506]: E0306 01:42:30.309711 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.309889 kubelet[2506]: W0306 01:42:30.309766 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.309889 kubelet[2506]: E0306 01:42:30.309841 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.310565 kubelet[2506]: E0306 01:42:30.310513 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.310632 kubelet[2506]: W0306 01:42:30.310570 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.310632 kubelet[2506]: E0306 01:42:30.310590 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.311184 kubelet[2506]: E0306 01:42:30.311147 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.311184 kubelet[2506]: W0306 01:42:30.311163 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.311184 kubelet[2506]: E0306 01:42:30.311177 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.311988 kubelet[2506]: E0306 01:42:30.311930 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.311988 kubelet[2506]: W0306 01:42:30.311950 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.311988 kubelet[2506]: E0306 01:42:30.311965 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.312611 kubelet[2506]: E0306 01:42:30.312551 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.312611 kubelet[2506]: W0306 01:42:30.312595 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.312611 kubelet[2506]: E0306 01:42:30.312608 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.313371 kubelet[2506]: E0306 01:42:30.313197 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.313426 kubelet[2506]: W0306 01:42:30.313371 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.313426 kubelet[2506]: E0306 01:42:30.313392 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.314425 kubelet[2506]: E0306 01:42:30.314145 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.314425 kubelet[2506]: W0306 01:42:30.314168 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.314425 kubelet[2506]: E0306 01:42:30.314185 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.315088 kubelet[2506]: E0306 01:42:30.315065 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.315336 kubelet[2506]: W0306 01:42:30.315161 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.315506 kubelet[2506]: E0306 01:42:30.315234 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.315883 kubelet[2506]: E0306 01:42:30.315759 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.315883 kubelet[2506]: W0306 01:42:30.315860 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.315883 kubelet[2506]: E0306 01:42:30.315876 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.316819 kubelet[2506]: E0306 01:42:30.316563 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.316819 kubelet[2506]: W0306 01:42:30.316621 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.316819 kubelet[2506]: E0306 01:42:30.316637 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.317449 kubelet[2506]: E0306 01:42:30.317199 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.317449 kubelet[2506]: W0306 01:42:30.317330 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.317449 kubelet[2506]: E0306 01:42:30.317344 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.317992 kubelet[2506]: E0306 01:42:30.317910 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.317992 kubelet[2506]: W0306 01:42:30.317955 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.317992 kubelet[2506]: E0306 01:42:30.317967 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.318857 kubelet[2506]: E0306 01:42:30.318730 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.318857 kubelet[2506]: W0306 01:42:30.318837 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.318857 kubelet[2506]: E0306 01:42:30.318856 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.319527 kubelet[2506]: E0306 01:42:30.319475 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.319527 kubelet[2506]: W0306 01:42:30.319518 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.319527 kubelet[2506]: E0306 01:42:30.319531 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.320063 kubelet[2506]: E0306 01:42:30.320006 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.320063 kubelet[2506]: W0306 01:42:30.320055 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.320158 kubelet[2506]: E0306 01:42:30.320070 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 01:42:30.320732 kubelet[2506]: E0306 01:42:30.320679 2506 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 01:42:30.320850 kubelet[2506]: W0306 01:42:30.320734 2506 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 01:42:30.320850 kubelet[2506]: E0306 01:42:30.320747 2506 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 01:42:30.693903 containerd[1455]: time="2026-03-06T01:42:30.693699105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:30.695318 containerd[1455]: time="2026-03-06T01:42:30.695147165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 6 01:42:30.696496 containerd[1455]: time="2026-03-06T01:42:30.696350579Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:30.699112 containerd[1455]: time="2026-03-06T01:42:30.699043478Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:30.700071 containerd[1455]: time="2026-03-06T01:42:30.699909122Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.784824116s" Mar 6 01:42:30.700071 containerd[1455]: time="2026-03-06T01:42:30.699970466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 6 01:42:30.706216 containerd[1455]: time="2026-03-06T01:42:30.706073847Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 6 01:42:30.730656 containerd[1455]: time="2026-03-06T01:42:30.730561390Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62\"" Mar 6 01:42:30.731469 containerd[1455]: time="2026-03-06T01:42:30.731386970Z" level=info msg="StartContainer for \"fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62\"" Mar 6 01:42:30.815950 systemd[1]: Started cri-containerd-fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62.scope - libcontainer container fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62. 
Mar 6 01:42:31.040941 kubelet[2506]: E0306 01:42:31.039640 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:31.149325 containerd[1455]: time="2026-03-06T01:42:31.148387435Z" level=info msg="StartContainer for \"fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62\" returns successfully" Mar 6 01:42:31.164871 systemd[1]: cri-containerd-fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62.scope: Deactivated successfully. Mar 6 01:42:31.252600 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62-rootfs.mount: Deactivated successfully. Mar 6 01:42:31.305921 containerd[1455]: time="2026-03-06T01:42:31.302712104Z" level=info msg="shim disconnected" id=fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62 namespace=k8s.io Mar 6 01:42:31.305921 containerd[1455]: time="2026-03-06T01:42:31.305613413Z" level=warning msg="cleaning up after shim disconnected" id=fdac50a0d2e8ca74f67a15496f7f5b23980a1b45acef0553b1a8dcb2849e7d62 namespace=k8s.io Mar 6 01:42:31.305921 containerd[1455]: time="2026-03-06T01:42:31.305626908Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 6 01:42:32.203943 containerd[1455]: time="2026-03-06T01:42:32.203406148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 6 01:42:33.041687 kubelet[2506]: E0306 01:42:33.041480 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" 
Mar 6 01:42:35.040017 kubelet[2506]: E0306 01:42:35.039916 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:37.038668 kubelet[2506]: E0306 01:42:37.038595 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:38.278902 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3546684050.mount: Deactivated successfully. Mar 6 01:42:38.335004 kubelet[2506]: I0306 01:42:38.334829 2506 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 6 01:42:38.336812 kubelet[2506]: E0306 01:42:38.336634 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:38.580644 containerd[1455]: time="2026-03-06T01:42:38.579915443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 6 01:42:38.584566 containerd[1455]: time="2026-03-06T01:42:38.584432001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:38.594055 containerd[1455]: time="2026-03-06T01:42:38.593938395Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:38.598676 containerd[1455]: 
time="2026-03-06T01:42:38.598426337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:38.599079 containerd[1455]: time="2026-03-06T01:42:38.598976120Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 6.395415103s" Mar 6 01:42:38.599079 containerd[1455]: time="2026-03-06T01:42:38.599051260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 6 01:42:38.620418 containerd[1455]: time="2026-03-06T01:42:38.620240519Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 6 01:42:38.770189 containerd[1455]: time="2026-03-06T01:42:38.770054064Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd\"" Mar 6 01:42:38.771753 containerd[1455]: time="2026-03-06T01:42:38.771140096Z" level=info msg="StartContainer for \"1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd\"" Mar 6 01:42:38.871443 systemd[1]: Started cri-containerd-1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd.scope - libcontainer container 1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd. 
Mar 6 01:42:38.929453 containerd[1455]: time="2026-03-06T01:42:38.928856226Z" level=info msg="StartContainer for \"1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd\" returns successfully" Mar 6 01:42:39.041061 kubelet[2506]: E0306 01:42:39.040895 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:39.041596 systemd[1]: cri-containerd-1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd.scope: Deactivated successfully. Mar 6 01:42:39.103006 containerd[1455]: time="2026-03-06T01:42:39.102673258Z" level=info msg="shim disconnected" id=1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd namespace=k8s.io Mar 6 01:42:39.103006 containerd[1455]: time="2026-03-06T01:42:39.102743559Z" level=warning msg="cleaning up after shim disconnected" id=1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd namespace=k8s.io Mar 6 01:42:39.103006 containerd[1455]: time="2026-03-06T01:42:39.102816605Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 6 01:42:39.230549 kubelet[2506]: E0306 01:42:39.229966 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:39.232537 containerd[1455]: time="2026-03-06T01:42:39.232450921Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 6 01:42:39.279852 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1bc1b21e926ce01d28b2b4c0348d30d9d58d88a942bccdb94cbdb87e6938bffd-rootfs.mount: Deactivated successfully. 
Mar 6 01:42:41.038947 kubelet[2506]: E0306 01:42:41.038725 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:43.030959 containerd[1455]: time="2026-03-06T01:42:43.030832465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:43.032089 containerd[1455]: time="2026-03-06T01:42:43.032001857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 6 01:42:43.034177 containerd[1455]: time="2026-03-06T01:42:43.033967076Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:43.037996 containerd[1455]: time="2026-03-06T01:42:43.037930520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:43.038637 kubelet[2506]: E0306 01:42:43.038529 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9qmk" podUID="f7e93c3d-576a-474e-b310-bc124fa176c8" Mar 6 01:42:43.039520 containerd[1455]: time="2026-03-06T01:42:43.039470712Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 3.806931647s" Mar 6 01:42:43.039685 containerd[1455]: time="2026-03-06T01:42:43.039515836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 6 01:42:43.047226 containerd[1455]: time="2026-03-06T01:42:43.047106290Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 6 01:42:43.069728 containerd[1455]: time="2026-03-06T01:42:43.069605960Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09\"" Mar 6 01:42:43.072353 containerd[1455]: time="2026-03-06T01:42:43.070704591Z" level=info msg="StartContainer for \"a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09\"" Mar 6 01:42:43.162565 systemd[1]: Started cri-containerd-a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09.scope - libcontainer container a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09. Mar 6 01:42:43.241439 containerd[1455]: time="2026-03-06T01:42:43.240707524Z" level=info msg="StartContainer for \"a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09\" returns successfully" Mar 6 01:42:43.987479 systemd[1]: cri-containerd-a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09.scope: Deactivated successfully. Mar 6 01:42:43.987959 systemd[1]: cri-containerd-a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09.scope: Consumed 1.033s CPU time. 
Mar 6 01:42:44.026036 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09-rootfs.mount: Deactivated successfully.
Mar 6 01:42:44.034075 containerd[1455]: time="2026-03-06T01:42:44.033980695Z" level=info msg="shim disconnected" id=a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09 namespace=k8s.io
Mar 6 01:42:44.034545 containerd[1455]: time="2026-03-06T01:42:44.034080690Z" level=warning msg="cleaning up after shim disconnected" id=a919110ae01500ed9354f9b1915506290b497b5fb554a3271412f46bfc149d09 namespace=k8s.io
Mar 6 01:42:44.034545 containerd[1455]: time="2026-03-06T01:42:44.034094736Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 6 01:42:44.060046 kubelet[2506]: I0306 01:42:44.059950 2506 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Mar 6 01:42:44.131981 systemd[1]: Created slice kubepods-besteffort-pod2864938a_9ea0_4ee9_987a_e214cf44a87b.slice - libcontainer container kubepods-besteffort-pod2864938a_9ea0_4ee9_987a_e214cf44a87b.slice.
Mar 6 01:42:44.143052 systemd[1]: Created slice kubepods-besteffort-pod2402e1ec_ea50_4f17_85cf_f279f0f10494.slice - libcontainer container kubepods-besteffort-pod2402e1ec_ea50_4f17_85cf_f279f0f10494.slice.
Mar 6 01:42:44.152035 systemd[1]: Created slice kubepods-besteffort-pod85ea7565_b32d_4a81_801b_58416cb42d38.slice - libcontainer container kubepods-besteffort-pod85ea7565_b32d_4a81_801b_58416cb42d38.slice.
Mar 6 01:42:44.166174 systemd[1]: Created slice kubepods-besteffort-pod8b18e331_fe15_4bd5_8fb0_3c314b33990f.slice - libcontainer container kubepods-besteffort-pod8b18e331_fe15_4bd5_8fb0_3c314b33990f.slice.
Mar 6 01:42:44.175745 systemd[1]: Created slice kubepods-besteffort-pod5e1355ef_1fae_411e_a25e_19a787706802.slice - libcontainer container kubepods-besteffort-pod5e1355ef_1fae_411e_a25e_19a787706802.slice.
Mar 6 01:42:44.188167 systemd[1]: Created slice kubepods-burstable-pode4692164_1dad_4fc3_ad4b_fb8a4d587f00.slice - libcontainer container kubepods-burstable-pode4692164_1dad_4fc3_ad4b_fb8a4d587f00.slice.
Mar 6 01:42:44.202034 systemd[1]: Created slice kubepods-burstable-pod4e0c3015_6a1a_484c_97f5_39c6519fa25a.slice - libcontainer container kubepods-burstable-pod4e0c3015_6a1a_484c_97f5_39c6519fa25a.slice.
Mar 6 01:42:44.227031 kubelet[2506]: I0306 01:42:44.226932 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q44m\" (UniqueName: \"kubernetes.io/projected/8b18e331-fe15-4bd5-8fb0-3c314b33990f-kube-api-access-5q44m\") pod \"calico-apiserver-c7b968c57-92fjb\" (UID: \"8b18e331-fe15-4bd5-8fb0-3c314b33990f\") " pod="calico-system/calico-apiserver-c7b968c57-92fjb"
Mar 6 01:42:44.227031 kubelet[2506]: I0306 01:42:44.227008 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4692164-1dad-4fc3-ad4b-fb8a4d587f00-config-volume\") pod \"coredns-66bc5c9577-t4vmj\" (UID: \"e4692164-1dad-4fc3-ad4b-fb8a4d587f00\") " pod="kube-system/coredns-66bc5c9577-t4vmj"
Mar 6 01:42:44.227031 kubelet[2506]: I0306 01:42:44.227027 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbs6w\" (UniqueName: \"kubernetes.io/projected/2864938a-9ea0-4ee9-987a-e214cf44a87b-kube-api-access-tbs6w\") pod \"calico-kube-controllers-5764cfbc4b-xwn5j\" (UID: \"2864938a-9ea0-4ee9-987a-e214cf44a87b\") " pod="calico-system/calico-kube-controllers-5764cfbc4b-xwn5j"
Mar 6 01:42:44.227031 kubelet[2506]: I0306 01:42:44.227042 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8b18e331-fe15-4bd5-8fb0-3c314b33990f-calico-apiserver-certs\") pod \"calico-apiserver-c7b968c57-92fjb\" (UID: \"8b18e331-fe15-4bd5-8fb0-3c314b33990f\") " pod="calico-system/calico-apiserver-c7b968c57-92fjb"
Mar 6 01:42:44.227413 kubelet[2506]: I0306 01:42:44.227060 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8sqk\" (UniqueName: \"kubernetes.io/projected/5e1355ef-1fae-411e-a25e-19a787706802-kube-api-access-q8sqk\") pod \"calico-apiserver-c7b968c57-g2k99\" (UID: \"5e1355ef-1fae-411e-a25e-19a787706802\") " pod="calico-system/calico-apiserver-c7b968c57-g2k99"
Mar 6 01:42:44.227413 kubelet[2506]: I0306 01:42:44.227087 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-nginx-config\") pod \"whisker-54777d5447-bt5zk\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " pod="calico-system/whisker-54777d5447-bt5zk"
Mar 6 01:42:44.227413 kubelet[2506]: I0306 01:42:44.227112 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-backend-key-pair\") pod \"whisker-54777d5447-bt5zk\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " pod="calico-system/whisker-54777d5447-bt5zk"
Mar 6 01:42:44.227413 kubelet[2506]: I0306 01:42:44.227139 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2864938a-9ea0-4ee9-987a-e214cf44a87b-tigera-ca-bundle\") pod \"calico-kube-controllers-5764cfbc4b-xwn5j\" (UID: \"2864938a-9ea0-4ee9-987a-e214cf44a87b\") " pod="calico-system/calico-kube-controllers-5764cfbc4b-xwn5j"
Mar 6 01:42:44.227413 kubelet[2506]: I0306 01:42:44.227175 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2402e1ec-ea50-4f17-85cf-f279f0f10494-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-fnx6s\" (UID: \"2402e1ec-ea50-4f17-85cf-f279f0f10494\") " pod="calico-system/goldmane-cccfbd5cf-fnx6s"
Mar 6 01:42:44.227864 kubelet[2506]: I0306 01:42:44.227203 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e0c3015-6a1a-484c-97f5-39c6519fa25a-config-volume\") pod \"coredns-66bc5c9577-rdfzk\" (UID: \"4e0c3015-6a1a-484c-97f5-39c6519fa25a\") " pod="kube-system/coredns-66bc5c9577-rdfzk"
Mar 6 01:42:44.227864 kubelet[2506]: I0306 01:42:44.227225 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2402e1ec-ea50-4f17-85cf-f279f0f10494-config\") pod \"goldmane-cccfbd5cf-fnx6s\" (UID: \"2402e1ec-ea50-4f17-85cf-f279f0f10494\") " pod="calico-system/goldmane-cccfbd5cf-fnx6s"
Mar 6 01:42:44.227864 kubelet[2506]: I0306 01:42:44.227239 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2402e1ec-ea50-4f17-85cf-f279f0f10494-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-fnx6s\" (UID: \"2402e1ec-ea50-4f17-85cf-f279f0f10494\") " pod="calico-system/goldmane-cccfbd5cf-fnx6s"
Mar 6 01:42:44.227864 kubelet[2506]: I0306 01:42:44.227349 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5g5n\" (UniqueName: \"kubernetes.io/projected/2402e1ec-ea50-4f17-85cf-f279f0f10494-kube-api-access-x5g5n\") pod \"goldmane-cccfbd5cf-fnx6s\" (UID: \"2402e1ec-ea50-4f17-85cf-f279f0f10494\") " pod="calico-system/goldmane-cccfbd5cf-fnx6s"
Mar 6 01:42:44.227864 kubelet[2506]: I0306 01:42:44.227363 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-ca-bundle\") pod \"whisker-54777d5447-bt5zk\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " pod="calico-system/whisker-54777d5447-bt5zk"
Mar 6 01:42:44.228170 kubelet[2506]: I0306 01:42:44.227379 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5e1355ef-1fae-411e-a25e-19a787706802-calico-apiserver-certs\") pod \"calico-apiserver-c7b968c57-g2k99\" (UID: \"5e1355ef-1fae-411e-a25e-19a787706802\") " pod="calico-system/calico-apiserver-c7b968c57-g2k99"
Mar 6 01:42:44.228170 kubelet[2506]: I0306 01:42:44.227393 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ptjx\" (UniqueName: \"kubernetes.io/projected/4e0c3015-6a1a-484c-97f5-39c6519fa25a-kube-api-access-9ptjx\") pod \"coredns-66bc5c9577-rdfzk\" (UID: \"4e0c3015-6a1a-484c-97f5-39c6519fa25a\") " pod="kube-system/coredns-66bc5c9577-rdfzk"
Mar 6 01:42:44.228170 kubelet[2506]: I0306 01:42:44.227406 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp298\" (UniqueName: \"kubernetes.io/projected/85ea7565-b32d-4a81-801b-58416cb42d38-kube-api-access-fp298\") pod \"whisker-54777d5447-bt5zk\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " pod="calico-system/whisker-54777d5447-bt5zk"
Mar 6 01:42:44.228170 kubelet[2506]: I0306 01:42:44.227420 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxglv\" (UniqueName: \"kubernetes.io/projected/e4692164-1dad-4fc3-ad4b-fb8a4d587f00-kube-api-access-rxglv\") pod \"coredns-66bc5c9577-t4vmj\" (UID: \"e4692164-1dad-4fc3-ad4b-fb8a4d587f00\") " pod="kube-system/coredns-66bc5c9577-t4vmj"
Mar 6 01:42:44.283819 containerd[1455]: time="2026-03-06T01:42:44.283472318Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 6 01:42:44.315210 containerd[1455]: time="2026-03-06T01:42:44.315116180Z" level=info msg="CreateContainer within sandbox \"eab51331bdcdf6fdedc5fd1326d1d2df24c438822fc74963a53ca627fc546848\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8b383acb2ea5147159880d52ccea6ccfffd98ff5ea642966af63ddee57474369\""
Mar 6 01:42:44.316383 containerd[1455]: time="2026-03-06T01:42:44.316023003Z" level=info msg="StartContainer for \"8b383acb2ea5147159880d52ccea6ccfffd98ff5ea642966af63ddee57474369\""
Mar 6 01:42:44.400715 systemd[1]: Started cri-containerd-8b383acb2ea5147159880d52ccea6ccfffd98ff5ea642966af63ddee57474369.scope - libcontainer container 8b383acb2ea5147159880d52ccea6ccfffd98ff5ea642966af63ddee57474369.
Mar 6 01:42:44.445354 containerd[1455]: time="2026-03-06T01:42:44.445115526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5764cfbc4b-xwn5j,Uid:2864938a-9ea0-4ee9-987a-e214cf44a87b,Namespace:calico-system,Attempt:0,}"
Mar 6 01:42:44.455742 containerd[1455]: time="2026-03-06T01:42:44.455614300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fnx6s,Uid:2402e1ec-ea50-4f17-85cf-f279f0f10494,Namespace:calico-system,Attempt:0,}"
Mar 6 01:42:44.460706 containerd[1455]: time="2026-03-06T01:42:44.460521306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54777d5447-bt5zk,Uid:85ea7565-b32d-4a81-801b-58416cb42d38,Namespace:calico-system,Attempt:0,}"
Mar 6 01:42:44.474906 containerd[1455]: time="2026-03-06T01:42:44.474734767Z" level=info msg="StartContainer for \"8b383acb2ea5147159880d52ccea6ccfffd98ff5ea642966af63ddee57474369\" returns successfully"
Mar 6 01:42:44.481974 containerd[1455]: time="2026-03-06T01:42:44.481823030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-92fjb,Uid:8b18e331-fe15-4bd5-8fb0-3c314b33990f,Namespace:calico-system,Attempt:0,}"
Mar 6 01:42:44.485562 containerd[1455]: time="2026-03-06T01:42:44.485473045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-g2k99,Uid:5e1355ef-1fae-411e-a25e-19a787706802,Namespace:calico-system,Attempt:0,}"
Mar 6 01:42:44.501414 kubelet[2506]: E0306 01:42:44.501214 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:44.502549 containerd[1455]: time="2026-03-06T01:42:44.502446097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t4vmj,Uid:e4692164-1dad-4fc3-ad4b-fb8a4d587f00,Namespace:kube-system,Attempt:0,}"
Mar 6 01:42:44.517213 kubelet[2506]: E0306 01:42:44.516067 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:44.517485 containerd[1455]: time="2026-03-06T01:42:44.516815657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rdfzk,Uid:4e0c3015-6a1a-484c-97f5-39c6519fa25a,Namespace:kube-system,Attempt:0,}"
Mar 6 01:42:44.847384 containerd[1455]: time="2026-03-06T01:42:44.847196883Z" level=error msg="Failed to destroy network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.854963 containerd[1455]: time="2026-03-06T01:42:44.854919815Z" level=error msg="encountered an error cleaning up failed sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.855152 containerd[1455]: time="2026-03-06T01:42:44.855120960Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t4vmj,Uid:e4692164-1dad-4fc3-ad4b-fb8a4d587f00,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.879547 kubelet[2506]: E0306 01:42:44.879490 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.879547 kubelet[2506]: E0306 01:42:44.879583 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-t4vmj"
Mar 6 01:42:44.879547 kubelet[2506]: E0306 01:42:44.879616 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-t4vmj"
Mar 6 01:42:44.883436 kubelet[2506]: E0306 01:42:44.880140 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-t4vmj_kube-system(e4692164-1dad-4fc3-ad4b-fb8a4d587f00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-t4vmj_kube-system(e4692164-1dad-4fc3-ad4b-fb8a4d587f00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-t4vmj" podUID="e4692164-1dad-4fc3-ad4b-fb8a4d587f00"
Mar 6 01:42:44.922187 containerd[1455]: time="2026-03-06T01:42:44.922120084Z" level=error msg="Failed to destroy network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.924738 containerd[1455]: time="2026-03-06T01:42:44.924576798Z" level=error msg="encountered an error cleaning up failed sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.924948 containerd[1455]: time="2026-03-06T01:42:44.924725016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5764cfbc4b-xwn5j,Uid:2864938a-9ea0-4ee9-987a-e214cf44a87b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.926100 kubelet[2506]: E0306 01:42:44.925469 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.926100 kubelet[2506]: E0306 01:42:44.925564 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5764cfbc4b-xwn5j"
Mar 6 01:42:44.926100 kubelet[2506]: E0306 01:42:44.925592 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5764cfbc4b-xwn5j"
Mar 6 01:42:44.926419 kubelet[2506]: E0306 01:42:44.925660 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5764cfbc4b-xwn5j_calico-system(2864938a-9ea0-4ee9-987a-e214cf44a87b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5764cfbc4b-xwn5j_calico-system(2864938a-9ea0-4ee9-987a-e214cf44a87b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5764cfbc4b-xwn5j" podUID="2864938a-9ea0-4ee9-987a-e214cf44a87b"
Mar 6 01:42:44.947981 containerd[1455]: time="2026-03-06T01:42:44.947059743Z" level=error msg="Failed to destroy network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.948907 containerd[1455]: time="2026-03-06T01:42:44.948864470Z" level=error msg="encountered an error cleaning up failed sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.949889 containerd[1455]: time="2026-03-06T01:42:44.949060316Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fnx6s,Uid:2402e1ec-ea50-4f17-85cf-f279f0f10494,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.950118 kubelet[2506]: E0306 01:42:44.949399 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.950118 kubelet[2506]: E0306 01:42:44.949463 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-fnx6s"
Mar 6 01:42:44.950118 kubelet[2506]: E0306 01:42:44.949496 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-fnx6s"
Mar 6 01:42:44.950395 kubelet[2506]: E0306 01:42:44.949560 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-fnx6s_calico-system(2402e1ec-ea50-4f17-85cf-f279f0f10494)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-fnx6s_calico-system(2402e1ec-ea50-4f17-85cf-f279f0f10494)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-fnx6s" podUID="2402e1ec-ea50-4f17-85cf-f279f0f10494"
Mar 6 01:42:44.957524 containerd[1455]: time="2026-03-06T01:42:44.957475088Z" level=error msg="Failed to destroy network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.967927 containerd[1455]: time="2026-03-06T01:42:44.965369815Z" level=error msg="encountered an error cleaning up failed sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.967927 containerd[1455]: time="2026-03-06T01:42:44.965470332Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54777d5447-bt5zk,Uid:85ea7565-b32d-4a81-801b-58416cb42d38,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.968944 kubelet[2506]: E0306 01:42:44.965858 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.968944 kubelet[2506]: E0306 01:42:44.965921 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54777d5447-bt5zk"
Mar 6 01:42:44.968944 kubelet[2506]: E0306 01:42:44.965947 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54777d5447-bt5zk"
Mar 6 01:42:44.969104 kubelet[2506]: E0306 01:42:44.966027 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-54777d5447-bt5zk_calico-system(85ea7565-b32d-4a81-801b-58416cb42d38)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-54777d5447-bt5zk_calico-system(85ea7565-b32d-4a81-801b-58416cb42d38)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-54777d5447-bt5zk" podUID="85ea7565-b32d-4a81-801b-58416cb42d38"
Mar 6 01:42:44.984923 containerd[1455]: time="2026-03-06T01:42:44.984728409Z" level=error msg="Failed to destroy network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.989422 containerd[1455]: time="2026-03-06T01:42:44.989008606Z" level=error msg="encountered an error cleaning up failed sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.989422 containerd[1455]: time="2026-03-06T01:42:44.989108242Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rdfzk,Uid:4e0c3015-6a1a-484c-97f5-39c6519fa25a,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.990168 kubelet[2506]: E0306 01:42:44.989571 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:44.990168 kubelet[2506]: E0306 01:42:44.989617 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rdfzk"
Mar 6 01:42:44.990168 kubelet[2506]: E0306 01:42:44.989634 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rdfzk"
Mar 6 01:42:44.990439 kubelet[2506]: E0306 01:42:44.989673 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-rdfzk_kube-system(4e0c3015-6a1a-484c-97f5-39c6519fa25a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-rdfzk_kube-system(4e0c3015-6a1a-484c-97f5-39c6519fa25a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-rdfzk" podUID="4e0c3015-6a1a-484c-97f5-39c6519fa25a"
Mar 6 01:42:45.000910 containerd[1455]: time="2026-03-06T01:42:44.999356181Z" level=error msg="Failed to destroy network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.000910 containerd[1455]: time="2026-03-06T01:42:45.000639205Z" level=error msg="encountered an error cleaning up failed sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.000910 containerd[1455]: time="2026-03-06T01:42:45.000709416Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-92fjb,Uid:8b18e331-fe15-4bd5-8fb0-3c314b33990f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.001468 kubelet[2506]: E0306 01:42:45.001432 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.001695 kubelet[2506]: E0306 01:42:45.001596 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c7b968c57-92fjb"
Mar 6 01:42:45.001695 kubelet[2506]: E0306 01:42:45.001684 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c7b968c57-92fjb"
Mar 6 01:42:45.002700 kubelet[2506]: E0306 01:42:45.002610 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c7b968c57-92fjb_calico-system(8b18e331-fe15-4bd5-8fb0-3c314b33990f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c7b968c57-92fjb_calico-system(8b18e331-fe15-4bd5-8fb0-3c314b33990f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-c7b968c57-92fjb" podUID="8b18e331-fe15-4bd5-8fb0-3c314b33990f"
Mar 6 01:42:45.017040 containerd[1455]: time="2026-03-06T01:42:45.016742626Z" level=error msg="Failed to destroy network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.019477 containerd[1455]: time="2026-03-06T01:42:45.019105656Z" level=error msg="encountered an error cleaning up failed sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.019477 containerd[1455]: time="2026-03-06T01:42:45.019241199Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-g2k99,Uid:5e1355ef-1fae-411e-a25e-19a787706802,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.020037 kubelet[2506]: E0306 01:42:45.019855 2506 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 01:42:45.020037 kubelet[2506]: E0306 01:42:45.019924 2506 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c7b968c57-g2k99"
Mar 6 01:42:45.020037 kubelet[2506]: E0306 01:42:45.019960 2506 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory:
check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-c7b968c57-g2k99" Mar 6 01:42:45.020186 kubelet[2506]: E0306 01:42:45.020023 2506 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c7b968c57-g2k99_calico-system(5e1355ef-1fae-411e-a25e-19a787706802)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c7b968c57-g2k99_calico-system(5e1355ef-1fae-411e-a25e-19a787706802)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-c7b968c57-g2k99" podUID="5e1355ef-1fae-411e-a25e-19a787706802" Mar 6 01:42:45.057566 systemd[1]: Created slice kubepods-besteffort-podf7e93c3d_576a_474e_b310_bc124fa176c8.slice - libcontainer container kubepods-besteffort-podf7e93c3d_576a_474e_b310_bc124fa176c8.slice. 
Mar 6 01:42:45.068184 containerd[1455]: time="2026-03-06T01:42:45.067203159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9qmk,Uid:f7e93c3d-576a-474e-b310-bc124fa176c8,Namespace:calico-system,Attempt:0,}" Mar 6 01:42:45.297212 kubelet[2506]: I0306 01:42:45.296076 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:42:45.311704 kubelet[2506]: I0306 01:42:45.311514 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:42:45.318366 kubelet[2506]: I0306 01:42:45.317570 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Mar 6 01:42:45.321433 containerd[1455]: time="2026-03-06T01:42:45.321110976Z" level=info msg="StopPodSandbox for \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\"" Mar 6 01:42:45.324056 containerd[1455]: time="2026-03-06T01:42:45.323726808Z" level=info msg="Ensure that sandbox e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7 in task-service has been cleanup successfully" Mar 6 01:42:45.327865 containerd[1455]: time="2026-03-06T01:42:45.327555371Z" level=info msg="StopPodSandbox for \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\"" Mar 6 01:42:45.327955 containerd[1455]: time="2026-03-06T01:42:45.327901417Z" level=info msg="Ensure that sandbox d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8 in task-service has been cleanup successfully" Mar 6 01:42:45.328970 containerd[1455]: time="2026-03-06T01:42:45.328493755Z" level=info msg="StopPodSandbox for \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\"" Mar 6 01:42:45.328970 containerd[1455]: time="2026-03-06T01:42:45.328628987Z" level=info msg="Ensure that sandbox 
090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96 in task-service has been cleanup successfully" Mar 6 01:42:45.330354 kubelet[2506]: I0306 01:42:45.330107 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:42:45.333467 containerd[1455]: time="2026-03-06T01:42:45.332942382Z" level=info msg="StopPodSandbox for \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\"" Mar 6 01:42:45.344195 kubelet[2506]: I0306 01:42:45.343235 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-dgbbq" podStartSLOduration=3.980162614 podStartE2EDuration="20.343218048s" podCreationTimestamp="2026-03-06 01:42:25 +0000 UTC" firstStartedPulling="2026-03-06 01:42:26.678099524 +0000 UTC m=+17.827674981" lastFinishedPulling="2026-03-06 01:42:43.041154957 +0000 UTC m=+34.190730415" observedRunningTime="2026-03-06 01:42:45.341052629 +0000 UTC m=+36.490628086" watchObservedRunningTime="2026-03-06 01:42:45.343218048 +0000 UTC m=+36.492793507" Mar 6 01:42:45.351164 kubelet[2506]: I0306 01:42:45.351136 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:42:45.355709 containerd[1455]: time="2026-03-06T01:42:45.355368713Z" level=info msg="Ensure that sandbox 527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4 in task-service has been cleanup successfully" Mar 6 01:42:45.360568 kubelet[2506]: I0306 01:42:45.359852 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:42:45.360664 containerd[1455]: time="2026-03-06T01:42:45.355830887Z" level=info msg="StopPodSandbox for \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\"" Mar 6 01:42:45.361570 containerd[1455]: 
time="2026-03-06T01:42:45.361180019Z" level=info msg="Ensure that sandbox b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701 in task-service has been cleanup successfully" Mar 6 01:42:45.364525 containerd[1455]: time="2026-03-06T01:42:45.364494896Z" level=info msg="StopPodSandbox for \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\"" Mar 6 01:42:45.365122 containerd[1455]: time="2026-03-06T01:42:45.364841251Z" level=info msg="Ensure that sandbox 61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063 in task-service has been cleanup successfully" Mar 6 01:42:45.381807 kubelet[2506]: I0306 01:42:45.381534 2506 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:42:45.383555 containerd[1455]: time="2026-03-06T01:42:45.383382775Z" level=info msg="StopPodSandbox for \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\"" Mar 6 01:42:45.383710 containerd[1455]: time="2026-03-06T01:42:45.383612925Z" level=info msg="Ensure that sandbox 58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56 in task-service has been cleanup successfully" Mar 6 01:42:45.728045 systemd-networkd[1388]: caliaf74e622c0b: Link UP Mar 6 01:42:45.729490 systemd-networkd[1388]: caliaf74e622c0b: Gained carrier Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.183 [ERROR][3712] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.225 [INFO][3712] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--p9qmk-eth0 csi-node-driver- calico-system f7e93c3d-576a-474e-b310-bc124fa176c8 774 0 2026-03-06 01:42:25 +0000 UTC 
map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-p9qmk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliaf74e622c0b [] [] }} ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.226 [INFO][3712] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.406 [INFO][3732] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" HandleID="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Workload="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.484 [INFO][3732] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" HandleID="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Workload="localhost-k8s-csi--node--driver--p9qmk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000134f70), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-p9qmk", "timestamp":"2026-03-06 01:42:45.406523649 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000192000)} Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.484 [INFO][3732] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.484 [INFO][3732] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.484 [INFO][3732] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.512 [INFO][3732] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.575 [INFO][3732] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.600 [INFO][3732] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.603 [INFO][3732] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.608 [INFO][3732] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.608 [INFO][3732] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.625 [INFO][3732] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029 Mar 6 01:42:45.844238 
containerd[1455]: 2026-03-06 01:42:45.644 [INFO][3732] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.659 [INFO][3732] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.659 [INFO][3732] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" host="localhost" Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.659 [INFO][3732] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:45.844238 containerd[1455]: 2026-03-06 01:42:45.659 [INFO][3732] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" HandleID="k8s-pod-network.fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Workload="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:45.845505 containerd[1455]: 2026-03-06 01:42:45.697 [INFO][3712] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--p9qmk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f7e93c3d-576a-474e-b310-bc124fa176c8", ResourceVersion:"774", Generation:0, CreationTimestamp:time.Date(2026, time.March, 
6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-p9qmk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliaf74e622c0b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:45.845505 containerd[1455]: 2026-03-06 01:42:45.697 [INFO][3712] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:45.845505 containerd[1455]: 2026-03-06 01:42:45.697 [INFO][3712] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaf74e622c0b ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:45.845505 containerd[1455]: 2026-03-06 01:42:45.731 [INFO][3712] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:45.845505 containerd[1455]: 2026-03-06 01:42:45.734 [INFO][3712] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--p9qmk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f7e93c3d-576a-474e-b310-bc124fa176c8", ResourceVersion:"774", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029", Pod:"csi-node-driver-p9qmk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliaf74e622c0b", MAC:"8a:2a:c6:e4:57:42", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:45.845505 containerd[1455]: 2026-03-06 01:42:45.828 [INFO][3712] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029" Namespace="calico-system" Pod="csi-node-driver-p9qmk" WorkloadEndpoint="localhost-k8s-csi--node--driver--p9qmk-eth0" Mar 6 01:42:46.016647 containerd[1455]: time="2026-03-06T01:42:46.012721323Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:46.016647 containerd[1455]: time="2026-03-06T01:42:46.012852448Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:46.016647 containerd[1455]: time="2026-03-06T01:42:46.012869781Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:46.016647 containerd[1455]: time="2026-03-06T01:42:46.012984363Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.658 [INFO][3770] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.659 [INFO][3770] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" iface="eth0" netns="/var/run/netns/cni-458e2e1e-0618-128c-58af-b5e1524cfaa4" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.661 [INFO][3770] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" iface="eth0" netns="/var/run/netns/cni-458e2e1e-0618-128c-58af-b5e1524cfaa4" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.671 [INFO][3770] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" iface="eth0" netns="/var/run/netns/cni-458e2e1e-0618-128c-58af-b5e1524cfaa4" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.671 [INFO][3770] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.671 [INFO][3770] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.953 [INFO][3878] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.954 [INFO][3878] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.954 [INFO][3878] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.980 [WARNING][3878] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:45.987 [INFO][3878] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:46.008 [INFO][3878] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.022977 containerd[1455]: 2026-03-06 01:42:46.013 [INFO][3770] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:42:46.028098 containerd[1455]: time="2026-03-06T01:42:46.024923305Z" level=info msg="TearDown network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\" successfully" Mar 6 01:42:46.028098 containerd[1455]: time="2026-03-06T01:42:46.027939984Z" level=info msg="StopPodSandbox for \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\" returns successfully" Mar 6 01:42:46.029954 systemd[1]: run-netns-cni\x2d458e2e1e\x2d0618\x2d128c\x2d58af\x2db5e1524cfaa4.mount: Deactivated successfully. 
Mar 6 01:42:46.041686 containerd[1455]: time="2026-03-06T01:42:46.040908196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-g2k99,Uid:5e1355ef-1fae-411e-a25e-19a787706802,Namespace:calico-system,Attempt:1,}" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.688 [INFO][3772] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.691 [INFO][3772] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" iface="eth0" netns="/var/run/netns/cni-2c56f12f-23a7-0220-7de8-8093401f3718" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.691 [INFO][3772] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" iface="eth0" netns="/var/run/netns/cni-2c56f12f-23a7-0220-7de8-8093401f3718" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.693 [INFO][3772] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" iface="eth0" netns="/var/run/netns/cni-2c56f12f-23a7-0220-7de8-8093401f3718" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.693 [INFO][3772] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.693 [INFO][3772] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.982 [INFO][3890] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:45.992 [INFO][3890] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:46.008 [INFO][3890] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:46.021 [WARNING][3890] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:46.021 [INFO][3890] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:46.043 [INFO][3890] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.055687 containerd[1455]: 2026-03-06 01:42:46.049 [INFO][3772] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Mar 6 01:42:46.057178 containerd[1455]: time="2026-03-06T01:42:46.057125290Z" level=info msg="TearDown network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\" successfully" Mar 6 01:42:46.057436 containerd[1455]: time="2026-03-06T01:42:46.057417245Z" level=info msg="StopPodSandbox for \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\" returns successfully" Mar 6 01:42:46.064032 systemd[1]: run-netns-cni\x2d2c56f12f\x2d23a7\x2d0220\x2d7de8\x2d8093401f3718.mount: Deactivated successfully. 
Mar 6 01:42:46.065171 containerd[1455]: time="2026-03-06T01:42:46.065066050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-92fjb,Uid:8b18e331-fe15-4bd5-8fb0-3c314b33990f,Namespace:calico-system,Attempt:1,}" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:45.683 [INFO][3828] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:45.689 [INFO][3828] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" iface="eth0" netns="/var/run/netns/cni-c39821db-a62f-74ea-40f8-ca2ce31948c0" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:45.690 [INFO][3828] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" iface="eth0" netns="/var/run/netns/cni-c39821db-a62f-74ea-40f8-ca2ce31948c0" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:45.694 [INFO][3828] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" iface="eth0" netns="/var/run/netns/cni-c39821db-a62f-74ea-40f8-ca2ce31948c0" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:45.698 [INFO][3828] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:45.698 [INFO][3828] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.044 [INFO][3899] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.044 [INFO][3899] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.044 [INFO][3899] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.069 [WARNING][3899] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.070 [INFO][3899] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.083 [INFO][3899] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.107833 containerd[1455]: 2026-03-06 01:42:46.092 [INFO][3828] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:42:46.110538 containerd[1455]: time="2026-03-06T01:42:46.109998954Z" level=info msg="TearDown network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\" successfully" Mar 6 01:42:46.110538 containerd[1455]: time="2026-03-06T01:42:46.110027046Z" level=info msg="StopPodSandbox for \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\" returns successfully" Mar 6 01:42:46.118613 containerd[1455]: time="2026-03-06T01:42:46.118575539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fnx6s,Uid:2402e1ec-ea50-4f17-85cf-f279f0f10494,Namespace:calico-system,Attempt:1,}" Mar 6 01:42:46.132004 systemd[1]: Started cri-containerd-fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029.scope - libcontainer container fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029. 
Mar 6 01:42:46.186353 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:45.745 [INFO][3829] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:45.745 [INFO][3829] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" iface="eth0" netns="/var/run/netns/cni-0daefc5c-6513-c50a-2da2-f57e51d0215c" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:45.746 [INFO][3829] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" iface="eth0" netns="/var/run/netns/cni-0daefc5c-6513-c50a-2da2-f57e51d0215c" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:45.762 [INFO][3829] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" iface="eth0" netns="/var/run/netns/cni-0daefc5c-6513-c50a-2da2-f57e51d0215c" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:45.762 [INFO][3829] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:45.762 [INFO][3829] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.074 [INFO][3913] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.075 [INFO][3913] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.083 [INFO][3913] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.118 [WARNING][3913] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.120 [INFO][3913] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.137 [INFO][3913] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.188998 containerd[1455]: 2026-03-06 01:42:46.170 [INFO][3829] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:42:46.190037 containerd[1455]: time="2026-03-06T01:42:46.189694705Z" level=info msg="TearDown network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\" successfully" Mar 6 01:42:46.190037 containerd[1455]: time="2026-03-06T01:42:46.189731333Z" level=info msg="StopPodSandbox for \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\" returns successfully" Mar 6 01:42:46.196580 kubelet[2506]: E0306 01:42:46.196148 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:46.199195 containerd[1455]: time="2026-03-06T01:42:46.198229319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t4vmj,Uid:e4692164-1dad-4fc3-ad4b-fb8a4d587f00,Namespace:kube-system,Attempt:1,}" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:45.888 [INFO][3820] cni-plugin/k8s.go 652: Cleaning up netns 
ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:45.888 [INFO][3820] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" iface="eth0" netns="/var/run/netns/cni-3903dbfc-de66-1d0b-5e21-50c4286a8d2a" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:45.891 [INFO][3820] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" iface="eth0" netns="/var/run/netns/cni-3903dbfc-de66-1d0b-5e21-50c4286a8d2a" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:45.892 [INFO][3820] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" iface="eth0" netns="/var/run/netns/cni-3903dbfc-de66-1d0b-5e21-50c4286a8d2a" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:45.896 [INFO][3820] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:45.896 [INFO][3820] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.083 [INFO][3942] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.086 [INFO][3942] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.142 [INFO][3942] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.169 [WARNING][3942] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.170 [INFO][3942] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.182 [INFO][3942] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.223684 containerd[1455]: 2026-03-06 01:42:46.195 [INFO][3820] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:42:46.225673 containerd[1455]: time="2026-03-06T01:42:46.225464424Z" level=info msg="TearDown network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\" successfully" Mar 6 01:42:46.226709 containerd[1455]: time="2026-03-06T01:42:46.226682467Z" level=info msg="StopPodSandbox for \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\" returns successfully" Mar 6 01:42:46.266090 containerd[1455]: time="2026-03-06T01:42:46.265406324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9qmk,Uid:f7e93c3d-576a-474e-b310-bc124fa176c8,Namespace:calico-system,Attempt:0,} returns sandbox id \"fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029\"" Mar 6 01:42:46.280412 systemd[1]: run-netns-cni\x2d0daefc5c\x2d6513\x2dc50a\x2d2da2\x2df57e51d0215c.mount: Deactivated successfully. Mar 6 01:42:46.281530 systemd[1]: run-netns-cni\x2d3903dbfc\x2dde66\x2d1d0b\x2d5e21\x2d50c4286a8d2a.mount: Deactivated successfully. Mar 6 01:42:46.281610 systemd[1]: run-netns-cni\x2dc39821db\x2da62f\x2d74ea\x2d40f8\x2dca2ce31948c0.mount: Deactivated successfully. Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:45.946 [INFO][3827] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:45.946 [INFO][3827] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" iface="eth0" netns="/var/run/netns/cni-f15c69ff-40e8-4ba1-761b-d1869263af64" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:45.947 [INFO][3827] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" iface="eth0" netns="/var/run/netns/cni-f15c69ff-40e8-4ba1-761b-d1869263af64" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:45.966 [INFO][3827] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" iface="eth0" netns="/var/run/netns/cni-f15c69ff-40e8-4ba1-761b-d1869263af64" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:45.966 [INFO][3827] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:45.966 [INFO][3827] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.144 [INFO][3966] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.149 [INFO][3966] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.184 [INFO][3966] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.214 [WARNING][3966] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.214 [INFO][3966] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.220 [INFO][3966] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.293468 containerd[1455]: 2026-03-06 01:42:46.252 [INFO][3827] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:42:46.300613 systemd[1]: run-netns-cni\x2df15c69ff\x2d40e8\x2d4ba1\x2d761b\x2dd1869263af64.mount: Deactivated successfully. 
Mar 6 01:42:46.302596 containerd[1455]: time="2026-03-06T01:42:46.302229354Z" level=info msg="TearDown network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\" successfully" Mar 6 01:42:46.303226 containerd[1455]: time="2026-03-06T01:42:46.302922157Z" level=info msg="StopPodSandbox for \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\" returns successfully" Mar 6 01:42:46.305104 containerd[1455]: time="2026-03-06T01:42:46.305071518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 6 01:42:46.308700 containerd[1455]: time="2026-03-06T01:42:46.308446006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5764cfbc4b-xwn5j,Uid:2864938a-9ea0-4ee9-987a-e214cf44a87b,Namespace:calico-system,Attempt:1,}" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:45.872 [INFO][3826] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:45.873 [INFO][3826] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" iface="eth0" netns="/var/run/netns/cni-44cf8e88-54f8-f116-5399-cf60786119b5" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:45.879 [INFO][3826] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" iface="eth0" netns="/var/run/netns/cni-44cf8e88-54f8-f116-5399-cf60786119b5" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:45.897 [INFO][3826] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" iface="eth0" netns="/var/run/netns/cni-44cf8e88-54f8-f116-5399-cf60786119b5" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:45.897 [INFO][3826] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:45.897 [INFO][3826] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.173 [INFO][3941] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.181 [INFO][3941] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.224 [INFO][3941] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.257 [WARNING][3941] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.257 [INFO][3941] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.264 [INFO][3941] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.317460 containerd[1455]: 2026-03-06 01:42:46.280 [INFO][3826] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:42:46.320320 containerd[1455]: time="2026-03-06T01:42:46.320010678Z" level=info msg="TearDown network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\" successfully" Mar 6 01:42:46.320567 containerd[1455]: time="2026-03-06T01:42:46.320375428Z" level=info msg="StopPodSandbox for \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\" returns successfully" Mar 6 01:42:46.323421 systemd[1]: run-netns-cni\x2d44cf8e88\x2d54f8\x2df116\x2d5399\x2dcf60786119b5.mount: Deactivated successfully. 
Mar 6 01:42:46.327478 kubelet[2506]: E0306 01:42:46.326654 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:46.327956 containerd[1455]: time="2026-03-06T01:42:46.327816697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rdfzk,Uid:4e0c3015-6a1a-484c-97f5-39c6519fa25a,Namespace:kube-system,Attempt:1,}" Mar 6 01:42:46.360834 kubelet[2506]: I0306 01:42:46.357893 2506 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp298\" (UniqueName: \"kubernetes.io/projected/85ea7565-b32d-4a81-801b-58416cb42d38-kube-api-access-fp298\") pod \"85ea7565-b32d-4a81-801b-58416cb42d38\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " Mar 6 01:42:46.360834 kubelet[2506]: I0306 01:42:46.357948 2506 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-nginx-config\") pod \"85ea7565-b32d-4a81-801b-58416cb42d38\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " Mar 6 01:42:46.360834 kubelet[2506]: I0306 01:42:46.357972 2506 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-backend-key-pair\") pod \"85ea7565-b32d-4a81-801b-58416cb42d38\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " Mar 6 01:42:46.360834 kubelet[2506]: I0306 01:42:46.358004 2506 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-ca-bundle\") pod \"85ea7565-b32d-4a81-801b-58416cb42d38\" (UID: \"85ea7565-b32d-4a81-801b-58416cb42d38\") " Mar 6 01:42:46.360834 kubelet[2506]: I0306 01:42:46.358823 2506 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "85ea7565-b32d-4a81-801b-58416cb42d38" (UID: "85ea7565-b32d-4a81-801b-58416cb42d38"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 01:42:46.369039 kubelet[2506]: I0306 01:42:46.368897 2506 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "85ea7565-b32d-4a81-801b-58416cb42d38" (UID: "85ea7565-b32d-4a81-801b-58416cb42d38"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 01:42:46.379882 kubelet[2506]: I0306 01:42:46.379500 2506 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "85ea7565-b32d-4a81-801b-58416cb42d38" (UID: "85ea7565-b32d-4a81-801b-58416cb42d38"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 6 01:42:46.404470 kubelet[2506]: I0306 01:42:46.404035 2506 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ea7565-b32d-4a81-801b-58416cb42d38-kube-api-access-fp298" (OuterVolumeSpecName: "kube-api-access-fp298") pod "85ea7565-b32d-4a81-801b-58416cb42d38" (UID: "85ea7565-b32d-4a81-801b-58416cb42d38"). InnerVolumeSpecName "kube-api-access-fp298". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 6 01:42:46.459489 kubelet[2506]: I0306 01:42:46.459411 2506 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fp298\" (UniqueName: \"kubernetes.io/projected/85ea7565-b32d-4a81-801b-58416cb42d38-kube-api-access-fp298\") on node \"localhost\" DevicePath \"\"" Mar 6 01:42:46.459489 kubelet[2506]: I0306 01:42:46.459445 2506 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-nginx-config\") on node \"localhost\" DevicePath \"\"" Mar 6 01:42:46.459489 kubelet[2506]: I0306 01:42:46.459457 2506 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Mar 6 01:42:46.459489 kubelet[2506]: I0306 01:42:46.459465 2506 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85ea7565-b32d-4a81-801b-58416cb42d38-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 6 01:42:46.685807 systemd-networkd[1388]: calia9db2015650: Link UP Mar 6 01:42:46.687432 systemd-networkd[1388]: calia9db2015650: Gained carrier Mar 6 01:42:46.717965 systemd[1]: Removed slice kubepods-besteffort-pod85ea7565_b32d_4a81_801b_58416cb42d38.slice - libcontainer container kubepods-besteffort-pod85ea7565_b32d_4a81_801b_58416cb42d38.slice. 
Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.344 [ERROR][4029] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.378 [INFO][4029] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0 goldmane-cccfbd5cf- calico-system 2402e1ec-ea50-4f17-85cf-f279f0f10494 958 0 2026-03-06 01:42:25 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-cccfbd5cf-fnx6s eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia9db2015650 [] [] }} ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.380 [INFO][4029] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.559 [INFO][4083] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" HandleID="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.574 [INFO][4083] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" HandleID="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000bfd30), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-cccfbd5cf-fnx6s", "timestamp":"2026-03-06 01:42:46.559510657 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000218c60)} Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.574 [INFO][4083] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.574 [INFO][4083] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.574 [INFO][4083] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.579 [INFO][4083] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.593 [INFO][4083] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.605 [INFO][4083] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.612 [INFO][4083] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.622 [INFO][4083] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.624 [INFO][4083] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.641 [INFO][4083] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.650 [INFO][4083] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.664 [INFO][4083] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.665 [INFO][4083] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" host="localhost" Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.666 [INFO][4083] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.746413 containerd[1455]: 2026-03-06 01:42:46.666 [INFO][4083] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" HandleID="k8s-pod-network.63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.747459 containerd[1455]: 2026-03-06 01:42:46.673 [INFO][4029] cni-plugin/k8s.go 418: Populated endpoint ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2402e1ec-ea50-4f17-85cf-f279f0f10494", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-cccfbd5cf-fnx6s", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia9db2015650", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:46.747459 containerd[1455]: 2026-03-06 01:42:46.674 [INFO][4029] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.747459 containerd[1455]: 2026-03-06 01:42:46.674 [INFO][4029] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia9db2015650 ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.747459 containerd[1455]: 2026-03-06 01:42:46.691 [INFO][4029] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.747459 containerd[1455]: 2026-03-06 01:42:46.700 [INFO][4029] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" 
WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2402e1ec-ea50-4f17-85cf-f279f0f10494", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a", Pod:"goldmane-cccfbd5cf-fnx6s", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia9db2015650", MAC:"06:08:07:84:0d:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:46.747459 containerd[1455]: 2026-03-06 01:42:46.734 [INFO][4029] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fnx6s" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:42:46.811195 systemd-networkd[1388]: caliaf74e622c0b: Gained IPv6LL Mar 6 01:42:46.829159 systemd-networkd[1388]: 
calif260f011e82: Link UP Mar 6 01:42:46.831822 systemd-networkd[1388]: calif260f011e82: Gained carrier Mar 6 01:42:46.850596 systemd[1]: Created slice kubepods-besteffort-pod8942173a_761b_4bcd_8c98_18f1fc6f16e7.slice - libcontainer container kubepods-besteffort-pod8942173a_761b_4bcd_8c98_18f1fc6f16e7.slice. Mar 6 01:42:46.870456 containerd[1455]: time="2026-03-06T01:42:46.867944570Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:46.870456 containerd[1455]: time="2026-03-06T01:42:46.868020221Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:46.870456 containerd[1455]: time="2026-03-06T01:42:46.868033245Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:46.870456 containerd[1455]: time="2026-03-06T01:42:46.868149001Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.335 [ERROR][4012] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.381 [INFO][4012] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0 calico-apiserver-c7b968c57- calico-system 8b18e331-fe15-4bd5-8fb0-3c314b33990f 959 0 2026-03-06 01:42:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c7b968c57 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-c7b968c57-92fjb eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calif260f011e82 [] [] }} ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.384 [INFO][4012] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.580 [INFO][4097] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" HandleID="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" 
Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.599 [INFO][4097] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" HandleID="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000196bc0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-apiserver-c7b968c57-92fjb", "timestamp":"2026-03-06 01:42:46.5801231 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001e14a0)} Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.601 [INFO][4097] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.666 [INFO][4097] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.666 [INFO][4097] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.681 [INFO][4097] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.703 [INFO][4097] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.729 [INFO][4097] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.739 [INFO][4097] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.747 [INFO][4097] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.748 [INFO][4097] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.755 [INFO][4097] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910 Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.776 [INFO][4097] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.789 [INFO][4097] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.790 [INFO][4097] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" host="localhost" Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.790 [INFO][4097] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:46.899709 containerd[1455]: 2026-03-06 01:42:46.790 [INFO][4097] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" HandleID="k8s-pod-network.7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.901684 containerd[1455]: 2026-03-06 01:42:46.799 [INFO][4012] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"8b18e331-fe15-4bd5-8fb0-3c314b33990f", ResourceVersion:"959", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-c7b968c57-92fjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif260f011e82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:46.901684 containerd[1455]: 2026-03-06 01:42:46.799 [INFO][4012] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.901684 containerd[1455]: 2026-03-06 01:42:46.799 [INFO][4012] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif260f011e82 ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.901684 containerd[1455]: 2026-03-06 01:42:46.830 [INFO][4012] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.901684 containerd[1455]: 2026-03-06 01:42:46.844 [INFO][4012] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"8b18e331-fe15-4bd5-8fb0-3c314b33990f", ResourceVersion:"959", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910", Pod:"calico-apiserver-c7b968c57-92fjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif260f011e82", MAC:"92:50:9e:de:3c:57", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:46.901684 containerd[1455]: 2026-03-06 01:42:46.894 [INFO][4012] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910" 
Namespace="calico-system" Pod="calico-apiserver-c7b968c57-92fjb" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0" Mar 6 01:42:46.928015 systemd[1]: Started cri-containerd-63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a.scope - libcontainer container 63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a. Mar 6 01:42:46.963963 kubelet[2506]: I0306 01:42:46.963858 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/8942173a-761b-4bcd-8c98-18f1fc6f16e7-nginx-config\") pod \"whisker-7cf9ccfb48-6j6fp\" (UID: \"8942173a-761b-4bcd-8c98-18f1fc6f16e7\") " pod="calico-system/whisker-7cf9ccfb48-6j6fp" Mar 6 01:42:46.965060 kubelet[2506]: I0306 01:42:46.964850 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8942173a-761b-4bcd-8c98-18f1fc6f16e7-whisker-backend-key-pair\") pod \"whisker-7cf9ccfb48-6j6fp\" (UID: \"8942173a-761b-4bcd-8c98-18f1fc6f16e7\") " pod="calico-system/whisker-7cf9ccfb48-6j6fp" Mar 6 01:42:46.965060 kubelet[2506]: I0306 01:42:46.964905 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8lbp\" (UniqueName: \"kubernetes.io/projected/8942173a-761b-4bcd-8c98-18f1fc6f16e7-kube-api-access-j8lbp\") pod \"whisker-7cf9ccfb48-6j6fp\" (UID: \"8942173a-761b-4bcd-8c98-18f1fc6f16e7\") " pod="calico-system/whisker-7cf9ccfb48-6j6fp" Mar 6 01:42:46.965060 kubelet[2506]: I0306 01:42:46.964929 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8942173a-761b-4bcd-8c98-18f1fc6f16e7-whisker-ca-bundle\") pod \"whisker-7cf9ccfb48-6j6fp\" (UID: \"8942173a-761b-4bcd-8c98-18f1fc6f16e7\") " pod="calico-system/whisker-7cf9ccfb48-6j6fp" Mar 6 
01:42:47.013136 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:47.027036 containerd[1455]: time="2026-03-06T01:42:47.026572847Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:47.027036 containerd[1455]: time="2026-03-06T01:42:47.026669168Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:47.027036 containerd[1455]: time="2026-03-06T01:42:47.026688624Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.027036 containerd[1455]: time="2026-03-06T01:42:47.026872175Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.052410 kubelet[2506]: I0306 01:42:47.052374 2506 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ea7565-b32d-4a81-801b-58416cb42d38" path="/var/lib/kubelet/pods/85ea7565-b32d-4a81-801b-58416cb42d38/volumes" Mar 6 01:42:47.093643 systemd-networkd[1388]: cali36009d6b7e9: Link UP Mar 6 01:42:47.094946 systemd-networkd[1388]: cali36009d6b7e9: Gained carrier Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.493 [ERROR][4084] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.528 [INFO][4084] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--rdfzk-eth0 coredns-66bc5c9577- kube-system 4e0c3015-6a1a-484c-97f5-39c6519fa25a 962 0 2026-03-06 01:42:13 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-rdfzk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali36009d6b7e9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.529 [INFO][4084] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.616 [INFO][4152] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" HandleID="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.626 [INFO][4152] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" HandleID="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00057a310), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-rdfzk", "timestamp":"2026-03-06 01:42:46.616100793 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0005329a0)} Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.626 [INFO][4152] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.793 [INFO][4152] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.793 [INFO][4152] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.814 [INFO][4152] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.862 [INFO][4152] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.907 [INFO][4152] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.926 [INFO][4152] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.934 [INFO][4152] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.934 [INFO][4152] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.938 [INFO][4152] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 
01:42:46.948 [INFO][4152] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.965 [INFO][4152] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.966 [INFO][4152] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" host="localhost" Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.966 [INFO][4152] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:47.118214 containerd[1455]: 2026-03-06 01:42:46.967 [INFO][4152] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" HandleID="k8s-pod-network.f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:47.119474 containerd[1455]: 2026-03-06 01:42:47.006 [INFO][4084] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--rdfzk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4e0c3015-6a1a-484c-97f5-39c6519fa25a", ResourceVersion:"962", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-rdfzk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali36009d6b7e9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.119474 containerd[1455]: 2026-03-06 01:42:47.011 [INFO][4084] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 
6 01:42:47.119474 containerd[1455]: 2026-03-06 01:42:47.011 [INFO][4084] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali36009d6b7e9 ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:47.119474 containerd[1455]: 2026-03-06 01:42:47.094 [INFO][4084] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:47.119474 containerd[1455]: 2026-03-06 01:42:47.094 [INFO][4084] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--rdfzk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4e0c3015-6a1a-484c-97f5-39c6519fa25a", ResourceVersion:"962", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", 
ContainerID:"f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed", Pod:"coredns-66bc5c9577-rdfzk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali36009d6b7e9", MAC:"66:48:e3:b0:1f:5e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.119474 containerd[1455]: 2026-03-06 01:42:47.110 [INFO][4084] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed" Namespace="kube-system" Pod="coredns-66bc5c9577-rdfzk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:42:47.160982 systemd[1]: Started cri-containerd-7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910.scope - libcontainer container 7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910. 
Mar 6 01:42:47.163940 containerd[1455]: time="2026-03-06T01:42:47.163380702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cf9ccfb48-6j6fp,Uid:8942173a-761b-4bcd-8c98-18f1fc6f16e7,Namespace:calico-system,Attempt:0,}" Mar 6 01:42:47.165339 containerd[1455]: time="2026-03-06T01:42:47.165173935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fnx6s,Uid:2402e1ec-ea50-4f17-85cf-f279f0f10494,Namespace:calico-system,Attempt:1,} returns sandbox id \"63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a\"" Mar 6 01:42:47.250073 systemd-networkd[1388]: cali7178df35d0b: Link UP Mar 6 01:42:47.260213 systemd-networkd[1388]: cali7178df35d0b: Gained carrier Mar 6 01:42:47.280435 systemd[1]: var-lib-kubelet-pods-85ea7565\x2db32d\x2d4a81\x2d801b\x2d58416cb42d38-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dfp298.mount: Deactivated successfully. Mar 6 01:42:47.280559 systemd[1]: var-lib-kubelet-pods-85ea7565\x2db32d\x2d4a81\x2d801b\x2d58416cb42d38-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 6 01:42:47.287537 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.300 [ERROR][4003] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.339 [INFO][4003] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0 calico-apiserver-c7b968c57- calico-system 5e1355ef-1fae-411e-a25e-19a787706802 956 0 2026-03-06 01:42:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c7b968c57 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-c7b968c57-g2k99 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali7178df35d0b [] [] }} ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.339 [INFO][4003] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.600 [INFO][4110] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" 
HandleID="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.622 [INFO][4110] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" HandleID="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000402a50), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-apiserver-c7b968c57-g2k99", "timestamp":"2026-03-06 01:42:46.600935743 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001bb8c0)} Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.631 [INFO][4110] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.968 [INFO][4110] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.968 [INFO][4110] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:46.978 [INFO][4110] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.023 [INFO][4110] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.097 [INFO][4110] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.111 [INFO][4110] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.115 [INFO][4110] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.117 [INFO][4110] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.133 [INFO][4110] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881 Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.172 [INFO][4110] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.192 [INFO][4110] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.194 [INFO][4110] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" host="localhost" Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.198 [INFO][4110] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:47.302453 containerd[1455]: 2026-03-06 01:42:47.198 [INFO][4110] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" HandleID="k8s-pod-network.aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.303114 containerd[1455]: 2026-03-06 01:42:47.229 [INFO][4003] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"5e1355ef-1fae-411e-a25e-19a787706802", ResourceVersion:"956", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-c7b968c57-g2k99", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7178df35d0b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.303114 containerd[1455]: 2026-03-06 01:42:47.229 [INFO][4003] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.303114 containerd[1455]: 2026-03-06 01:42:47.229 [INFO][4003] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7178df35d0b ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.303114 containerd[1455]: 2026-03-06 01:42:47.264 [INFO][4003] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.303114 containerd[1455]: 2026-03-06 01:42:47.267 [INFO][4003] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"5e1355ef-1fae-411e-a25e-19a787706802", ResourceVersion:"956", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881", Pod:"calico-apiserver-c7b968c57-g2k99", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7178df35d0b", MAC:"4a:9a:95:2d:71:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.303114 containerd[1455]: 2026-03-06 01:42:47.293 [INFO][4003] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881" 
Namespace="calico-system" Pod="calico-apiserver-c7b968c57-g2k99" WorkloadEndpoint="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:42:47.355844 containerd[1455]: time="2026-03-06T01:42:47.352029602Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:47.355844 containerd[1455]: time="2026-03-06T01:42:47.352153222Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:47.355844 containerd[1455]: time="2026-03-06T01:42:47.352176976Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.355844 containerd[1455]: time="2026-03-06T01:42:47.353133131Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.368063 systemd-networkd[1388]: calif0d39e01b10: Link UP Mar 6 01:42:47.382879 systemd-networkd[1388]: calif0d39e01b10: Gained carrier Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:46.316 [ERROR][4053] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:46.438 [INFO][4053] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--t4vmj-eth0 coredns-66bc5c9577- kube-system e4692164-1dad-4fc3-ad4b-fb8a4d587f00 960 0 2026-03-06 01:42:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-t4vmj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] 
calif0d39e01b10 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:46.443 [INFO][4053] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:46.628 [INFO][4131] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" HandleID="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:46.653 [INFO][4131] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" HandleID="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ce1c0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-t4vmj", "timestamp":"2026-03-06 01:42:46.628726474 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001e62c0)} Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:46.653 [INFO][4131] ipam/ipam_plugin.go 438: 
About to acquire host-wide IPAM lock. Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.204 [INFO][4131] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.205 [INFO][4131] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.210 [INFO][4131] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.232 [INFO][4131] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.246 [INFO][4131] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.264 [INFO][4131] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.280 [INFO][4131] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.280 [INFO][4131] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.296 [INFO][4131] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8 Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.309 [INFO][4131] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 
01:42:47.324 [INFO][4131] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.324 [INFO][4131] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" host="localhost" Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.324 [INFO][4131] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:47.437401 containerd[1455]: 2026-03-06 01:42:47.324 [INFO][4131] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" HandleID="k8s-pod-network.84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.438128 containerd[1455]: 2026-03-06 01:42:47.354 [INFO][4053] cni-plugin/k8s.go 418: Populated endpoint ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--t4vmj-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"e4692164-1dad-4fc3-ad4b-fb8a4d587f00", ResourceVersion:"960", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-t4vmj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0d39e01b10", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.438128 containerd[1455]: 2026-03-06 01:42:47.354 [INFO][4053] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.438128 containerd[1455]: 2026-03-06 01:42:47.354 [INFO][4053] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0d39e01b10 ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" 
Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.438128 containerd[1455]: 2026-03-06 01:42:47.384 [INFO][4053] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.438128 containerd[1455]: 2026-03-06 01:42:47.384 [INFO][4053] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--t4vmj-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"e4692164-1dad-4fc3-ad4b-fb8a4d587f00", ResourceVersion:"960", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8", Pod:"coredns-66bc5c9577-t4vmj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0d39e01b10", MAC:"ce:04:db:b8:44:fe", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.438128 containerd[1455]: 2026-03-06 01:42:47.410 [INFO][4053] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8" Namespace="kube-system" Pod="coredns-66bc5c9577-t4vmj" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:42:47.463070 containerd[1455]: time="2026-03-06T01:42:47.463031625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-92fjb,Uid:8b18e331-fe15-4bd5-8fb0-3c314b33990f,Namespace:calico-system,Attempt:1,} returns sandbox id \"7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910\"" Mar 6 01:42:47.541853 systemd[1]: run-containerd-runc-k8s.io-f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed-runc.t0fv62.mount: Deactivated successfully. Mar 6 01:42:47.546583 containerd[1455]: time="2026-03-06T01:42:47.540673844Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:47.546583 containerd[1455]: time="2026-03-06T01:42:47.540741359Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:47.546583 containerd[1455]: time="2026-03-06T01:42:47.540822010Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.546583 containerd[1455]: time="2026-03-06T01:42:47.540949158Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.557560 systemd[1]: Started cri-containerd-f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed.scope - libcontainer container f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed. Mar 6 01:42:47.622603 systemd[1]: Started cri-containerd-aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881.scope - libcontainer container aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881. Mar 6 01:42:47.628642 systemd-networkd[1388]: calif836518fd78: Link UP Mar 6 01:42:47.642105 systemd-networkd[1388]: calif836518fd78: Gained carrier Mar 6 01:42:47.667828 containerd[1455]: time="2026-03-06T01:42:47.666450914Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:47.667828 containerd[1455]: time="2026-03-06T01:42:47.666518820Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:47.667999 containerd[1455]: time="2026-03-06T01:42:47.667846377Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.668102 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:47.672161 containerd[1455]: time="2026-03-06T01:42:47.668074073Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:46.542 [ERROR][4075] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:46.573 [INFO][4075] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0 calico-kube-controllers-5764cfbc4b- calico-system 2864938a-9ea0-4ee9-987a-e214cf44a87b 964 0 2026-03-06 01:42:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5764cfbc4b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5764cfbc4b-xwn5j eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif836518fd78 [] [] }} ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:46.573 [INFO][4075] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:46.678 [INFO][4167] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" HandleID="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:46.712 [INFO][4167] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" HandleID="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5af0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5764cfbc4b-xwn5j", "timestamp":"2026-03-06 01:42:46.678379048 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004418c0)} Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:46.713 [INFO][4167] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.326 [INFO][4167] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.326 [INFO][4167] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.339 [INFO][4167] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.366 [INFO][4167] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.395 [INFO][4167] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.417 [INFO][4167] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.423 [INFO][4167] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.423 [INFO][4167] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.429 [INFO][4167] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2 Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.445 [INFO][4167] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.465 [INFO][4167] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.479 [INFO][4167] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" host="localhost" Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.484 [INFO][4167] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:47.672161 containerd[1455]: 2026-03-06 01:42:47.488 [INFO][4167] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" HandleID="k8s-pod-network.3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.673989 containerd[1455]: 2026-03-06 01:42:47.578 [INFO][4075] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0", GenerateName:"calico-kube-controllers-5764cfbc4b-", Namespace:"calico-system", SelfLink:"", UID:"2864938a-9ea0-4ee9-987a-e214cf44a87b", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5764cfbc4b", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5764cfbc4b-xwn5j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif836518fd78", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.673989 containerd[1455]: 2026-03-06 01:42:47.579 [INFO][4075] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.673989 containerd[1455]: 2026-03-06 01:42:47.579 [INFO][4075] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif836518fd78 ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.673989 containerd[1455]: 2026-03-06 01:42:47.646 [INFO][4075] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.673989 containerd[1455]: 2026-03-06 
01:42:47.647 [INFO][4075] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0", GenerateName:"calico-kube-controllers-5764cfbc4b-", Namespace:"calico-system", SelfLink:"", UID:"2864938a-9ea0-4ee9-987a-e214cf44a87b", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5764cfbc4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2", Pod:"calico-kube-controllers-5764cfbc4b-xwn5j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif836518fd78", MAC:"ba:f2:f1:63:3d:c1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.673989 containerd[1455]: 2026-03-06 
01:42:47.660 [INFO][4075] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2" Namespace="calico-system" Pod="calico-kube-controllers-5764cfbc4b-xwn5j" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:42:47.761811 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:47.782082 containerd[1455]: time="2026-03-06T01:42:47.781949713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rdfzk,Uid:4e0c3015-6a1a-484c-97f5-39c6519fa25a,Namespace:kube-system,Attempt:1,} returns sandbox id \"f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed\"" Mar 6 01:42:47.786415 kubelet[2506]: E0306 01:42:47.786374 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:47.808583 systemd[1]: Started cri-containerd-84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8.scope - libcontainer container 84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8. 
Mar 6 01:42:47.820228 containerd[1455]: time="2026-03-06T01:42:47.818061702Z" level=info msg="CreateContainer within sandbox \"f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 01:42:47.830412 systemd-networkd[1388]: cali7951b32369b: Link UP Mar 6 01:42:47.832225 systemd-networkd[1388]: cali7951b32369b: Gained carrier Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.318 [ERROR][4371] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.380 [INFO][4371] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0 whisker-7cf9ccfb48- calico-system 8942173a-761b-4bcd-8c98-18f1fc6f16e7 991 0 2026-03-06 01:42:46 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7cf9ccfb48 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7cf9ccfb48-6j6fp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali7951b32369b [] [] }} ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.380 [INFO][4371] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.668 [INFO][4441] ipam/ipam_plugin.go 235: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" HandleID="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Workload="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.683 [INFO][4441] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" HandleID="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Workload="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233dd0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7cf9ccfb48-6j6fp", "timestamp":"2026-03-06 01:42:47.668972487 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00011fce0)} Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.683 [INFO][4441] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.683 [INFO][4441] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.683 [INFO][4441] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.690 [INFO][4441] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.714 [INFO][4441] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.737 [INFO][4441] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.745 [INFO][4441] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.754 [INFO][4441] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.754 [INFO][4441] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.756 [INFO][4441] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2 Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.767 [INFO][4441] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.787 [INFO][4441] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.787 [INFO][4441] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" host="localhost" Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.787 [INFO][4441] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:42:47.878040 containerd[1455]: 2026-03-06 01:42:47.787 [INFO][4441] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" HandleID="k8s-pod-network.6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Workload="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.880891 containerd[1455]: 2026-03-06 01:42:47.794 [INFO][4371] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0", GenerateName:"whisker-7cf9ccfb48-", Namespace:"calico-system", SelfLink:"", UID:"8942173a-761b-4bcd-8c98-18f1fc6f16e7", ResourceVersion:"991", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7cf9ccfb48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7cf9ccfb48-6j6fp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7951b32369b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.880891 containerd[1455]: 2026-03-06 01:42:47.794 [INFO][4371] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.880891 containerd[1455]: 2026-03-06 01:42:47.794 [INFO][4371] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7951b32369b ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.880891 containerd[1455]: 2026-03-06 01:42:47.840 [INFO][4371] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.880891 containerd[1455]: 2026-03-06 01:42:47.845 [INFO][4371] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" 
WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0", GenerateName:"whisker-7cf9ccfb48-", Namespace:"calico-system", SelfLink:"", UID:"8942173a-761b-4bcd-8c98-18f1fc6f16e7", ResourceVersion:"991", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7cf9ccfb48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2", Pod:"whisker-7cf9ccfb48-6j6fp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali7951b32369b", MAC:"aa:b6:1c:8c:40:e0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:42:47.880891 containerd[1455]: 2026-03-06 01:42:47.864 [INFO][4371] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2" Namespace="calico-system" Pod="whisker-7cf9ccfb48-6j6fp" WorkloadEndpoint="localhost-k8s-whisker--7cf9ccfb48--6j6fp-eth0" Mar 6 01:42:47.899627 systemd-networkd[1388]: calia9db2015650: Gained IPv6LL Mar 6 01:42:47.958728 systemd-resolved[1389]: Failed 
to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:47.963597 systemd-networkd[1388]: calif260f011e82: Gained IPv6LL Mar 6 01:42:47.989406 containerd[1455]: time="2026-03-06T01:42:47.989108682Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:47.989406 containerd[1455]: time="2026-03-06T01:42:47.989363037Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:47.990204 containerd[1455]: time="2026-03-06T01:42:47.989739159Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:47.991196 containerd[1455]: time="2026-03-06T01:42:47.991156123Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:48.013472 containerd[1455]: time="2026-03-06T01:42:48.013432579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7b968c57-g2k99,Uid:5e1355ef-1fae-411e-a25e-19a787706802,Namespace:calico-system,Attempt:1,} returns sandbox id \"aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881\"" Mar 6 01:42:48.025737 containerd[1455]: time="2026-03-06T01:42:48.025542950Z" level=info msg="CreateContainer within sandbox \"f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c455b71a6f5da7dce41d44ed31adcb54814cb334be2f8a01e48b557aa2a04384\"" Mar 6 01:42:48.027317 containerd[1455]: time="2026-03-06T01:42:48.027096854Z" level=info msg="StartContainer for \"c455b71a6f5da7dce41d44ed31adcb54814cb334be2f8a01e48b557aa2a04384\"" Mar 6 01:42:48.050128 containerd[1455]: time="2026-03-06T01:42:48.048397296Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 6 01:42:48.050667 containerd[1455]: time="2026-03-06T01:42:48.050074325Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 6 01:42:48.050667 containerd[1455]: time="2026-03-06T01:42:48.050109892Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:48.050667 containerd[1455]: time="2026-03-06T01:42:48.050231619Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 6 01:42:48.083484 systemd[1]: Started cri-containerd-6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2.scope - libcontainer container 6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2. Mar 6 01:42:48.105447 systemd[1]: Started cri-containerd-3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2.scope - libcontainer container 3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2. 
Mar 6 01:42:48.139010 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:48.142160 containerd[1455]: time="2026-03-06T01:42:48.141998508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t4vmj,Uid:e4692164-1dad-4fc3-ad4b-fb8a4d587f00,Namespace:kube-system,Attempt:1,} returns sandbox id \"84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8\"" Mar 6 01:42:48.144455 kubelet[2506]: E0306 01:42:48.144050 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:48.156041 containerd[1455]: time="2026-03-06T01:42:48.155706521Z" level=info msg="CreateContainer within sandbox \"84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 01:42:48.171571 systemd-resolved[1389]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 6 01:42:48.190547 containerd[1455]: time="2026-03-06T01:42:48.190373508Z" level=info msg="CreateContainer within sandbox \"84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4ee3bdc7a6e6baa33619accdcb3e192128520a465774f8793fd3d1d2df5a810e\"" Mar 6 01:42:48.196579 containerd[1455]: time="2026-03-06T01:42:48.196384788Z" level=info msg="StartContainer for \"4ee3bdc7a6e6baa33619accdcb3e192128520a465774f8793fd3d1d2df5a810e\"" Mar 6 01:42:48.207444 systemd[1]: Started cri-containerd-c455b71a6f5da7dce41d44ed31adcb54814cb334be2f8a01e48b557aa2a04384.scope - libcontainer container c455b71a6f5da7dce41d44ed31adcb54814cb334be2f8a01e48b557aa2a04384. 
Mar 6 01:42:48.255541 kernel: calico-node[4250]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 6 01:42:48.260151 containerd[1455]: time="2026-03-06T01:42:48.259927928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5764cfbc4b-xwn5j,Uid:2864938a-9ea0-4ee9-987a-e214cf44a87b,Namespace:calico-system,Attempt:1,} returns sandbox id \"3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2\"" Mar 6 01:42:48.295495 containerd[1455]: time="2026-03-06T01:42:48.293026652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cf9ccfb48-6j6fp,Uid:8942173a-761b-4bcd-8c98-18f1fc6f16e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2\"" Mar 6 01:42:48.358875 systemd[1]: Started cri-containerd-4ee3bdc7a6e6baa33619accdcb3e192128520a465774f8793fd3d1d2df5a810e.scope - libcontainer container 4ee3bdc7a6e6baa33619accdcb3e192128520a465774f8793fd3d1d2df5a810e. 
Mar 6 01:42:48.382644 containerd[1455]: time="2026-03-06T01:42:48.382422868Z" level=info msg="StartContainer for \"c455b71a6f5da7dce41d44ed31adcb54814cb334be2f8a01e48b557aa2a04384\" returns successfully" Mar 6 01:42:48.457225 kubelet[2506]: E0306 01:42:48.455840 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:48.487497 containerd[1455]: time="2026-03-06T01:42:48.486994556Z" level=info msg="StartContainer for \"4ee3bdc7a6e6baa33619accdcb3e192128520a465774f8793fd3d1d2df5a810e\" returns successfully" Mar 6 01:42:48.525480 kubelet[2506]: E0306 01:42:48.525360 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:48.555979 kubelet[2506]: I0306 01:42:48.554899 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-rdfzk" podStartSLOduration=35.554883969 podStartE2EDuration="35.554883969s" podCreationTimestamp="2026-03-06 01:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 01:42:48.552858373 +0000 UTC m=+39.702433841" watchObservedRunningTime="2026-03-06 01:42:48.554883969 +0000 UTC m=+39.704459427" Mar 6 01:42:48.597441 containerd[1455]: time="2026-03-06T01:42:48.594977490Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:48.599203 containerd[1455]: time="2026-03-06T01:42:48.599098121Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 6 01:42:48.603674 containerd[1455]: time="2026-03-06T01:42:48.603410308Z" level=info msg="ImageCreate event 
name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:48.621086 containerd[1455]: time="2026-03-06T01:42:48.620645333Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:48.624811 containerd[1455]: time="2026-03-06T01:42:48.624721763Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 2.319539267s" Mar 6 01:42:48.624872 containerd[1455]: time="2026-03-06T01:42:48.624819145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 6 01:42:48.638019 containerd[1455]: time="2026-03-06T01:42:48.637952196Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 6 01:42:48.648855 containerd[1455]: time="2026-03-06T01:42:48.648619915Z" level=info msg="CreateContainer within sandbox \"fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 6 01:42:48.690177 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3542679000.mount: Deactivated successfully. 
Mar 6 01:42:48.696122 containerd[1455]: time="2026-03-06T01:42:48.695692239Z" level=info msg="CreateContainer within sandbox \"fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e4bc7023604e372cfd0b0dec5f2bf3a30eadf215160b628b75fb915f438b2116\"" Mar 6 01:42:48.697865 containerd[1455]: time="2026-03-06T01:42:48.697543926Z" level=info msg="StartContainer for \"e4bc7023604e372cfd0b0dec5f2bf3a30eadf215160b628b75fb915f438b2116\"" Mar 6 01:42:48.766491 systemd[1]: Started cri-containerd-e4bc7023604e372cfd0b0dec5f2bf3a30eadf215160b628b75fb915f438b2116.scope - libcontainer container e4bc7023604e372cfd0b0dec5f2bf3a30eadf215160b628b75fb915f438b2116. Mar 6 01:42:48.794503 systemd-networkd[1388]: cali36009d6b7e9: Gained IPv6LL Mar 6 01:42:48.840986 containerd[1455]: time="2026-03-06T01:42:48.840705101Z" level=info msg="StartContainer for \"e4bc7023604e372cfd0b0dec5f2bf3a30eadf215160b628b75fb915f438b2116\" returns successfully" Mar 6 01:42:48.859047 systemd-networkd[1388]: cali7178df35d0b: Gained IPv6LL Mar 6 01:42:48.986646 systemd-networkd[1388]: calif836518fd78: Gained IPv6LL Mar 6 01:42:49.114941 systemd-networkd[1388]: calif0d39e01b10: Gained IPv6LL Mar 6 01:42:49.245668 systemd-networkd[1388]: cali7951b32369b: Gained IPv6LL Mar 6 01:42:49.302015 systemd-networkd[1388]: vxlan.calico: Link UP Mar 6 01:42:49.302069 systemd-networkd[1388]: vxlan.calico: Gained carrier Mar 6 01:42:49.554373 kubelet[2506]: E0306 01:42:49.553996 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:49.574181 kubelet[2506]: E0306 01:42:49.574051 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:49.584824 kubelet[2506]: I0306 
01:42:49.583977 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-t4vmj" podStartSLOduration=36.583957539 podStartE2EDuration="36.583957539s" podCreationTimestamp="2026-03-06 01:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 01:42:48.582910769 +0000 UTC m=+39.732486227" watchObservedRunningTime="2026-03-06 01:42:49.583957539 +0000 UTC m=+40.733533037" Mar 6 01:42:50.234179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3674453789.mount: Deactivated successfully. Mar 6 01:42:50.583433 kubelet[2506]: E0306 01:42:50.582812 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:50.583433 kubelet[2506]: E0306 01:42:50.583324 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 6 01:42:50.651467 systemd-networkd[1388]: vxlan.calico: Gained IPv6LL Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.924160058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.925930923Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.927653539Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.933389615Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.934742346Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 2.296749575s"
Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.934860246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\""
Mar 6 01:42:50.937727 containerd[1455]: time="2026-03-06T01:42:50.937138751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 6 01:42:50.948432 containerd[1455]: time="2026-03-06T01:42:50.945726338Z" level=info msg="CreateContainer within sandbox \"63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 6 01:42:50.978927 containerd[1455]: time="2026-03-06T01:42:50.978834630Z" level=info msg="CreateContainer within sandbox \"63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1b67c6c5ad5a788fe4a6f598ed9db1e3a6a9bfd8affd4554b0fe123993eba5a7\""
Mar 6 01:42:50.980223 containerd[1455]: time="2026-03-06T01:42:50.980020038Z" level=info msg="StartContainer for \"1b67c6c5ad5a788fe4a6f598ed9db1e3a6a9bfd8affd4554b0fe123993eba5a7\""
Mar 6 01:42:51.072688 systemd[1]: Started cri-containerd-1b67c6c5ad5a788fe4a6f598ed9db1e3a6a9bfd8affd4554b0fe123993eba5a7.scope - libcontainer container 1b67c6c5ad5a788fe4a6f598ed9db1e3a6a9bfd8affd4554b0fe123993eba5a7.
Mar 6 01:42:51.153648 containerd[1455]: time="2026-03-06T01:42:51.153598284Z" level=info msg="StartContainer for \"1b67c6c5ad5a788fe4a6f598ed9db1e3a6a9bfd8affd4554b0fe123993eba5a7\" returns successfully"
Mar 6 01:42:51.593060 kubelet[2506]: E0306 01:42:51.592941 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:51.593604 kubelet[2506]: E0306 01:42:51.593128 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:42:52.993919 containerd[1455]: time="2026-03-06T01:42:52.993612934Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:52.995729 containerd[1455]: time="2026-03-06T01:42:52.995592699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780"
Mar 6 01:42:52.998054 containerd[1455]: time="2026-03-06T01:42:52.997847523Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:53.001811 containerd[1455]: time="2026-03-06T01:42:53.001706441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:53.003438 containerd[1455]: time="2026-03-06T01:42:53.003241554Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 2.066066695s"
Mar 6 01:42:53.003506 containerd[1455]: time="2026-03-06T01:42:53.003436667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\""
Mar 6 01:42:53.005317 containerd[1455]: time="2026-03-06T01:42:53.005152941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 6 01:42:53.011326 containerd[1455]: time="2026-03-06T01:42:53.011134664Z" level=info msg="CreateContainer within sandbox \"7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 01:42:53.036641 containerd[1455]: time="2026-03-06T01:42:53.036543062Z" level=info msg="CreateContainer within sandbox \"7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ef046a4f666e12512525ee9f9afefe3cb50117e63a84ccd5d39ecc07d16f8add\""
Mar 6 01:42:53.038424 containerd[1455]: time="2026-03-06T01:42:53.038139998Z" level=info msg="StartContainer for \"ef046a4f666e12512525ee9f9afefe3cb50117e63a84ccd5d39ecc07d16f8add\""
Mar 6 01:42:53.098896 systemd[1]: Started cri-containerd-ef046a4f666e12512525ee9f9afefe3cb50117e63a84ccd5d39ecc07d16f8add.scope - libcontainer container ef046a4f666e12512525ee9f9afefe3cb50117e63a84ccd5d39ecc07d16f8add.
Mar 6 01:42:53.116606 containerd[1455]: time="2026-03-06T01:42:53.116503013Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:53.118542 containerd[1455]: time="2026-03-06T01:42:53.118455057Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77"
Mar 6 01:42:53.122099 containerd[1455]: time="2026-03-06T01:42:53.121865784Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 116.629357ms"
Mar 6 01:42:53.122099 containerd[1455]: time="2026-03-06T01:42:53.121938680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\""
Mar 6 01:42:53.125385 containerd[1455]: time="2026-03-06T01:42:53.125219233Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\""
Mar 6 01:42:53.130593 containerd[1455]: time="2026-03-06T01:42:53.130398120Z" level=info msg="CreateContainer within sandbox \"aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 01:42:53.153644 containerd[1455]: time="2026-03-06T01:42:53.153206655Z" level=info msg="CreateContainer within sandbox \"aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8fd28b341861e7882e5b95b703906c050c2d4ee5847cd9a2f4439cd77dbf4249\""
Mar 6 01:42:53.156496 containerd[1455]: time="2026-03-06T01:42:53.156408387Z" level=info msg="StartContainer for \"8fd28b341861e7882e5b95b703906c050c2d4ee5847cd9a2f4439cd77dbf4249\""
Mar 6 01:42:53.185152 containerd[1455]: time="2026-03-06T01:42:53.184897215Z" level=info msg="StartContainer for \"ef046a4f666e12512525ee9f9afefe3cb50117e63a84ccd5d39ecc07d16f8add\" returns successfully"
Mar 6 01:42:53.208525 systemd[1]: Started cri-containerd-8fd28b341861e7882e5b95b703906c050c2d4ee5847cd9a2f4439cd77dbf4249.scope - libcontainer container 8fd28b341861e7882e5b95b703906c050c2d4ee5847cd9a2f4439cd77dbf4249.
Mar 6 01:42:53.285087 containerd[1455]: time="2026-03-06T01:42:53.284905242Z" level=info msg="StartContainer for \"8fd28b341861e7882e5b95b703906c050c2d4ee5847cd9a2f4439cd77dbf4249\" returns successfully"
Mar 6 01:42:53.649198 kubelet[2506]: I0306 01:42:53.646141 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-fnx6s" podStartSLOduration=24.877976204 podStartE2EDuration="28.646119245s" podCreationTimestamp="2026-03-06 01:42:25 +0000 UTC" firstStartedPulling="2026-03-06 01:42:47.168559584 +0000 UTC m=+38.318135043" lastFinishedPulling="2026-03-06 01:42:50.936702626 +0000 UTC m=+42.086278084" observedRunningTime="2026-03-06 01:42:51.645553875 +0000 UTC m=+42.795129333" watchObservedRunningTime="2026-03-06 01:42:53.646119245 +0000 UTC m=+44.795694704"
Mar 6 01:42:53.674834 kubelet[2506]: I0306 01:42:53.674484 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-c7b968c57-92fjb" podStartSLOduration=24.135155794 podStartE2EDuration="29.674468964s" podCreationTimestamp="2026-03-06 01:42:24 +0000 UTC" firstStartedPulling="2026-03-06 01:42:47.465608144 +0000 UTC m=+38.615183603" lastFinishedPulling="2026-03-06 01:42:53.004921316 +0000 UTC m=+44.154496773" observedRunningTime="2026-03-06 01:42:53.671058979 +0000 UTC m=+44.820634447" watchObservedRunningTime="2026-03-06 01:42:53.674468964 +0000 UTC m=+44.824044423"
Mar 6 01:42:53.674834 kubelet[2506]: I0306 01:42:53.674623 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-c7b968c57-g2k99" podStartSLOduration=24.567965933 podStartE2EDuration="29.674617271s" podCreationTimestamp="2026-03-06 01:42:24 +0000 UTC" firstStartedPulling="2026-03-06 01:42:48.017512264 +0000 UTC m=+39.167087722" lastFinishedPulling="2026-03-06 01:42:53.124163602 +0000 UTC m=+44.273739060" observedRunningTime="2026-03-06 01:42:53.65270541 +0000 UTC m=+44.802280878" watchObservedRunningTime="2026-03-06 01:42:53.674617271 +0000 UTC m=+44.824192730"
Mar 6 01:42:54.628001 kubelet[2506]: I0306 01:42:54.627888 2506 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 6 01:42:54.630361 kubelet[2506]: I0306 01:42:54.629401 2506 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 6 01:42:55.185695 containerd[1455]: time="2026-03-06T01:42:55.185595684Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:55.187021 containerd[1455]: time="2026-03-06T01:42:55.186884251Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348"
Mar 6 01:42:55.206226 containerd[1455]: time="2026-03-06T01:42:55.206075323Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:55.209665 containerd[1455]: time="2026-03-06T01:42:55.209560246Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:55.210405 containerd[1455]: time="2026-03-06T01:42:55.210360142Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 2.085111424s"
Mar 6 01:42:55.210465 containerd[1455]: time="2026-03-06T01:42:55.210412159Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\""
Mar 6 01:42:55.211896 containerd[1455]: time="2026-03-06T01:42:55.211692943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\""
Mar 6 01:42:55.230951 containerd[1455]: time="2026-03-06T01:42:55.230885912Z" level=info msg="CreateContainer within sandbox \"3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 6 01:42:55.259079 containerd[1455]: time="2026-03-06T01:42:55.259000296Z" level=info msg="CreateContainer within sandbox \"3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2760661ec1810eaca8578798cc9e6424a6382eec75218276dddfba27dc5ab217\""
Mar 6 01:42:55.260997 containerd[1455]: time="2026-03-06T01:42:55.259902608Z" level=info msg="StartContainer for \"2760661ec1810eaca8578798cc9e6424a6382eec75218276dddfba27dc5ab217\""
Mar 6 01:42:55.368587 systemd[1]: Started cri-containerd-2760661ec1810eaca8578798cc9e6424a6382eec75218276dddfba27dc5ab217.scope - libcontainer container 2760661ec1810eaca8578798cc9e6424a6382eec75218276dddfba27dc5ab217.
Mar 6 01:42:55.437981 containerd[1455]: time="2026-03-06T01:42:55.437784481Z" level=info msg="StartContainer for \"2760661ec1810eaca8578798cc9e6424a6382eec75218276dddfba27dc5ab217\" returns successfully"
Mar 6 01:42:55.721164 kubelet[2506]: I0306 01:42:55.720886 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5764cfbc4b-xwn5j" podStartSLOduration=23.792873911 podStartE2EDuration="30.720870465s" podCreationTimestamp="2026-03-06 01:42:25 +0000 UTC" firstStartedPulling="2026-03-06 01:42:48.283528855 +0000 UTC m=+39.433104313" lastFinishedPulling="2026-03-06 01:42:55.211525409 +0000 UTC m=+46.361100867" observedRunningTime="2026-03-06 01:42:55.655515425 +0000 UTC m=+46.805090882" watchObservedRunningTime="2026-03-06 01:42:55.720870465 +0000 UTC m=+46.870445923"
Mar 6 01:42:56.935955 containerd[1455]: time="2026-03-06T01:42:56.935864964Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:56.973144 containerd[1455]: time="2026-03-06T01:42:56.972980145Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889"
Mar 6 01:42:56.975209 containerd[1455]: time="2026-03-06T01:42:56.975080005Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:56.979138 containerd[1455]: time="2026-03-06T01:42:56.979043901Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:56.980241 containerd[1455]: time="2026-03-06T01:42:56.980167443Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.768444955s"
Mar 6 01:42:56.980370 containerd[1455]: time="2026-03-06T01:42:56.980238467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\""
Mar 6 01:42:56.981927 containerd[1455]: time="2026-03-06T01:42:56.981539533Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 6 01:42:56.985179 containerd[1455]: time="2026-03-06T01:42:56.985156418Z" level=info msg="CreateContainer within sandbox \"6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Mar 6 01:42:57.009641 containerd[1455]: time="2026-03-06T01:42:57.009570288Z" level=info msg="CreateContainer within sandbox \"6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"83747bb3d3505ca9c301cffd9061a8787f196483a2166aa858cb313d7ab69928\""
Mar 6 01:42:57.010580 containerd[1455]: time="2026-03-06T01:42:57.010537760Z" level=info msg="StartContainer for \"83747bb3d3505ca9c301cffd9061a8787f196483a2166aa858cb313d7ab69928\""
Mar 6 01:42:57.071467 systemd[1]: Started cri-containerd-83747bb3d3505ca9c301cffd9061a8787f196483a2166aa858cb313d7ab69928.scope - libcontainer container 83747bb3d3505ca9c301cffd9061a8787f196483a2166aa858cb313d7ab69928.
Mar 6 01:42:57.141728 containerd[1455]: time="2026-03-06T01:42:57.134218023Z" level=info msg="StartContainer for \"83747bb3d3505ca9c301cffd9061a8787f196483a2166aa858cb313d7ab69928\" returns successfully"
Mar 6 01:42:58.307748 containerd[1455]: time="2026-03-06T01:42:58.307671994Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:58.308967 containerd[1455]: time="2026-03-06T01:42:58.308922217Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317"
Mar 6 01:42:58.311124 containerd[1455]: time="2026-03-06T01:42:58.311048924Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:58.314542 containerd[1455]: time="2026-03-06T01:42:58.314477628Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:58.315457 containerd[1455]: time="2026-03-06T01:42:58.315410366Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.333836108s"
Mar 6 01:42:58.315588 containerd[1455]: time="2026-03-06T01:42:58.315461512Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\""
Mar 6 01:42:58.316882 containerd[1455]: time="2026-03-06T01:42:58.316854001Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\""
Mar 6 01:42:58.322170 containerd[1455]: time="2026-03-06T01:42:58.322107137Z" level=info msg="CreateContainer within sandbox \"fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 6 01:42:58.368217 containerd[1455]: time="2026-03-06T01:42:58.368105670Z" level=info msg="CreateContainer within sandbox \"fa5f6994cfb320754a12e73b89eee27084eb716c750058939ac6f9cbaf78d029\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"716f8fd87a0a5a0afd4b75ba77ce133186ececf87a9651a8e98af3eb3d045637\""
Mar 6 01:42:58.369151 containerd[1455]: time="2026-03-06T01:42:58.369072584Z" level=info msg="StartContainer for \"716f8fd87a0a5a0afd4b75ba77ce133186ececf87a9651a8e98af3eb3d045637\""
Mar 6 01:42:58.411995 systemd[1]: run-containerd-runc-k8s.io-716f8fd87a0a5a0afd4b75ba77ce133186ececf87a9651a8e98af3eb3d045637-runc.BuvOSg.mount: Deactivated successfully.
Mar 6 01:42:58.434568 systemd[1]: Started cri-containerd-716f8fd87a0a5a0afd4b75ba77ce133186ececf87a9651a8e98af3eb3d045637.scope - libcontainer container 716f8fd87a0a5a0afd4b75ba77ce133186ececf87a9651a8e98af3eb3d045637.
Mar 6 01:42:58.481503 containerd[1455]: time="2026-03-06T01:42:58.481409726Z" level=info msg="StartContainer for \"716f8fd87a0a5a0afd4b75ba77ce133186ececf87a9651a8e98af3eb3d045637\" returns successfully"
Mar 6 01:42:58.661719 kubelet[2506]: I0306 01:42:58.661624 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-p9qmk" podStartSLOduration=21.647516781 podStartE2EDuration="33.661607264s" podCreationTimestamp="2026-03-06 01:42:25 +0000 UTC" firstStartedPulling="2026-03-06 01:42:46.302415181 +0000 UTC m=+37.451990639" lastFinishedPulling="2026-03-06 01:42:58.316505664 +0000 UTC m=+49.466081122" observedRunningTime="2026-03-06 01:42:58.661030997 +0000 UTC m=+49.810606455" watchObservedRunningTime="2026-03-06 01:42:58.661607264 +0000 UTC m=+49.811182721"
Mar 6 01:42:59.157466 kubelet[2506]: I0306 01:42:59.157369 2506 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 6 01:42:59.158529 kubelet[2506]: I0306 01:42:59.158472 2506 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 6 01:42:59.288930 containerd[1455]: time="2026-03-06T01:42:59.288847852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:59.290002 containerd[1455]: time="2026-03-06T01:42:59.289964175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475"
Mar 6 01:42:59.291593 containerd[1455]: time="2026-03-06T01:42:59.291535318Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:59.294627 containerd[1455]: time="2026-03-06T01:42:59.294583815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 01:42:59.295496 containerd[1455]: time="2026-03-06T01:42:59.295444901Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 978.560554ms"
Mar 6 01:42:59.295496 containerd[1455]: time="2026-03-06T01:42:59.295491388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\""
Mar 6 01:42:59.302435 containerd[1455]: time="2026-03-06T01:42:59.302400156Z" level=info msg="CreateContainer within sandbox \"6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Mar 6 01:42:59.320092 containerd[1455]: time="2026-03-06T01:42:59.320036020Z" level=info msg="CreateContainer within sandbox \"6248e8c2c516d334654944bcac37350f137c61928d71aef459b15723510844a2\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"097c2f2440e036b8a47f33c6fc8bf0214c4e1aa9ce21f05b7c96f40c3b0f41b6\""
Mar 6 01:42:59.320839 containerd[1455]: time="2026-03-06T01:42:59.320807501Z" level=info msg="StartContainer for \"097c2f2440e036b8a47f33c6fc8bf0214c4e1aa9ce21f05b7c96f40c3b0f41b6\""
Mar 6 01:42:59.364400 systemd[1]: Started cri-containerd-097c2f2440e036b8a47f33c6fc8bf0214c4e1aa9ce21f05b7c96f40c3b0f41b6.scope - libcontainer container 097c2f2440e036b8a47f33c6fc8bf0214c4e1aa9ce21f05b7c96f40c3b0f41b6.
Mar 6 01:42:59.420557 containerd[1455]: time="2026-03-06T01:42:59.420219282Z" level=info msg="StartContainer for \"097c2f2440e036b8a47f33c6fc8bf0214c4e1aa9ce21f05b7c96f40c3b0f41b6\" returns successfully"
Mar 6 01:43:02.309656 systemd[1]: Started sshd@7-10.0.0.120:22-10.0.0.1:50500.service - OpenSSH per-connection server daemon (10.0.0.1:50500).
Mar 6 01:43:02.412513 sshd[5382]: Accepted publickey for core from 10.0.0.1 port 50500 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:02.415451 sshd[5382]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:02.422905 systemd-logind[1440]: New session 8 of user core.
Mar 6 01:43:02.430547 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 6 01:43:02.864484 sshd[5382]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:02.869343 systemd[1]: sshd@7-10.0.0.120:22-10.0.0.1:50500.service: Deactivated successfully.
Mar 6 01:43:02.871678 systemd[1]: session-8.scope: Deactivated successfully.
Mar 6 01:43:02.872648 systemd-logind[1440]: Session 8 logged out. Waiting for processes to exit.
Mar 6 01:43:02.874396 systemd-logind[1440]: Removed session 8.
Mar 6 01:43:06.325103 kubelet[2506]: I0306 01:43:06.324909 2506 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 6 01:43:06.358650 kubelet[2506]: I0306 01:43:06.358441 2506 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7cf9ccfb48-6j6fp" podStartSLOduration=9.365758048 podStartE2EDuration="20.358423574s" podCreationTimestamp="2026-03-06 01:42:46 +0000 UTC" firstStartedPulling="2026-03-06 01:42:48.303682872 +0000 UTC m=+39.453258331" lastFinishedPulling="2026-03-06 01:42:59.296348399 +0000 UTC m=+50.445923857" observedRunningTime="2026-03-06 01:42:59.666471658 +0000 UTC m=+50.816047116" watchObservedRunningTime="2026-03-06 01:43:06.358423574 +0000 UTC m=+57.507999062"
Mar 6 01:43:07.877399 systemd[1]: Started sshd@8-10.0.0.120:22-10.0.0.1:50508.service - OpenSSH per-connection server daemon (10.0.0.1:50508).
Mar 6 01:43:07.928881 sshd[5407]: Accepted publickey for core from 10.0.0.1 port 50508 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:07.930569 sshd[5407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:07.936232 systemd-logind[1440]: New session 9 of user core.
Mar 6 01:43:07.944457 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 6 01:43:08.101134 sshd[5407]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:08.106888 systemd[1]: sshd@8-10.0.0.120:22-10.0.0.1:50508.service: Deactivated successfully.
Mar 6 01:43:08.110022 systemd[1]: session-9.scope: Deactivated successfully.
Mar 6 01:43:08.112313 systemd-logind[1440]: Session 9 logged out. Waiting for processes to exit.
Mar 6 01:43:08.114147 systemd-logind[1440]: Removed session 9.
Mar 6 01:43:08.987817 containerd[1455]: time="2026-03-06T01:43:08.987208131Z" level=info msg="StopPodSandbox for \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\""
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.116 [WARNING][5431] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"8b18e331-fe15-4bd5-8fb0-3c314b33990f", ResourceVersion:"1084", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910", Pod:"calico-apiserver-c7b968c57-92fjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif260f011e82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.117 [INFO][5431] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.117 [INFO][5431] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" iface="eth0" netns=""
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.117 [INFO][5431] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.117 [INFO][5431] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.199 [INFO][5442] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0"
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.199 [INFO][5442] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.199 [INFO][5442] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.209 [WARNING][5442] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0"
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.209 [INFO][5442] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0"
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.212 [INFO][5442] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 6 01:43:09.220880 containerd[1455]: 2026-03-06 01:43:09.216 [INFO][5431] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.228829 containerd[1455]: time="2026-03-06T01:43:09.228700914Z" level=info msg="TearDown network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\" successfully"
Mar 6 01:43:09.228996 containerd[1455]: time="2026-03-06T01:43:09.228835375Z" level=info msg="StopPodSandbox for \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\" returns successfully"
Mar 6 01:43:09.274627 containerd[1455]: time="2026-03-06T01:43:09.274406761Z" level=info msg="RemovePodSandbox for \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\""
Mar 6 01:43:09.278183 containerd[1455]: time="2026-03-06T01:43:09.278118747Z" level=info msg="Forcibly stopping sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\""
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.347 [WARNING][5459] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"8b18e331-fe15-4bd5-8fb0-3c314b33990f", ResourceVersion:"1084", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7e3066e1cc66150e87795028f7080b9b80f9871c6f3dae527a066ce3e6641910", Pod:"calico-apiserver-c7b968c57-92fjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calif260f011e82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.348 [INFO][5459] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.348 [INFO][5459] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" iface="eth0" netns=""
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.348 [INFO][5459] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.348 [INFO][5459] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.395 [INFO][5467] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0"
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.395 [INFO][5467] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.395 [INFO][5467] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.415 [WARNING][5467] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0"
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.415 [INFO][5467] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" HandleID="k8s-pod-network.d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8" Workload="localhost-k8s-calico--apiserver--c7b968c57--92fjb-eth0"
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.425 [INFO][5467] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 6 01:43:09.434403 containerd[1455]: 2026-03-06 01:43:09.429 [INFO][5459] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8"
Mar 6 01:43:09.434403 containerd[1455]: time="2026-03-06T01:43:09.434361189Z" level=info msg="TearDown network for sandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\" successfully"
Mar 6 01:43:09.460042 containerd[1455]: time="2026-03-06T01:43:09.459887838Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 6 01:43:09.460199 containerd[1455]: time="2026-03-06T01:43:09.460049310Z" level=info msg="RemovePodSandbox \"d539c1ced688e3061f9c2add9a1a5b28c9f0c352616cba968cbc0905f7ebbdb8\" returns successfully" Mar 6 01:43:09.467446 containerd[1455]: time="2026-03-06T01:43:09.467415050Z" level=info msg="StopPodSandbox for \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\"" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.524 [WARNING][5484] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" WorkloadEndpoint="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.524 [INFO][5484] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.524 [INFO][5484] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" iface="eth0" netns="" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.524 [INFO][5484] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.524 [INFO][5484] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.559 [INFO][5492] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.559 [INFO][5492] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.559 [INFO][5492] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.576 [WARNING][5492] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.576 [INFO][5492] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.580 [INFO][5492] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:09.586880 containerd[1455]: 2026-03-06 01:43:09.584 [INFO][5484] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.586880 containerd[1455]: time="2026-03-06T01:43:09.586824730Z" level=info msg="TearDown network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\" successfully" Mar 6 01:43:09.586880 containerd[1455]: time="2026-03-06T01:43:09.586846691Z" level=info msg="StopPodSandbox for \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\" returns successfully" Mar 6 01:43:09.587643 containerd[1455]: time="2026-03-06T01:43:09.587480445Z" level=info msg="RemovePodSandbox for \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\"" Mar 6 01:43:09.587643 containerd[1455]: time="2026-03-06T01:43:09.587529637Z" level=info msg="Forcibly stopping sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\"" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.634 [WARNING][5510] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" 
WorkloadEndpoint="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.634 [INFO][5510] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.634 [INFO][5510] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" iface="eth0" netns="" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.634 [INFO][5510] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.634 [INFO][5510] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.672 [INFO][5518] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.673 [INFO][5518] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.673 [INFO][5518] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.683 [WARNING][5518] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.683 [INFO][5518] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" HandleID="k8s-pod-network.527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Workload="localhost-k8s-whisker--54777d5447--bt5zk-eth0" Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.686 [INFO][5518] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:09.692157 containerd[1455]: 2026-03-06 01:43:09.689 [INFO][5510] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4" Mar 6 01:43:09.692157 containerd[1455]: time="2026-03-06T01:43:09.692140830Z" level=info msg="TearDown network for sandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\" successfully" Mar 6 01:43:09.699577 containerd[1455]: time="2026-03-06T01:43:09.699460126Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 6 01:43:09.699577 containerd[1455]: time="2026-03-06T01:43:09.699566545Z" level=info msg="RemovePodSandbox \"527b7328c092880975564836fe3236c36f91d841059efd1cfc18080a484c24e4\" returns successfully" Mar 6 01:43:09.700196 containerd[1455]: time="2026-03-06T01:43:09.700127960Z" level=info msg="StopPodSandbox for \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\"" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.757 [WARNING][5537] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--t4vmj-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"e4692164-1dad-4fc3-ad4b-fb8a4d587f00", ResourceVersion:"1042", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8", Pod:"coredns-66bc5c9577-t4vmj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0d39e01b10", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.757 [INFO][5537] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.757 [INFO][5537] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" iface="eth0" netns="" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.757 [INFO][5537] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.757 [INFO][5537] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.794 [INFO][5546] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.794 [INFO][5546] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.794 [INFO][5546] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.806 [WARNING][5546] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.806 [INFO][5546] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.809 [INFO][5546] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:09.816455 containerd[1455]: 2026-03-06 01:43:09.812 [INFO][5537] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.816455 containerd[1455]: time="2026-03-06T01:43:09.816405035Z" level=info msg="TearDown network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\" successfully" Mar 6 01:43:09.816455 containerd[1455]: time="2026-03-06T01:43:09.816429281Z" level=info msg="StopPodSandbox for \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\" returns successfully" Mar 6 01:43:09.817233 containerd[1455]: time="2026-03-06T01:43:09.817065947Z" level=info msg="RemovePodSandbox for \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\"" Mar 6 01:43:09.817233 containerd[1455]: time="2026-03-06T01:43:09.817105671Z" level=info msg="Forcibly stopping sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\"" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.890 [WARNING][5565] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--t4vmj-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"e4692164-1dad-4fc3-ad4b-fb8a4d587f00", ResourceVersion:"1042", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"84b8ba27f4211afe63eb891bbac14335140cf2a2d13cbcd68bff6922225176e8", Pod:"coredns-66bc5c9577-t4vmj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0d39e01b10", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.891 [INFO][5565] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.891 [INFO][5565] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" iface="eth0" netns="" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.891 [INFO][5565] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.891 [INFO][5565] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.943 [INFO][5573] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.943 [INFO][5573] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.943 [INFO][5573] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.958 [WARNING][5573] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.959 [INFO][5573] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" HandleID="k8s-pod-network.b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Workload="localhost-k8s-coredns--66bc5c9577--t4vmj-eth0" Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.961 [INFO][5573] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:09.971575 containerd[1455]: 2026-03-06 01:43:09.966 [INFO][5565] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701" Mar 6 01:43:09.971575 containerd[1455]: time="2026-03-06T01:43:09.971437427Z" level=info msg="TearDown network for sandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\" successfully" Mar 6 01:43:09.979531 containerd[1455]: time="2026-03-06T01:43:09.979350048Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 6 01:43:09.979531 containerd[1455]: time="2026-03-06T01:43:09.979471255Z" level=info msg="RemovePodSandbox \"b5382521014f4020a7a8958059b52631f8bfa38dfee0e9b2a938107e70653701\" returns successfully" Mar 6 01:43:09.980240 containerd[1455]: time="2026-03-06T01:43:09.980098075Z" level=info msg="StopPodSandbox for \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\"" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.069 [WARNING][5595] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2402e1ec-ea50-4f17-85cf-f279f0f10494", ResourceVersion:"1067", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a", Pod:"goldmane-cccfbd5cf-fnx6s", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia9db2015650", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.069 [INFO][5595] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.069 [INFO][5595] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" iface="eth0" netns="" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.069 [INFO][5595] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.069 [INFO][5595] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.122 [INFO][5615] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.122 [INFO][5615] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.122 [INFO][5615] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.132 [WARNING][5615] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.132 [INFO][5615] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.135 [INFO][5615] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:10.150531 containerd[1455]: 2026-03-06 01:43:10.140 [INFO][5595] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.150531 containerd[1455]: time="2026-03-06T01:43:10.147093435Z" level=info msg="TearDown network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\" successfully" Mar 6 01:43:10.150531 containerd[1455]: time="2026-03-06T01:43:10.147128440Z" level=info msg="StopPodSandbox for \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\" returns successfully" Mar 6 01:43:10.150531 containerd[1455]: time="2026-03-06T01:43:10.148505426Z" level=info msg="RemovePodSandbox for \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\"" Mar 6 01:43:10.150531 containerd[1455]: time="2026-03-06T01:43:10.148538699Z" level=info msg="Forcibly stopping sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\"" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.220 [WARNING][5633] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2402e1ec-ea50-4f17-85cf-f279f0f10494", ResourceVersion:"1067", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"63e63c42bf1c0bc98ac7fedf39492a7138724de4ae0c9b80d432146dccf9df1a", Pod:"goldmane-cccfbd5cf-fnx6s", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia9db2015650", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.221 [INFO][5633] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.221 [INFO][5633] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" iface="eth0" netns="" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.221 [INFO][5633] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.221 [INFO][5633] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.274 [INFO][5642] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.275 [INFO][5642] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.275 [INFO][5642] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.284 [WARNING][5642] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.284 [INFO][5642] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" HandleID="k8s-pod-network.61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Workload="localhost-k8s-goldmane--cccfbd5cf--fnx6s-eth0" Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.287 [INFO][5642] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:10.296448 containerd[1455]: 2026-03-06 01:43:10.291 [INFO][5633] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063" Mar 6 01:43:10.296448 containerd[1455]: time="2026-03-06T01:43:10.296097577Z" level=info msg="TearDown network for sandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\" successfully" Mar 6 01:43:10.303555 containerd[1455]: time="2026-03-06T01:43:10.303449641Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 6 01:43:10.303555 containerd[1455]: time="2026-03-06T01:43:10.303558074Z" level=info msg="RemovePodSandbox \"61d00f094391ce2cfaaf7da4e05a734c536701d28eea84f283b6c59cfe17f063\" returns successfully" Mar 6 01:43:10.304432 containerd[1455]: time="2026-03-06T01:43:10.304397361Z" level=info msg="StopPodSandbox for \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\"" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.393 [WARNING][5659] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--rdfzk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4e0c3015-6a1a-484c-97f5-39c6519fa25a", ResourceVersion:"1048", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed", Pod:"coredns-66bc5c9577-rdfzk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali36009d6b7e9", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.394 [INFO][5659] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.394 [INFO][5659] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" iface="eth0" netns="" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.394 [INFO][5659] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.394 [INFO][5659] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.436 [INFO][5667] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.436 [INFO][5667] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.436 [INFO][5667] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.448 [WARNING][5667] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.449 [INFO][5667] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.455 [INFO][5667] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:10.463150 containerd[1455]: 2026-03-06 01:43:10.458 [INFO][5659] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.463150 containerd[1455]: time="2026-03-06T01:43:10.462837195Z" level=info msg="TearDown network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\" successfully" Mar 6 01:43:10.463150 containerd[1455]: time="2026-03-06T01:43:10.462922094Z" level=info msg="StopPodSandbox for \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\" returns successfully" Mar 6 01:43:10.465071 containerd[1455]: time="2026-03-06T01:43:10.464977285Z" level=info msg="RemovePodSandbox for \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\"" Mar 6 01:43:10.465071 containerd[1455]: time="2026-03-06T01:43:10.465058396Z" level=info msg="Forcibly stopping sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\"" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.553 [WARNING][5683] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--rdfzk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"4e0c3015-6a1a-484c-97f5-39c6519fa25a", ResourceVersion:"1048", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f96c7fbc9a082abfafa2e83c9822505f11140df49cdd8274efe813479602f5ed", Pod:"coredns-66bc5c9577-rdfzk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali36009d6b7e9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.553 [INFO][5683] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.553 [INFO][5683] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" iface="eth0" netns="" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.553 [INFO][5683] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.553 [INFO][5683] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.587 [INFO][5691] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.588 [INFO][5691] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.588 [INFO][5691] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.596 [WARNING][5691] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.596 [INFO][5691] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" HandleID="k8s-pod-network.e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Workload="localhost-k8s-coredns--66bc5c9577--rdfzk-eth0" Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.598 [INFO][5691] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:10.605397 containerd[1455]: 2026-03-06 01:43:10.601 [INFO][5683] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7" Mar 6 01:43:10.605923 containerd[1455]: time="2026-03-06T01:43:10.605440347Z" level=info msg="TearDown network for sandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\" successfully" Mar 6 01:43:10.611992 containerd[1455]: time="2026-03-06T01:43:10.611924007Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 6 01:43:10.612038 containerd[1455]: time="2026-03-06T01:43:10.612010759Z" level=info msg="RemovePodSandbox \"e076a5d2d51ab2ffe9f9e4ab8c37f50c2d98006e5dda272709f226998feb9df7\" returns successfully" Mar 6 01:43:10.612975 containerd[1455]: time="2026-03-06T01:43:10.612913287Z" level=info msg="StopPodSandbox for \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\"" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.676 [WARNING][5709] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0", GenerateName:"calico-kube-controllers-5764cfbc4b-", Namespace:"calico-system", SelfLink:"", UID:"2864938a-9ea0-4ee9-987a-e214cf44a87b", ResourceVersion:"1098", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5764cfbc4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2", Pod:"calico-kube-controllers-5764cfbc4b-xwn5j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif836518fd78", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.676 [INFO][5709] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.676 [INFO][5709] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" iface="eth0" netns="" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.676 [INFO][5709] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.676 [INFO][5709] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.722 [INFO][5718] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.723 [INFO][5718] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.723 [INFO][5718] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.732 [WARNING][5718] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.732 [INFO][5718] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.735 [INFO][5718] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:10.743991 containerd[1455]: 2026-03-06 01:43:10.740 [INFO][5709] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.743991 containerd[1455]: time="2026-03-06T01:43:10.743947283Z" level=info msg="TearDown network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\" successfully" Mar 6 01:43:10.743991 containerd[1455]: time="2026-03-06T01:43:10.743972670Z" level=info msg="StopPodSandbox for \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\" returns successfully" Mar 6 01:43:10.744736 containerd[1455]: time="2026-03-06T01:43:10.744686393Z" level=info msg="RemovePodSandbox for \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\"" Mar 6 01:43:10.744822 containerd[1455]: time="2026-03-06T01:43:10.744737980Z" level=info msg="Forcibly stopping sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\"" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.813 [WARNING][5734] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0", GenerateName:"calico-kube-controllers-5764cfbc4b-", Namespace:"calico-system", SelfLink:"", UID:"2864938a-9ea0-4ee9-987a-e214cf44a87b", ResourceVersion:"1098", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5764cfbc4b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3959e5c42d6255d42a0b52d5b388b0b5a382bd689b7085a8bf24b8ebeacba9d2", Pod:"calico-kube-controllers-5764cfbc4b-xwn5j", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif836518fd78", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.813 [INFO][5734] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.813 [INFO][5734] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" iface="eth0" netns="" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.813 [INFO][5734] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.813 [INFO][5734] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.866 [INFO][5743] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.867 [INFO][5743] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.867 [INFO][5743] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.875 [WARNING][5743] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.876 [INFO][5743] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" HandleID="k8s-pod-network.58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Workload="localhost-k8s-calico--kube--controllers--5764cfbc4b--xwn5j-eth0" Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.879 [INFO][5743] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:10.887175 containerd[1455]: 2026-03-06 01:43:10.883 [INFO][5734] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56" Mar 6 01:43:10.887175 containerd[1455]: time="2026-03-06T01:43:10.887099652Z" level=info msg="TearDown network for sandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\" successfully" Mar 6 01:43:10.892923 containerd[1455]: time="2026-03-06T01:43:10.892827005Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 6 01:43:10.892923 containerd[1455]: time="2026-03-06T01:43:10.892904901Z" level=info msg="RemovePodSandbox \"58986cd24b14ce25436e23642b5906c92068a5655d1532508bd6839070131a56\" returns successfully" Mar 6 01:43:10.894239 containerd[1455]: time="2026-03-06T01:43:10.894057551Z" level=info msg="StopPodSandbox for \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\"" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:10.978 [WARNING][5761] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"5e1355ef-1fae-411e-a25e-19a787706802", ResourceVersion:"1190", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881", Pod:"calico-apiserver-c7b968c57-g2k99", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7178df35d0b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:10.979 [INFO][5761] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:10.979 [INFO][5761] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" iface="eth0" netns="" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:10.979 [INFO][5761] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:10.979 [INFO][5761] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.025 [INFO][5770] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.026 [INFO][5770] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.026 [INFO][5770] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.037 [WARNING][5770] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.037 [INFO][5770] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.041 [INFO][5770] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:11.051852 containerd[1455]: 2026-03-06 01:43:11.045 [INFO][5761] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.051852 containerd[1455]: time="2026-03-06T01:43:11.051086275Z" level=info msg="TearDown network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\" successfully" Mar 6 01:43:11.051852 containerd[1455]: time="2026-03-06T01:43:11.051116893Z" level=info msg="StopPodSandbox for \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\" returns successfully" Mar 6 01:43:11.054067 containerd[1455]: time="2026-03-06T01:43:11.053660557Z" level=info msg="RemovePodSandbox for \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\"" Mar 6 01:43:11.054067 containerd[1455]: time="2026-03-06T01:43:11.053860249Z" level=info msg="Forcibly stopping sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\"" Mar 6 01:43:11.179361 kubelet[2506]: I0306 01:43:11.178088 2506 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.128 [WARNING][5788] cni-plugin/k8s.go 616: 
CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0", GenerateName:"calico-apiserver-c7b968c57-", Namespace:"calico-system", SelfLink:"", UID:"5e1355ef-1fae-411e-a25e-19a787706802", ResourceVersion:"1190", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 1, 42, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7b968c57", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"aff81ea9579512632d9d83d98ac6e1c7b6c3a4f78a5c063c2f1c423944e46881", Pod:"calico-apiserver-c7b968c57-g2k99", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali7178df35d0b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.128 [INFO][5788] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.128 [INFO][5788] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" iface="eth0" netns="" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.128 [INFO][5788] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.128 [INFO][5788] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.174 [INFO][5797] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.174 [INFO][5797] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.174 [INFO][5797] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.182 [WARNING][5797] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.182 [INFO][5797] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" HandleID="k8s-pod-network.090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Workload="localhost-k8s-calico--apiserver--c7b968c57--g2k99-eth0" Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.185 [INFO][5797] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 01:43:11.193408 containerd[1455]: 2026-03-06 01:43:11.189 [INFO][5788] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96" Mar 6 01:43:11.194925 containerd[1455]: time="2026-03-06T01:43:11.193451988Z" level=info msg="TearDown network for sandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\" successfully" Mar 6 01:43:11.203906 containerd[1455]: time="2026-03-06T01:43:11.203748912Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 6 01:43:11.203906 containerd[1455]: time="2026-03-06T01:43:11.203904623Z" level=info msg="RemovePodSandbox \"090de82475000ce8701676daa39331d51198e9d23599791ef1975015f2a21d96\" returns successfully" Mar 6 01:43:13.134869 systemd[1]: Started sshd@9-10.0.0.120:22-10.0.0.1:39152.service - OpenSSH per-connection server daemon (10.0.0.1:39152). 
Mar 6 01:43:13.222966 sshd[5808]: Accepted publickey for core from 10.0.0.1 port 39152 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M Mar 6 01:43:13.226034 sshd[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 01:43:13.233346 systemd-logind[1440]: New session 10 of user core. Mar 6 01:43:13.241653 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 6 01:43:13.452630 sshd[5808]: pam_unix(sshd:session): session closed for user core Mar 6 01:43:13.457936 systemd[1]: sshd@9-10.0.0.120:22-10.0.0.1:39152.service: Deactivated successfully. Mar 6 01:43:13.460624 systemd[1]: session-10.scope: Deactivated successfully. Mar 6 01:43:13.461963 systemd-logind[1440]: Session 10 logged out. Waiting for processes to exit. Mar 6 01:43:13.464004 systemd-logind[1440]: Removed session 10. Mar 6 01:43:18.497971 systemd[1]: Started sshd@10-10.0.0.120:22-10.0.0.1:39158.service - OpenSSH per-connection server daemon (10.0.0.1:39158). Mar 6 01:43:18.558242 sshd[5849]: Accepted publickey for core from 10.0.0.1 port 39158 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M Mar 6 01:43:18.568361 sshd[5849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 01:43:18.576754 systemd-logind[1440]: New session 11 of user core. Mar 6 01:43:18.586666 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 6 01:43:18.799520 sshd[5849]: pam_unix(sshd:session): session closed for user core Mar 6 01:43:18.805466 systemd[1]: sshd@10-10.0.0.120:22-10.0.0.1:39158.service: Deactivated successfully. Mar 6 01:43:18.808312 systemd[1]: session-11.scope: Deactivated successfully. Mar 6 01:43:18.809613 systemd-logind[1440]: Session 11 logged out. Waiting for processes to exit. Mar 6 01:43:18.811827 systemd-logind[1440]: Removed session 11. 
Mar 6 01:43:22.039858 kubelet[2506]: E0306 01:43:22.039706 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:43:23.815187 systemd[1]: Started sshd@11-10.0.0.120:22-10.0.0.1:47568.service - OpenSSH per-connection server daemon (10.0.0.1:47568).
Mar 6 01:43:23.875470 sshd[5887]: Accepted publickey for core from 10.0.0.1 port 47568 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:23.877585 sshd[5887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:23.885103 systemd-logind[1440]: New session 12 of user core.
Mar 6 01:43:23.896708 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 6 01:43:24.101695 sshd[5887]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:24.107672 systemd[1]: sshd@11-10.0.0.120:22-10.0.0.1:47568.service: Deactivated successfully.
Mar 6 01:43:24.111048 systemd[1]: session-12.scope: Deactivated successfully.
Mar 6 01:43:24.112431 systemd-logind[1440]: Session 12 logged out. Waiting for processes to exit.
Mar 6 01:43:24.114204 systemd-logind[1440]: Removed session 12.
Mar 6 01:43:27.041959 kubelet[2506]: E0306 01:43:27.041746 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:43:29.039988 kubelet[2506]: E0306 01:43:29.039915 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:43:29.131358 systemd[1]: Started sshd@12-10.0.0.120:22-10.0.0.1:47574.service - OpenSSH per-connection server daemon (10.0.0.1:47574).
Mar 6 01:43:29.200078 sshd[5951]: Accepted publickey for core from 10.0.0.1 port 47574 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:29.202417 sshd[5951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:29.209924 systemd-logind[1440]: New session 13 of user core.
Mar 6 01:43:29.224579 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 6 01:43:29.391228 sshd[5951]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:29.398849 systemd[1]: sshd@12-10.0.0.120:22-10.0.0.1:47574.service: Deactivated successfully.
Mar 6 01:43:29.403191 systemd[1]: session-13.scope: Deactivated successfully.
Mar 6 01:43:29.404842 systemd-logind[1440]: Session 13 logged out. Waiting for processes to exit.
Mar 6 01:43:29.407040 systemd-logind[1440]: Removed session 13.
Mar 6 01:43:34.039110 kubelet[2506]: E0306 01:43:34.038967 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:43:34.417530 systemd[1]: Started sshd@13-10.0.0.120:22-10.0.0.1:54816.service - OpenSSH per-connection server daemon (10.0.0.1:54816).
Mar 6 01:43:34.476723 sshd[5979]: Accepted publickey for core from 10.0.0.1 port 54816 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:34.479993 sshd[5979]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:34.489885 systemd-logind[1440]: New session 14 of user core.
Mar 6 01:43:34.495896 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 6 01:43:34.727885 sshd[5979]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:34.735036 systemd[1]: sshd@13-10.0.0.120:22-10.0.0.1:54816.service: Deactivated successfully.
Mar 6 01:43:34.738747 systemd[1]: session-14.scope: Deactivated successfully.
Mar 6 01:43:34.743893 systemd-logind[1440]: Session 14 logged out. Waiting for processes to exit.
Mar 6 01:43:34.748242 systemd-logind[1440]: Removed session 14.
Mar 6 01:43:39.007145 systemd[1]: run-containerd-runc-k8s.io-1b67c6c5ad5a788fe4a6f598ed9db1e3a6a9bfd8affd4554b0fe123993eba5a7-runc.4njmGm.mount: Deactivated successfully.
Mar 6 01:43:39.738405 systemd[1]: Started sshd@14-10.0.0.120:22-10.0.0.1:54830.service - OpenSSH per-connection server daemon (10.0.0.1:54830).
Mar 6 01:43:39.835394 sshd[6015]: Accepted publickey for core from 10.0.0.1 port 54830 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:39.838025 sshd[6015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:39.846214 systemd-logind[1440]: New session 15 of user core.
Mar 6 01:43:39.855621 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 6 01:43:40.020135 sshd[6015]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:40.024180 systemd[1]: sshd@14-10.0.0.120:22-10.0.0.1:54830.service: Deactivated successfully.
Mar 6 01:43:40.028035 systemd[1]: session-15.scope: Deactivated successfully.
Mar 6 01:43:40.031381 systemd-logind[1440]: Session 15 logged out. Waiting for processes to exit.
Mar 6 01:43:40.033176 systemd-logind[1440]: Removed session 15.
Mar 6 01:43:45.040607 systemd[1]: Started sshd@15-10.0.0.120:22-10.0.0.1:60056.service - OpenSSH per-connection server daemon (10.0.0.1:60056).
Mar 6 01:43:45.088409 sshd[6052]: Accepted publickey for core from 10.0.0.1 port 60056 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:45.091160 sshd[6052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:45.100906 systemd-logind[1440]: New session 16 of user core.
Mar 6 01:43:45.110661 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 6 01:43:45.277935 sshd[6052]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:45.287682 systemd[1]: sshd@15-10.0.0.120:22-10.0.0.1:60056.service: Deactivated successfully.
Mar 6 01:43:45.290021 systemd[1]: session-16.scope: Deactivated successfully.
Mar 6 01:43:45.292672 systemd-logind[1440]: Session 16 logged out. Waiting for processes to exit.
Mar 6 01:43:45.297993 systemd[1]: Started sshd@16-10.0.0.120:22-10.0.0.1:60060.service - OpenSSH per-connection server daemon (10.0.0.1:60060).
Mar 6 01:43:45.300030 systemd-logind[1440]: Removed session 16.
Mar 6 01:43:45.368352 sshd[6068]: Accepted publickey for core from 10.0.0.1 port 60060 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:45.371499 sshd[6068]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:45.379574 systemd-logind[1440]: New session 17 of user core.
Mar 6 01:43:45.388535 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 6 01:43:45.628762 sshd[6068]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:45.653901 systemd[1]: sshd@16-10.0.0.120:22-10.0.0.1:60060.service: Deactivated successfully.
Mar 6 01:43:45.666544 systemd[1]: session-17.scope: Deactivated successfully.
Mar 6 01:43:45.670686 systemd-logind[1440]: Session 17 logged out. Waiting for processes to exit.
Mar 6 01:43:45.678978 systemd[1]: Started sshd@17-10.0.0.120:22-10.0.0.1:60064.service - OpenSSH per-connection server daemon (10.0.0.1:60064).
Mar 6 01:43:45.683123 systemd-logind[1440]: Removed session 17.
Mar 6 01:43:45.735954 sshd[6081]: Accepted publickey for core from 10.0.0.1 port 60064 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:45.742852 sshd[6081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:45.750606 systemd-logind[1440]: New session 18 of user core.
Mar 6 01:43:45.755490 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 6 01:43:45.920668 sshd[6081]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:45.926622 systemd[1]: sshd@17-10.0.0.120:22-10.0.0.1:60064.service: Deactivated successfully.
Mar 6 01:43:45.929197 systemd[1]: session-18.scope: Deactivated successfully.
Mar 6 01:43:45.930926 systemd-logind[1440]: Session 18 logged out. Waiting for processes to exit.
Mar 6 01:43:45.933189 systemd-logind[1440]: Removed session 18.
Mar 6 01:43:50.944651 systemd[1]: Started sshd@18-10.0.0.120:22-10.0.0.1:60074.service - OpenSSH per-connection server daemon (10.0.0.1:60074).
Mar 6 01:43:51.072630 sshd[6119]: Accepted publickey for core from 10.0.0.1 port 60074 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:51.076369 sshd[6119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:51.083519 systemd-logind[1440]: New session 19 of user core.
Mar 6 01:43:51.090502 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 6 01:43:51.288886 sshd[6119]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:51.298237 systemd[1]: sshd@18-10.0.0.120:22-10.0.0.1:60074.service: Deactivated successfully.
Mar 6 01:43:51.300592 systemd[1]: session-19.scope: Deactivated successfully.
Mar 6 01:43:51.302599 systemd-logind[1440]: Session 19 logged out. Waiting for processes to exit.
Mar 6 01:43:51.312634 systemd[1]: Started sshd@19-10.0.0.120:22-10.0.0.1:60088.service - OpenSSH per-connection server daemon (10.0.0.1:60088).
Mar 6 01:43:51.313936 systemd-logind[1440]: Removed session 19.
Mar 6 01:43:51.352688 sshd[6133]: Accepted publickey for core from 10.0.0.1 port 60088 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:51.355164 sshd[6133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:51.363577 systemd-logind[1440]: New session 20 of user core.
Mar 6 01:43:51.371527 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 6 01:43:51.781881 sshd[6133]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:51.792216 systemd[1]: sshd@19-10.0.0.120:22-10.0.0.1:60088.service: Deactivated successfully.
Mar 6 01:43:51.795553 systemd[1]: session-20.scope: Deactivated successfully.
Mar 6 01:43:51.797968 systemd-logind[1440]: Session 20 logged out. Waiting for processes to exit.
Mar 6 01:43:51.808213 systemd[1]: Started sshd@20-10.0.0.120:22-10.0.0.1:60098.service - OpenSSH per-connection server daemon (10.0.0.1:60098).
Mar 6 01:43:51.809645 systemd-logind[1440]: Removed session 20.
Mar 6 01:43:51.856868 sshd[6145]: Accepted publickey for core from 10.0.0.1 port 60098 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:51.859072 sshd[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:51.866204 systemd-logind[1440]: New session 21 of user core.
Mar 6 01:43:51.880507 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 6 01:43:52.625492 sshd[6145]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:52.636994 systemd[1]: sshd@20-10.0.0.120:22-10.0.0.1:60098.service: Deactivated successfully.
Mar 6 01:43:52.641055 systemd[1]: session-21.scope: Deactivated successfully.
Mar 6 01:43:52.649066 systemd-logind[1440]: Session 21 logged out. Waiting for processes to exit.
Mar 6 01:43:52.664109 systemd[1]: Started sshd@21-10.0.0.120:22-10.0.0.1:44118.service - OpenSSH per-connection server daemon (10.0.0.1:44118).
Mar 6 01:43:52.666836 systemd-logind[1440]: Removed session 21.
Mar 6 01:43:52.718389 sshd[6171]: Accepted publickey for core from 10.0.0.1 port 44118 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:52.721112 sshd[6171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:52.729903 systemd-logind[1440]: New session 22 of user core.
Mar 6 01:43:52.744549 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 6 01:43:53.116613 sshd[6171]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:53.129605 systemd[1]: sshd@21-10.0.0.120:22-10.0.0.1:44118.service: Deactivated successfully.
Mar 6 01:43:53.134596 systemd[1]: session-22.scope: Deactivated successfully.
Mar 6 01:43:53.139472 systemd-logind[1440]: Session 22 logged out. Waiting for processes to exit.
Mar 6 01:43:53.151467 systemd[1]: Started sshd@22-10.0.0.120:22-10.0.0.1:44122.service - OpenSSH per-connection server daemon (10.0.0.1:44122).
Mar 6 01:43:53.154422 systemd-logind[1440]: Removed session 22.
Mar 6 01:43:53.206398 sshd[6186]: Accepted publickey for core from 10.0.0.1 port 44122 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:53.209319 sshd[6186]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:53.217519 systemd-logind[1440]: New session 23 of user core.
Mar 6 01:43:53.224555 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 6 01:43:53.375497 sshd[6186]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:53.381879 systemd[1]: sshd@22-10.0.0.120:22-10.0.0.1:44122.service: Deactivated successfully.
Mar 6 01:43:53.385846 systemd[1]: session-23.scope: Deactivated successfully.
Mar 6 01:43:53.387434 systemd-logind[1440]: Session 23 logged out. Waiting for processes to exit.
Mar 6 01:43:53.389693 systemd-logind[1440]: Removed session 23.
Mar 6 01:43:58.400830 systemd[1]: Started sshd@23-10.0.0.120:22-10.0.0.1:44134.service - OpenSSH per-connection server daemon (10.0.0.1:44134).
Mar 6 01:43:58.448131 sshd[6244]: Accepted publickey for core from 10.0.0.1 port 44134 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:43:58.462216 sshd[6244]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:43:58.475136 systemd-logind[1440]: New session 24 of user core.
Mar 6 01:43:58.484686 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 6 01:43:58.646677 sshd[6244]: pam_unix(sshd:session): session closed for user core
Mar 6 01:43:58.654559 systemd[1]: sshd@23-10.0.0.120:22-10.0.0.1:44134.service: Deactivated successfully.
Mar 6 01:43:58.660090 systemd[1]: session-24.scope: Deactivated successfully.
Mar 6 01:43:58.669400 systemd-logind[1440]: Session 24 logged out. Waiting for processes to exit.
Mar 6 01:43:58.671636 systemd-logind[1440]: Removed session 24.
Mar 6 01:44:03.677873 systemd[1]: Started sshd@24-10.0.0.120:22-10.0.0.1:58508.service - OpenSSH per-connection server daemon (10.0.0.1:58508).
Mar 6 01:44:03.714773 sshd[6260]: Accepted publickey for core from 10.0.0.1 port 58508 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:44:03.717065 sshd[6260]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:44:03.727184 systemd-logind[1440]: New session 25 of user core.
Mar 6 01:44:03.732622 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 6 01:44:03.892869 sshd[6260]: pam_unix(sshd:session): session closed for user core
Mar 6 01:44:03.898141 systemd[1]: sshd@24-10.0.0.120:22-10.0.0.1:58508.service: Deactivated successfully.
Mar 6 01:44:03.900703 systemd[1]: session-25.scope: Deactivated successfully.
Mar 6 01:44:03.902229 systemd-logind[1440]: Session 25 logged out. Waiting for processes to exit.
Mar 6 01:44:03.904656 systemd-logind[1440]: Removed session 25.
Mar 6 01:44:07.040226 kubelet[2506]: E0306 01:44:07.040088 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:44:07.041365 kubelet[2506]: E0306 01:44:07.040640 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 6 01:44:08.921743 systemd[1]: Started sshd@25-10.0.0.120:22-10.0.0.1:58518.service - OpenSSH per-connection server daemon (10.0.0.1:58518).
Mar 6 01:44:08.967545 sshd[6275]: Accepted publickey for core from 10.0.0.1 port 58518 ssh2: RSA SHA256:po+n4m2L0Y6JnDj1VTc5p26N9zFlj54R7gCeXzXqR3M
Mar 6 01:44:08.970156 sshd[6275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 01:44:08.977764 systemd-logind[1440]: New session 26 of user core.
Mar 6 01:44:08.983597 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 6 01:44:09.142195 sshd[6275]: pam_unix(sshd:session): session closed for user core
Mar 6 01:44:09.151230 systemd[1]: sshd@25-10.0.0.120:22-10.0.0.1:58518.service: Deactivated successfully.
Mar 6 01:44:09.154898 systemd[1]: session-26.scope: Deactivated successfully.
Mar 6 01:44:09.156416 systemd-logind[1440]: Session 26 logged out. Waiting for processes to exit.
Mar 6 01:44:09.158374 systemd-logind[1440]: Removed session 26.
Mar 6 01:44:10.054095 kubelet[2506]: E0306 01:44:10.053996 2506 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
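The recurring kubelet `dns.go:154` errors above mean the node's resolv.conf lists more nameservers than the resolver limit (three on Linux/glibc, per `MAXNS`), so kubelet truncates the list and reports the applied line, here `1.1.1.1 1.0.0.1 8.8.8.8`. A minimal sketch of that truncation, assuming a limit of three; the parsing and function name are illustrative, not kubelet's actual code:

```python
# Sketch of the nameserver truncation behind the "Nameserver limits
# exceeded" messages above. MAX_NAMESERVERS mirrors the classic glibc
# resolver limit (MAXNS = 3); the parsing is a simplification.
MAX_NAMESERVERS = 3

def applied_nameservers(resolv_conf_text: str, limit: int = MAX_NAMESERVERS):
    """Return (applied, omitted) nameserver lists from resolv.conf text."""
    servers = []
    for line in resolv_conf_text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "nameserver":
            servers.append(parts[1])
    return servers[:limit], servers[limit:]

# A hypothetical resolv.conf with one nameserver over the limit.
conf = """nameserver 1.1.1.1
nameserver 1.0.0.1
nameserver 8.8.8.8
nameserver 8.8.4.4
"""
applied, omitted = applied_nameservers(conf)
print("the applied nameserver line is:", " ".join(applied))
```

With a fourth entry such as `8.8.4.4`, only the first three are applied and the rest are omitted, which matches the wording of the log message.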