Mar 4 01:17:09.036380 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Tue Mar 3 22:42:33 -00 2026
Mar 4 01:17:09.036413 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=cfbb17c272ffeca64391861cc763ec4868ca597850b31cbd6ed67c590a72edc7
Mar 4 01:17:09.036437 kernel: BIOS-provided physical RAM map:
Mar 4 01:17:09.036455 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 4 01:17:09.036464 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 4 01:17:09.036474 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 4 01:17:09.036484 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Mar 4 01:17:09.036494 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Mar 4 01:17:09.036504 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 4 01:17:09.036513 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 4 01:17:09.036523 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 4 01:17:09.036556 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 4 01:17:09.036581 kernel: NX (Execute Disable) protection: active
Mar 4 01:17:09.036592 kernel: APIC: Static calls initialized
Mar 4 01:17:09.036604 kernel: SMBIOS 2.8 present.
Mar 4 01:17:09.036620 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.16.0-3.module_el8.7.0+3346+68867adb 04/01/2014
Mar 4 01:17:09.036632 kernel: Hypervisor detected: KVM
Mar 4 01:17:09.036648 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 4 01:17:09.036659 kernel: kvm-clock: using sched offset of 5328103604 cycles
Mar 4 01:17:09.036670 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 4 01:17:09.036681 kernel: tsc: Detected 2799.998 MHz processor
Mar 4 01:17:09.036692 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 4 01:17:09.036703 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 4 01:17:09.036713 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 4 01:17:09.036724 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 4 01:17:09.036735 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 4 01:17:09.036750 kernel: Using GB pages for direct mapping
Mar 4 01:17:09.036761 kernel: ACPI: Early table checksum verification disabled
Mar 4 01:17:09.036772 kernel: ACPI: RSDP 0x00000000000F59E0 000014 (v00 BOCHS )
Mar 4 01:17:09.036786 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 01:17:09.036797 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 01:17:09.036808 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 01:17:09.036818 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Mar 4 01:17:09.036829 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 01:17:09.036840 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 01:17:09.036855 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 4 01:17:09.036866 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001)
Mar 4 01:17:09.036877 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Mar 4 01:17:09.036888 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Mar 4 01:17:09.036899 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Mar 4 01:17:09.036916 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Mar 4 01:17:09.036928 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Mar 4 01:17:09.036944 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Mar 4 01:17:09.036961 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Mar 4 01:17:09.036977 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 4 01:17:09.036989 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 4 01:17:09.037001 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 4 01:17:09.037012 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Mar 4 01:17:09.037023 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 4 01:17:09.037034 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Mar 4 01:17:09.037058 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 4 01:17:09.037069 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Mar 4 01:17:09.037080 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 4 01:17:09.037098 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Mar 4 01:17:09.037109 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 4 01:17:09.037120 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Mar 4 01:17:09.037132 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 4 01:17:09.037143 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Mar 4 01:17:09.037165 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 4 01:17:09.037186 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Mar 4 01:17:09.037199 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 4 01:17:09.037210 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 4 01:17:09.037221 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Mar 4 01:17:09.037232 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Mar 4 01:17:09.037244 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Mar 4 01:17:09.037255 kernel: Zone ranges:
Mar 4 01:17:09.037266 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 4 01:17:09.037277 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Mar 4 01:17:09.037294 kernel: Normal empty
Mar 4 01:17:09.037305 kernel: Movable zone start for each node
Mar 4 01:17:09.037316 kernel: Early memory node ranges
Mar 4 01:17:09.037327 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 4 01:17:09.037339 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Mar 4 01:17:09.037350 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Mar 4 01:17:09.037361 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 4 01:17:09.037372 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 4 01:17:09.037388 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Mar 4 01:17:09.037400 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 4 01:17:09.037419 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 4 01:17:09.037441 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 4 01:17:09.037453 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 4 01:17:09.037471 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 4 01:17:09.037482 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 4 01:17:09.037493 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 4 01:17:09.037505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 4 01:17:09.037516 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 4 01:17:09.037527 kernel: TSC deadline timer available
Mar 4 01:17:09.039585 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Mar 4 01:17:09.039599 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 4 01:17:09.039611 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 4 01:17:09.039623 kernel: Booting paravirtualized kernel on KVM
Mar 4 01:17:09.039634 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 4 01:17:09.039646 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 4 01:17:09.039657 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u262144
Mar 4 01:17:09.039669 kernel: pcpu-alloc: s196328 r8192 d28952 u262144 alloc=1*2097152
Mar 4 01:17:09.039680 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 4 01:17:09.039699 kernel: kvm-guest: PV spinlocks enabled
Mar 4 01:17:09.039711 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 4 01:17:09.039724 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=cfbb17c272ffeca64391861cc763ec4868ca597850b31cbd6ed67c590a72edc7
Mar 4 01:17:09.039736 kernel: random: crng init done
Mar 4 01:17:09.039747 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 4 01:17:09.039759 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 4 01:17:09.039770 kernel: Fallback order for Node 0: 0
Mar 4 01:17:09.039781 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Mar 4 01:17:09.039798 kernel: Policy zone: DMA32
Mar 4 01:17:09.039816 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 4 01:17:09.039829 kernel: software IO TLB: area num 16.
Mar 4 01:17:09.039841 kernel: Memory: 1901600K/2096616K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 194756K reserved, 0K cma-reserved)
Mar 4 01:17:09.039853 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 4 01:17:09.039864 kernel: Kernel/User page tables isolation: enabled
Mar 4 01:17:09.039876 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 4 01:17:09.039887 kernel: ftrace: allocated 149 pages with 4 groups
Mar 4 01:17:09.039898 kernel: Dynamic Preempt: voluntary
Mar 4 01:17:09.039915 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 4 01:17:09.039935 kernel: rcu: RCU event tracing is enabled.
Mar 4 01:17:09.039947 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 4 01:17:09.039958 kernel: Trampoline variant of Tasks RCU enabled.
Mar 4 01:17:09.039970 kernel: Rude variant of Tasks RCU enabled.
Mar 4 01:17:09.040000 kernel: Tracing variant of Tasks RCU enabled.
Mar 4 01:17:09.040017 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 4 01:17:09.040029 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 4 01:17:09.040040 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Mar 4 01:17:09.040052 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 4 01:17:09.040064 kernel: Console: colour VGA+ 80x25
Mar 4 01:17:09.040076 kernel: printk: console [tty0] enabled
Mar 4 01:17:09.040092 kernel: printk: console [ttyS0] enabled
Mar 4 01:17:09.040104 kernel: ACPI: Core revision 20230628
Mar 4 01:17:09.040116 kernel: APIC: Switch to symmetric I/O mode setup
Mar 4 01:17:09.040128 kernel: x2apic enabled
Mar 4 01:17:09.040140 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 4 01:17:09.040157 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Mar 4 01:17:09.040174 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Mar 4 01:17:09.040187 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 4 01:17:09.040199 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 4 01:17:09.040211 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 4 01:17:09.040223 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 4 01:17:09.040234 kernel: Spectre V2 : Mitigation: Retpolines
Mar 4 01:17:09.040247 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 4 01:17:09.040259 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Mar 4 01:17:09.040270 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 4 01:17:09.040288 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 4 01:17:09.040300 kernel: MDS: Mitigation: Clear CPU buffers
Mar 4 01:17:09.040317 kernel: MMIO Stale Data: Unknown: No mitigations
Mar 4 01:17:09.040329 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 4 01:17:09.040341 kernel: active return thunk: its_return_thunk
Mar 4 01:17:09.040352 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 4 01:17:09.040364 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 4 01:17:09.040376 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 4 01:17:09.040388 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 4 01:17:09.040399 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 4 01:17:09.040416 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Mar 4 01:17:09.040444 kernel: Freeing SMP alternatives memory: 32K
Mar 4 01:17:09.040457 kernel: pid_max: default: 32768 minimum: 301
Mar 4 01:17:09.040469 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 4 01:17:09.040481 kernel: landlock: Up and running.
Mar 4 01:17:09.040493 kernel: SELinux: Initializing.
Mar 4 01:17:09.040505 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 4 01:17:09.040517 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 4 01:17:09.040529 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Mar 4 01:17:09.040557 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 4 01:17:09.040571 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 4 01:17:09.040589 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 4 01:17:09.040601 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Mar 4 01:17:09.040613 kernel: signal: max sigframe size: 1776
Mar 4 01:17:09.040625 kernel: rcu: Hierarchical SRCU implementation.
Mar 4 01:17:09.040637 kernel: rcu: Max phase no-delay instances is 400.
Mar 4 01:17:09.040649 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 4 01:17:09.040661 kernel: smp: Bringing up secondary CPUs ...
Mar 4 01:17:09.040673 kernel: smpboot: x86: Booting SMP configuration:
Mar 4 01:17:09.040685 kernel: .... node #0, CPUs: #1
Mar 4 01:17:09.040702 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Mar 4 01:17:09.040714 kernel: smp: Brought up 1 node, 2 CPUs
Mar 4 01:17:09.040726 kernel: smpboot: Max logical packages: 16
Mar 4 01:17:09.040737 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
Mar 4 01:17:09.040749 kernel: devtmpfs: initialized
Mar 4 01:17:09.040761 kernel: x86/mm: Memory block size: 128MB
Mar 4 01:17:09.040773 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 4 01:17:09.040785 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 4 01:17:09.040796 kernel: pinctrl core: initialized pinctrl subsystem
Mar 4 01:17:09.040813 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 4 01:17:09.040825 kernel: audit: initializing netlink subsys (disabled)
Mar 4 01:17:09.040837 kernel: audit: type=2000 audit(1772587027.952:1): state=initialized audit_enabled=0 res=1
Mar 4 01:17:09.040853 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 4 01:17:09.040865 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 4 01:17:09.040877 kernel: cpuidle: using governor menu
Mar 4 01:17:09.040889 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 4 01:17:09.040900 kernel: dca service started, version 1.12.1
Mar 4 01:17:09.040912 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 4 01:17:09.040932 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 4 01:17:09.040945 kernel: PCI: Using configuration type 1 for base access
Mar 4 01:17:09.040957 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 4 01:17:09.040969 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 4 01:17:09.040981 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 4 01:17:09.040993 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 4 01:17:09.041004 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 4 01:17:09.041016 kernel: ACPI: Added _OSI(Module Device)
Mar 4 01:17:09.041028 kernel: ACPI: Added _OSI(Processor Device)
Mar 4 01:17:09.041044 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 4 01:17:09.041069 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 4 01:17:09.041080 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 4 01:17:09.041091 kernel: ACPI: Interpreter enabled
Mar 4 01:17:09.041102 kernel: ACPI: PM: (supports S0 S5)
Mar 4 01:17:09.041114 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 4 01:17:09.041138 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 4 01:17:09.041151 kernel: PCI: Using E820 reservations for host bridge windows
Mar 4 01:17:09.041163 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 4 01:17:09.041192 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 4 01:17:09.041527 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 4 01:17:09.042828 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 4 01:17:09.043020 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 4 01:17:09.043045 kernel: PCI host bridge to bus 0000:00
Mar 4 01:17:09.043243 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 4 01:17:09.043453 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 4 01:17:09.043641 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 4 01:17:09.043803 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Mar 4 01:17:09.045313 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 4 01:17:09.045499 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Mar 4 01:17:09.045682 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 4 01:17:09.045920 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 4 01:17:09.046133 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Mar 4 01:17:09.046325 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Mar 4 01:17:09.046522 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Mar 4 01:17:09.046727 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Mar 4 01:17:09.046954 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 4 01:17:09.047163 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.047358 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Mar 4 01:17:09.048204 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.048396 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Mar 4 01:17:09.048645 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.048822 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Mar 4 01:17:09.049016 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.049203 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Mar 4 01:17:09.049405 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.052679 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Mar 4 01:17:09.052890 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.053086 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Mar 4 01:17:09.053314 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.053513 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Mar 4 01:17:09.053774 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 4 01:17:09.053954 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Mar 4 01:17:09.054150 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 4 01:17:09.054341 kernel: pci 0000:00:03.0: reg 0x10: [io 0xd0c0-0xd0df]
Mar 4 01:17:09.054531 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Mar 4 01:17:09.055221 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 4 01:17:09.055403 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Mar 4 01:17:09.056679 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 4 01:17:09.056871 kernel: pci 0000:00:04.0: reg 0x10: [io 0xd000-0xd07f]
Mar 4 01:17:09.057051 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Mar 4 01:17:09.057241 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Mar 4 01:17:09.057498 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 4 01:17:09.057705 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 4 01:17:09.057900 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 4 01:17:09.058085 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xd0e0-0xd0ff]
Mar 4 01:17:09.058259 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Mar 4 01:17:09.058505 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 4 01:17:09.060481 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 4 01:17:09.060757 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Mar 4 01:17:09.060960 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Mar 4 01:17:09.061158 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 4 01:17:09.061344 kernel: pci 0000:00:02.0: bridge window [io 0xc000-0xcfff]
Mar 4 01:17:09.061533 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 4 01:17:09.062790 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 4 01:17:09.063015 kernel: pci_bus 0000:02: extended config space not accessible
Mar 4 01:17:09.063253 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Mar 4 01:17:09.063474 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Mar 4 01:17:09.064707 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 4 01:17:09.064892 kernel: pci 0000:01:00.0: bridge window [io 0xc000-0xcfff]
Mar 4 01:17:09.065084 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 4 01:17:09.065271 kernel: pci 0000:01:00.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 4 01:17:09.065509 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 4 01:17:09.065726 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Mar 4 01:17:09.065929 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 4 01:17:09.066114 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 4 01:17:09.066288 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 4 01:17:09.066503 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 4 01:17:09.068730 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 4 01:17:09.068931 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 4 01:17:09.069132 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 4 01:17:09.069337 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 4 01:17:09.070609 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 4 01:17:09.070823 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 4 01:17:09.071018 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 4 01:17:09.071202 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 4 01:17:09.071376 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 4 01:17:09.072609 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 4 01:17:09.072795 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 4 01:17:09.073031 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 4 01:17:09.073236 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 4 01:17:09.073407 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 4 01:17:09.075643 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 4 01:17:09.075821 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 4 01:17:09.076012 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 4 01:17:09.076185 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 4 01:17:09.076368 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 4 01:17:09.076394 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 4 01:17:09.076407 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 4 01:17:09.076443 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 4 01:17:09.076455 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 4 01:17:09.076468 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 4 01:17:09.076480 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 4 01:17:09.076492 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 4 01:17:09.076504 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 4 01:17:09.076516 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 4 01:17:09.076534 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 4 01:17:09.076592 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 4 01:17:09.076605 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 4 01:17:09.076617 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 4 01:17:09.076629 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 4 01:17:09.076641 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 4 01:17:09.076653 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 4 01:17:09.076665 kernel: iommu: Default domain type: Translated
Mar 4 01:17:09.076678 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 4 01:17:09.076697 kernel: PCI: Using ACPI for IRQ routing
Mar 4 01:17:09.076714 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 4 01:17:09.076726 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 4 01:17:09.076738 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Mar 4 01:17:09.076923 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 4 01:17:09.077097 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 4 01:17:09.077277 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 4 01:17:09.077296 kernel: vgaarb: loaded
Mar 4 01:17:09.077308 kernel: clocksource: Switched to clocksource kvm-clock
Mar 4 01:17:09.077328 kernel: VFS: Disk quotas dquot_6.6.0
Mar 4 01:17:09.077341 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 4 01:17:09.077353 kernel: pnp: PnP ACPI init
Mar 4 01:17:09.077589 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 4 01:17:09.077611 kernel: pnp: PnP ACPI: found 5 devices
Mar 4 01:17:09.077624 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 4 01:17:09.077636 kernel: NET: Registered PF_INET protocol family
Mar 4 01:17:09.077649 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 4 01:17:09.077669 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 4 01:17:09.077681 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 4 01:17:09.077705 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 4 01:17:09.077717 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 4 01:17:09.077728 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 4 01:17:09.077740 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 4 01:17:09.077751 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 4 01:17:09.077762 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 4 01:17:09.077774 kernel: NET: Registered PF_XDP protocol family
Mar 4 01:17:09.077975 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 4 01:17:09.078164 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 4 01:17:09.078368 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 4 01:17:09.080599 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 4 01:17:09.080780 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 4 01:17:09.080984 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 4 01:17:09.081195 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 4 01:17:09.081370 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x1000-0x1fff]
Mar 4 01:17:09.083593 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x2000-0x2fff]
Mar 4 01:17:09.083783 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x3000-0x3fff]
Mar 4 01:17:09.083956 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x4000-0x4fff]
Mar 4 01:17:09.084141 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x5000-0x5fff]
Mar 4 01:17:09.084334 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x6000-0x6fff]
Mar 4 01:17:09.084571 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x7000-0x7fff]
Mar 4 01:17:09.084801 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 4 01:17:09.084990 kernel: pci 0000:01:00.0: bridge window [io 0xc000-0xcfff]
Mar 4 01:17:09.085169 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 4 01:17:09.085348 kernel: pci 0000:01:00.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 4 01:17:09.085547 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 4 01:17:09.085725 kernel: pci 0000:00:02.0: bridge window [io 0xc000-0xcfff]
Mar 4 01:17:09.085911 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 4 01:17:09.086088 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 4 01:17:09.086264 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 4 01:17:09.086467 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]
Mar 4 01:17:09.088682 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 4 01:17:09.088861 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 4 01:17:09.089053 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 4 01:17:09.089226 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]
Mar 4 01:17:09.089406 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 4 01:17:09.091641 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 4 01:17:09.091846 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 4 01:17:09.092022 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]
Mar 4 01:17:09.092203 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 4 01:17:09.092396 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 4 01:17:09.092602 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 4 01:17:09.092787 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]
Mar 4 01:17:09.092961 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 4 01:17:09.093145 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 4 01:17:09.093324 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 4 01:17:09.093523 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]
Mar 4 01:17:09.095742 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 4 01:17:09.095964 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 4 01:17:09.096159 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 4 01:17:09.096356 kernel: pci 0000:00:02.6: bridge window [io 0x6000-0x6fff]
Mar 4 01:17:09.096576 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 4 01:17:09.096765 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 4 01:17:09.097038 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 4 01:17:09.097329 kernel: pci 0000:00:02.7: bridge window [io 0x7000-0x7fff]
Mar 4 01:17:09.099566 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 4 01:17:09.099768 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 4 01:17:09.099976 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 4 01:17:09.100168 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 4 01:17:09.100329 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 4 01:17:09.100502 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Mar 4 01:17:09.100725 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 4 01:17:09.100885 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Mar 4 01:17:09.101104 kernel: pci_bus 0000:01: resource 0 [io 0xc000-0xcfff]
Mar 4 01:17:09.101281 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Mar 4 01:17:09.101469 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 4 01:17:09.103686 kernel: pci_bus 0000:02: resource 0 [io 0xc000-0xcfff]
Mar 4 01:17:09.103866 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 4 01:17:09.104041 kernel: pci_bus
0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Mar 4 01:17:09.104236 kernel: pci_bus 0000:03: resource 0 [io 0x1000-0x1fff] Mar 4 01:17:09.104404 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Mar 4 01:17:09.104608 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Mar 4 01:17:09.104822 kernel: pci_bus 0000:04: resource 0 [io 0x2000-0x2fff] Mar 4 01:17:09.104990 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Mar 4 01:17:09.105163 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Mar 4 01:17:09.105356 kernel: pci_bus 0000:05: resource 0 [io 0x3000-0x3fff] Mar 4 01:17:09.105563 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Mar 4 01:17:09.105735 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Mar 4 01:17:09.105923 kernel: pci_bus 0000:06: resource 0 [io 0x4000-0x4fff] Mar 4 01:17:09.106112 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Mar 4 01:17:09.106290 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Mar 4 01:17:09.106514 kernel: pci_bus 0000:07: resource 0 [io 0x5000-0x5fff] Mar 4 01:17:09.106708 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Mar 4 01:17:09.106877 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Mar 4 01:17:09.107084 kernel: pci_bus 0000:08: resource 0 [io 0x6000-0x6fff] Mar 4 01:17:09.107280 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Mar 4 01:17:09.107494 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Mar 4 01:17:09.107701 kernel: pci_bus 0000:09: resource 0 [io 0x7000-0x7fff] Mar 4 01:17:09.107886 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Mar 4 01:17:09.108064 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Mar 4 01:17:09.108085 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 4 01:17:09.108098 kernel: PCI: CLS 0 bytes, 
default 64 Mar 4 01:17:09.108119 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 4 01:17:09.108132 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Mar 4 01:17:09.108145 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 4 01:17:09.108158 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Mar 4 01:17:09.108171 kernel: Initialise system trusted keyrings Mar 4 01:17:09.108184 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 4 01:17:09.108197 kernel: Key type asymmetric registered Mar 4 01:17:09.108209 kernel: Asymmetric key parser 'x509' registered Mar 4 01:17:09.108222 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 4 01:17:09.108240 kernel: io scheduler mq-deadline registered Mar 4 01:17:09.108253 kernel: io scheduler kyber registered Mar 4 01:17:09.108265 kernel: io scheduler bfq registered Mar 4 01:17:09.108456 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 4 01:17:09.108665 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 4 01:17:09.108850 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.109052 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 4 01:17:09.109242 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 4 01:17:09.109448 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.109658 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 4 01:17:09.109846 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 4 01:17:09.110039 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 
01:17:09.110233 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 4 01:17:09.110457 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 4 01:17:09.110734 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.110922 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 4 01:17:09.111107 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 4 01:17:09.111283 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.111473 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 4 01:17:09.111687 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 4 01:17:09.111872 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.112048 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 4 01:17:09.112233 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 4 01:17:09.112415 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.112654 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 4 01:17:09.112838 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 4 01:17:09.113022 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 4 01:17:09.113042 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 4 01:17:09.113056 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 4 01:17:09.113069 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 4 01:17:09.113082 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 4 01:17:09.113095 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, 
base_baud = 115200) is a 16550A Mar 4 01:17:09.113114 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 4 01:17:09.113128 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 4 01:17:09.113145 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 4 01:17:09.113342 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 4 01:17:09.113363 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 4 01:17:09.113594 kernel: rtc_cmos 00:03: registered as rtc0 Mar 4 01:17:09.113763 kernel: rtc_cmos 00:03: setting system clock to 2026-03-04T01:17:08 UTC (1772587028) Mar 4 01:17:09.113938 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 4 01:17:09.113958 kernel: intel_pstate: CPU model not supported Mar 4 01:17:09.113980 kernel: NET: Registered PF_INET6 protocol family Mar 4 01:17:09.114001 kernel: Segment Routing with IPv6 Mar 4 01:17:09.114014 kernel: In-situ OAM (IOAM) with IPv6 Mar 4 01:17:09.114034 kernel: NET: Registered PF_PACKET protocol family Mar 4 01:17:09.114047 kernel: Key type dns_resolver registered Mar 4 01:17:09.114059 kernel: IPI shorthand broadcast: enabled Mar 4 01:17:09.114072 kernel: sched_clock: Marking stable (1504003696, 219079846)->(1866366198, -143282656) Mar 4 01:17:09.114085 kernel: registered taskstats version 1 Mar 4 01:17:09.114097 kernel: Loading compiled-in X.509 certificates Mar 4 01:17:09.114110 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: be1dcbe3e3dee66976c19d61f4b179b405e1c498' Mar 4 01:17:09.114128 kernel: Key type .fscrypt registered Mar 4 01:17:09.114140 kernel: Key type fscrypt-provisioning registered Mar 4 01:17:09.114152 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 4 01:17:09.114165 kernel: ima: Allocated hash algorithm: sha1
Mar 4 01:17:09.114178 kernel: ima: No architecture policies found
Mar 4 01:17:09.114190 kernel: clk: Disabling unused clocks
Mar 4 01:17:09.114211 kernel: Freeing unused kernel image (initmem) memory: 42892K
Mar 4 01:17:09.114224 kernel: Write protecting the kernel read-only data: 36864k
Mar 4 01:17:09.114237 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K
Mar 4 01:17:09.114256 kernel: Run /init as init process
Mar 4 01:17:09.114269 kernel: with arguments:
Mar 4 01:17:09.114281 kernel: /init
Mar 4 01:17:09.114293 kernel: with environment:
Mar 4 01:17:09.114305 kernel: HOME=/
Mar 4 01:17:09.114318 kernel: TERM=linux
Mar 4 01:17:09.114333 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 4 01:17:09.114349 systemd[1]: Detected virtualization kvm.
Mar 4 01:17:09.114376 systemd[1]: Detected architecture x86-64.
Mar 4 01:17:09.114389 systemd[1]: Running in initrd.
Mar 4 01:17:09.114403 systemd[1]: No hostname configured, using default hostname.
Mar 4 01:17:09.114422 systemd[1]: Hostname set to .
Mar 4 01:17:09.114446 systemd[1]: Initializing machine ID from VM UUID.
Mar 4 01:17:09.114460 systemd[1]: Queued start job for default target initrd.target.
Mar 4 01:17:09.114476 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 4 01:17:09.114489 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 4 01:17:09.114509 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 4 01:17:09.114523 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 4 01:17:09.114572 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 4 01:17:09.114590 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 4 01:17:09.114605 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 4 01:17:09.114619 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 4 01:17:09.114633 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 4 01:17:09.114654 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 4 01:17:09.114667 systemd[1]: Reached target paths.target - Path Units.
Mar 4 01:17:09.114680 systemd[1]: Reached target slices.target - Slice Units.
Mar 4 01:17:09.114694 systemd[1]: Reached target swap.target - Swaps.
Mar 4 01:17:09.114707 systemd[1]: Reached target timers.target - Timer Units.
Mar 4 01:17:09.114720 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 4 01:17:09.114734 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 4 01:17:09.114747 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 4 01:17:09.114774 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 4 01:17:09.114787 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 4 01:17:09.114801 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 4 01:17:09.114815 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 4 01:17:09.114828 systemd[1]: Reached target sockets.target - Socket Units.
Mar 4 01:17:09.114847 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 4 01:17:09.114860 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 4 01:17:09.114874 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 4 01:17:09.114892 systemd[1]: Starting systemd-fsck-usr.service...
Mar 4 01:17:09.114911 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 4 01:17:09.114924 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 4 01:17:09.114938 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 01:17:09.115002 systemd-journald[202]: Collecting audit messages is disabled.
Mar 4 01:17:09.115038 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 4 01:17:09.115059 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 4 01:17:09.115073 systemd[1]: Finished systemd-fsck-usr.service.
Mar 4 01:17:09.115087 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 4 01:17:09.115105 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 4 01:17:09.115119 kernel: Bridge firewalling registered
Mar 4 01:17:09.115133 systemd-journald[202]: Journal started
Mar 4 01:17:09.115157 systemd-journald[202]: Runtime Journal (/run/log/journal/88995a24e03e4e04a60cd37092ca54a6) is 4.7M, max 38.0M, 33.2M free.
Mar 4 01:17:09.069238 systemd-modules-load[203]: Inserted module 'overlay'
Mar 4 01:17:09.100046 systemd-modules-load[203]: Inserted module 'br_netfilter'
Mar 4 01:17:09.164474 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 4 01:17:09.164484 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 4 01:17:09.165620 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 01:17:09.174835 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 4 01:17:09.189805 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 4 01:17:09.195733 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 4 01:17:09.202737 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 4 01:17:09.207081 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 4 01:17:09.214661 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 4 01:17:09.225773 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 4 01:17:09.229382 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 4 01:17:09.230744 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 4 01:17:09.242665 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 4 01:17:09.249389 dracut-cmdline[231]: dracut-dracut-053
Mar 4 01:17:09.249704 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 4 01:17:09.256186 dracut-cmdline[231]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=cfbb17c272ffeca64391861cc763ec4868ca597850b31cbd6ed67c590a72edc7
Mar 4 01:17:09.295633 systemd-resolved[243]: Positive Trust Anchors:
Mar 4 01:17:09.295651 systemd-resolved[243]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 4 01:17:09.295692 systemd-resolved[243]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 4 01:17:09.302362 systemd-resolved[243]: Defaulting to hostname 'linux'.
Mar 4 01:17:09.305385 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 4 01:17:09.306528 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 4 01:17:09.368578 kernel: SCSI subsystem initialized
Mar 4 01:17:09.379581 kernel: Loading iSCSI transport class v2.0-870.
Mar 4 01:17:09.392575 kernel: iscsi: registered transport (tcp)
Mar 4 01:17:09.417992 kernel: iscsi: registered transport (qla4xxx)
Mar 4 01:17:09.418083 kernel: QLogic iSCSI HBA Driver
Mar 4 01:17:09.474213 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 4 01:17:09.488855 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 4 01:17:09.536600 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 4 01:17:09.536699 kernel: device-mapper: uevent: version 1.0.3
Mar 4 01:17:09.539987 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 4 01:17:09.587606 kernel: raid6: sse2x4 gen() 13672 MB/s
Mar 4 01:17:09.604581 kernel: raid6: sse2x2 gen() 9622 MB/s
Mar 4 01:17:09.623145 kernel: raid6: sse2x1 gen() 9940 MB/s
Mar 4 01:17:09.623206 kernel: raid6: using algorithm sse2x4 gen() 13672 MB/s
Mar 4 01:17:09.642152 kernel: raid6: .... xor() 7821 MB/s, rmw enabled
Mar 4 01:17:09.642256 kernel: raid6: using ssse3x2 recovery algorithm
Mar 4 01:17:09.667589 kernel: xor: automatically using best checksumming function avx
Mar 4 01:17:09.855651 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 4 01:17:09.872788 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 4 01:17:09.879895 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 4 01:17:09.911117 systemd-udevd[420]: Using default interface naming scheme 'v255'.
Mar 4 01:17:09.919038 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 4 01:17:09.928748 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 4 01:17:09.950122 dracut-pre-trigger[427]: rd.md=0: removing MD RAID activation
Mar 4 01:17:09.991855 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 4 01:17:10.004893 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 4 01:17:10.132601 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 4 01:17:10.141806 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 4 01:17:10.174640 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 4 01:17:10.176598 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 4 01:17:10.179597 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 4 01:17:10.182268 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 4 01:17:10.191507 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 4 01:17:10.223836 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 4 01:17:10.276735 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Mar 4 01:17:10.279722 kernel: cryptd: max_cpu_qlen set to 1000
Mar 4 01:17:10.295564 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Mar 4 01:17:10.307116 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 4 01:17:10.307306 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 4 01:17:10.339602 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 4 01:17:10.339646 kernel: GPT:17805311 != 125829119
Mar 4 01:17:10.339666 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 4 01:17:10.339691 kernel: GPT:17805311 != 125829119
Mar 4 01:17:10.339707 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 4 01:17:10.339724 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 4 01:17:10.339740 kernel: AVX version of gcm_enc/dec engaged.
Mar 4 01:17:10.339757 kernel: AES CTR mode by8 optimization enabled
Mar 4 01:17:10.338326 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 4 01:17:10.339318 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 4 01:17:10.339571 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 01:17:10.341559 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 01:17:10.354863 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 01:17:10.365602 kernel: libata version 3.00 loaded.
Mar 4 01:17:10.373565 kernel: ahci 0000:00:1f.2: version 3.0
Mar 4 01:17:10.373830 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 4 01:17:10.378456 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Mar 4 01:17:10.378815 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 4 01:17:10.379050 kernel: scsi host0: ahci
Mar 4 01:17:10.385259 kernel: scsi host1: ahci
Mar 4 01:17:10.385603 kernel: scsi host2: ahci
Mar 4 01:17:10.385813 kernel: scsi host3: ahci
Mar 4 01:17:10.386022 kernel: scsi host4: ahci
Mar 4 01:17:10.395595 kernel: scsi host5: ahci
Mar 4 01:17:10.399652 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38
Mar 4 01:17:10.399730 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38
Mar 4 01:17:10.399760 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38
Mar 4 01:17:10.403402 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38
Mar 4 01:17:10.411098 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38
Mar 4 01:17:10.411154 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38
Mar 4 01:17:10.420595 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (474)
Mar 4 01:17:10.425122 kernel: BTRFS: device fsid 251c1416-ef37-47f1-be3f-832af5870605 devid 1 transid 40 /dev/vda3 scanned by (udev-worker) (467)
Mar 4 01:17:10.430560 kernel: ACPI: bus type USB registered
Mar 4 01:17:10.430605 kernel: usbcore: registered new interface driver usbfs
Mar 4 01:17:10.430636 kernel: usbcore: registered new interface driver hub
Mar 4 01:17:10.430654 kernel: usbcore: registered new device driver usb
Mar 4 01:17:10.463979 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 4 01:17:10.518037 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 4 01:17:10.519288 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 01:17:10.527501 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 4 01:17:10.533476 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 4 01:17:10.534362 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 4 01:17:10.546773 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 4 01:17:10.550727 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 4 01:17:10.559577 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 4 01:17:10.561989 disk-uuid[564]: Primary Header is updated.
Mar 4 01:17:10.561989 disk-uuid[564]: Secondary Entries is updated.
Mar 4 01:17:10.561989 disk-uuid[564]: Secondary Header is updated.
Mar 4 01:17:10.590860 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 4 01:17:10.723857 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 4 01:17:10.723925 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 4 01:17:10.729604 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 4 01:17:10.729645 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 4 01:17:10.730939 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 4 01:17:10.732685 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Mar 4 01:17:10.773930 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Mar 4 01:17:10.774317 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1
Mar 4 01:17:10.778594 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 4 01:17:10.782789 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Mar 4 01:17:10.783059 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2
Mar 4 01:17:10.783323 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed
Mar 4 01:17:10.786941 kernel: hub 1-0:1.0: USB hub found
Mar 4 01:17:10.787264 kernel: hub 1-0:1.0: 4 ports detected
Mar 4 01:17:10.787520 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 4 01:17:10.792000 kernel: hub 2-0:1.0: USB hub found
Mar 4 01:17:10.792283 kernel: hub 2-0:1.0: 4 ports detected
Mar 4 01:17:11.027578 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 4 01:17:11.168571 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 4 01:17:11.175157 kernel: usbcore: registered new interface driver usbhid
Mar 4 01:17:11.175229 kernel: usbhid: USB HID core driver
Mar 4 01:17:11.181564 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2
Mar 4 01:17:11.184565 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0
Mar 4 01:17:11.577606 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 4 01:17:11.577694 disk-uuid[566]: The operation has completed successfully.
Mar 4 01:17:11.632467 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 4 01:17:11.632672 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 4 01:17:11.655742 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 4 01:17:11.662103 sh[587]: Success
Mar 4 01:17:11.682575 kernel: device-mapper: verity: sha256 using implementation "sha256-avx"
Mar 4 01:17:11.755102 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 4 01:17:11.760057 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 4 01:17:11.761081 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 4 01:17:11.793820 kernel: BTRFS info (device dm-0): first mount of filesystem 251c1416-ef37-47f1-be3f-832af5870605
Mar 4 01:17:11.793886 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 4 01:17:11.795872 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 4 01:17:11.799264 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 4 01:17:11.799305 kernel: BTRFS info (device dm-0): using free space tree
Mar 4 01:17:11.810115 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 4 01:17:11.811677 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 4 01:17:11.822762 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 4 01:17:11.825315 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 4 01:17:11.845849 kernel: BTRFS info (device vda6): first mount of filesystem 71a972ce-abd4-4705-b1cd-2b663b77d747
Mar 4 01:17:11.845925 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 4 01:17:11.845945 kernel: BTRFS info (device vda6): using free space tree
Mar 4 01:17:11.855574 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 4 01:17:11.870281 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 4 01:17:11.873492 kernel: BTRFS info (device vda6): last unmount of filesystem 71a972ce-abd4-4705-b1cd-2b663b77d747
Mar 4 01:17:11.881524 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 4 01:17:11.890870 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 4 01:17:12.039150 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 4 01:17:12.047784 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 4 01:17:12.076917 ignition[687]: Ignition 2.19.0
Mar 4 01:17:12.076937 ignition[687]: Stage: fetch-offline
Mar 4 01:17:12.077002 ignition[687]: no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:12.081002 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 4 01:17:12.077022 ignition[687]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:12.077220 ignition[687]: parsed url from cmdline: ""
Mar 4 01:17:12.077227 ignition[687]: no config URL provided
Mar 4 01:17:12.077237 ignition[687]: reading system config file "/usr/lib/ignition/user.ign"
Mar 4 01:17:12.077254 ignition[687]: no config at "/usr/lib/ignition/user.ign"
Mar 4 01:17:12.077263 ignition[687]: failed to fetch config: resource requires networking
Mar 4 01:17:12.078607 ignition[687]: Ignition finished successfully
Mar 4 01:17:12.087177 systemd-networkd[775]: lo: Link UP
Mar 4 01:17:12.087184 systemd-networkd[775]: lo: Gained carrier
Mar 4 01:17:12.090297 systemd-networkd[775]: Enumeration completed
Mar 4 01:17:12.090764 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 4 01:17:12.091141 systemd-networkd[775]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 01:17:12.091147 systemd-networkd[775]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 4 01:17:12.092712 systemd-networkd[775]: eth0: Link UP
Mar 4 01:17:12.092718 systemd-networkd[775]: eth0: Gained carrier
Mar 4 01:17:12.092729 systemd-networkd[775]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 01:17:12.093766 systemd[1]: Reached target network.target - Network.
Mar 4 01:17:12.103747 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 4 01:17:12.126220 ignition[778]: Ignition 2.19.0
Mar 4 01:17:12.126245 ignition[778]: Stage: fetch
Mar 4 01:17:12.126523 ignition[778]: no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:12.126570 ignition[778]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:12.126767 ignition[778]: parsed url from cmdline: ""
Mar 4 01:17:12.126773 ignition[778]: no config URL provided
Mar 4 01:17:12.126782 ignition[778]: reading system config file "/usr/lib/ignition/user.ign"
Mar 4 01:17:12.126799 ignition[778]: no config at "/usr/lib/ignition/user.ign"
Mar 4 01:17:12.126950 ignition[778]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 4 01:17:12.126967 ignition[778]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 4 01:17:12.127016 ignition[778]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 4 01:17:12.127226 ignition[778]: GET error: Get "http://169.254.169.254/openstack/latest/user_data": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 4 01:17:12.169755 systemd-networkd[775]: eth0: DHCPv4 address 10.243.77.214/30, gateway 10.243.77.213 acquired from 10.243.77.213
Mar 4 01:17:12.327834 ignition[778]: GET http://169.254.169.254/openstack/latest/user_data: attempt #2
Mar 4 01:17:12.387857 ignition[778]: GET result: OK
Mar 4 01:17:12.388292 ignition[778]: parsing config with SHA512: 8db6424b6f2776e3d022e922e057440c51c30a04fe76ca7f33f380fde60e101853d409dac41abea5515a564f209161e926eaebaed7e14735eedc7a964f2deff4
Mar 4 01:17:12.394443 unknown[778]: fetched base config from "system"
Mar 4 01:17:12.394463 unknown[778]: fetched base config from "system"
Mar 4 01:17:12.395762 ignition[778]: fetch: fetch complete
Mar 4 01:17:12.394479 unknown[778]: fetched user config from "openstack"
Mar 4 01:17:12.395771 ignition[778]: fetch: fetch passed
Mar 4 01:17:12.400195 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 4 01:17:12.395852 ignition[778]: Ignition finished successfully
Mar 4 01:17:12.406795 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 4 01:17:12.427848 ignition[785]: Ignition 2.19.0
Mar 4 01:17:12.427862 ignition[785]: Stage: kargs
Mar 4 01:17:12.428126 ignition[785]: no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:12.428146 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:12.429403 ignition[785]: kargs: kargs passed
Mar 4 01:17:12.431867 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 4 01:17:12.429472 ignition[785]: Ignition finished successfully
Mar 4 01:17:12.503904 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 4 01:17:12.547973 ignition[791]: Ignition 2.19.0
Mar 4 01:17:12.549185 ignition[791]: Stage: disks
Mar 4 01:17:12.550143 ignition[791]: no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:12.550959 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:12.555638 ignition[791]: disks: disks passed
Mar 4 01:17:12.556401 ignition[791]: Ignition finished successfully
Mar 4 01:17:12.558588 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 4 01:17:12.560030 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 4 01:17:12.561615 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 4 01:17:12.562382 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 4 01:17:12.563998 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 4 01:17:12.565361 systemd[1]: Reached target basic.target - Basic System.
Mar 4 01:17:12.571755 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 4 01:17:12.594073 systemd-fsck[800]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 4 01:17:12.598270 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 4 01:17:12.604669 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 4 01:17:12.723574 kernel: EXT4-fs (vda9): mounted filesystem 77c4d29a-0423-4e33-8b82-61754d97532c r/w with ordered data mode. Quota mode: none.
Mar 4 01:17:12.724869 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 4 01:17:12.726218 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 4 01:17:12.733645 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 4 01:17:12.737667 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 4 01:17:12.739658 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 4 01:17:12.743749 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 4 01:17:12.745595 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 4 01:17:12.749943 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (808)
Mar 4 01:17:12.745639 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 4 01:17:12.756913 kernel: BTRFS info (device vda6): first mount of filesystem 71a972ce-abd4-4705-b1cd-2b663b77d747
Mar 4 01:17:12.756943 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 4 01:17:12.756962 kernel: BTRFS info (device vda6): using free space tree
Mar 4 01:17:12.763574 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 4 01:17:12.763392 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 4 01:17:12.767616 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 4 01:17:12.780979 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 4 01:17:12.854034 initrd-setup-root[836]: cut: /sysroot/etc/passwd: No such file or directory
Mar 4 01:17:12.867880 initrd-setup-root[843]: cut: /sysroot/etc/group: No such file or directory
Mar 4 01:17:12.872851 initrd-setup-root[850]: cut: /sysroot/etc/shadow: No such file or directory
Mar 4 01:17:12.880527 initrd-setup-root[857]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 4 01:17:12.999747 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 4 01:17:13.003711 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 4 01:17:13.006724 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 4 01:17:13.022837 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 4 01:17:13.025030 kernel: BTRFS info (device vda6): last unmount of filesystem 71a972ce-abd4-4705-b1cd-2b663b77d747
Mar 4 01:17:13.050136 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 4 01:17:13.070628 ignition[925]: INFO : Ignition 2.19.0
Mar 4 01:17:13.070628 ignition[925]: INFO : Stage: mount
Mar 4 01:17:13.072446 ignition[925]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:13.072446 ignition[925]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:13.076175 ignition[925]: INFO : mount: mount passed
Mar 4 01:17:13.076175 ignition[925]: INFO : Ignition finished successfully
Mar 4 01:17:13.075427 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 4 01:17:14.143794 systemd-networkd[775]: eth0: Gained IPv6LL
Mar 4 01:17:15.653627 systemd-networkd[775]: eth0: Ignoring DHCPv6 address 2a02:1348:17c:d375:24:19ff:fef3:4dd6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17c:d375:24:19ff:fef3:4dd6/64 assigned by NDisc.
Mar 4 01:17:15.653644 systemd-networkd[775]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Mar 4 01:17:19.928217 coreos-metadata[810]: Mar 04 01:17:19.928 WARN failed to locate config-drive, using the metadata service API instead
Mar 4 01:17:19.950771 coreos-metadata[810]: Mar 04 01:17:19.950 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 4 01:17:19.965878 coreos-metadata[810]: Mar 04 01:17:19.965 INFO Fetch successful
Mar 4 01:17:19.967633 coreos-metadata[810]: Mar 04 01:17:19.967 INFO wrote hostname srv-8wmcq.gb1.brightbox.com to /sysroot/etc/hostname
Mar 4 01:17:19.968956 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 4 01:17:19.969152 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 4 01:17:19.976669 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 4 01:17:20.003888 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 4 01:17:20.019755 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (941)
Mar 4 01:17:20.019814 kernel: BTRFS info (device vda6): first mount of filesystem 71a972ce-abd4-4705-b1cd-2b663b77d747
Mar 4 01:17:20.023230 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 4 01:17:20.023261 kernel: BTRFS info (device vda6): using free space tree
Mar 4 01:17:20.027555 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 4 01:17:20.030648 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 4 01:17:20.069094 ignition[958]: INFO : Ignition 2.19.0
Mar 4 01:17:20.069094 ignition[958]: INFO : Stage: files
Mar 4 01:17:20.071062 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:20.071062 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:20.071062 ignition[958]: DEBUG : files: compiled without relabeling support, skipping
Mar 4 01:17:20.073836 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 4 01:17:20.073836 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 4 01:17:20.075808 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 4 01:17:20.075808 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 4 01:17:20.077674 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 4 01:17:20.076217 unknown[958]: wrote ssh authorized keys file for user: core
Mar 4 01:17:20.079731 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 4 01:17:20.079731 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 4 01:17:20.273473 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 4 01:17:20.658114 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 4 01:17:20.675730 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 4 01:17:20.675730 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 4 01:17:20.675730 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 4 01:17:20.675730 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 4 01:17:20.675730 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 4 01:17:20.675730 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1
Mar 4 01:17:21.066409 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 4 01:17:23.224768 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 4 01:17:23.224768 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 4 01:17:23.230861 ignition[958]: INFO : files: files passed
Mar 4 01:17:23.242136 ignition[958]: INFO : Ignition finished successfully
Mar 4 01:17:23.235688 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 4 01:17:23.256021 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 4 01:17:23.260771 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 4 01:17:23.264945 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 4 01:17:23.265128 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 4 01:17:23.294167 initrd-setup-root-after-ignition[988]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 4 01:17:23.294167 initrd-setup-root-after-ignition[988]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 4 01:17:23.298675 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 4 01:17:23.300693 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 4 01:17:23.301838 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 4 01:17:23.311881 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 4 01:17:23.357590 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 4 01:17:23.358617 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 4 01:17:23.361099 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 4 01:17:23.361891 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 4 01:17:23.363454 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 4 01:17:23.370922 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 4 01:17:23.389951 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 4 01:17:23.399793 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 4 01:17:23.414365 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 4 01:17:23.416556 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 4 01:17:23.418448 systemd[1]: Stopped target timers.target - Timer Units.
Mar 4 01:17:23.419984 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 4 01:17:23.420219 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 4 01:17:23.422687 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 4 01:17:23.423706 systemd[1]: Stopped target basic.target - Basic System.
Mar 4 01:17:23.424965 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 4 01:17:23.426249 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 4 01:17:23.427851 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 4 01:17:23.429349 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 4 01:17:23.430820 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 4 01:17:23.432300 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 4 01:17:23.433802 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 4 01:17:23.435199 systemd[1]: Stopped target swap.target - Swaps.
Mar 4 01:17:23.436455 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 4 01:17:23.436804 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 4 01:17:23.438355 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 4 01:17:23.439331 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 4 01:17:23.440778 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 4 01:17:23.441194 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 4 01:17:23.442507 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 4 01:17:23.442839 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 4 01:17:23.444457 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 4 01:17:23.444651 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 4 01:17:23.446437 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 4 01:17:23.446734 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 4 01:17:23.458041 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 4 01:17:23.462797 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 4 01:17:23.463561 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 4 01:17:23.463851 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 4 01:17:23.466820 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 4 01:17:23.467629 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 4 01:17:23.477429 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 4 01:17:23.477589 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 4 01:17:23.508436 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 4 01:17:23.511926 ignition[1012]: INFO : Ignition 2.19.0
Mar 4 01:17:23.511926 ignition[1012]: INFO : Stage: umount
Mar 4 01:17:23.513887 ignition[1012]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 4 01:17:23.515745 ignition[1012]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 4 01:17:23.514499 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 4 01:17:23.518797 ignition[1012]: INFO : umount: umount passed
Mar 4 01:17:23.518797 ignition[1012]: INFO : Ignition finished successfully
Mar 4 01:17:23.514738 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 4 01:17:23.518566 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 4 01:17:23.518761 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 4 01:17:23.520639 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 4 01:17:23.520768 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 4 01:17:23.521963 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 4 01:17:23.522036 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 4 01:17:23.523282 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 4 01:17:23.523350 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 4 01:17:23.524746 systemd[1]: Stopped target network.target - Network.
Mar 4 01:17:23.526027 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 4 01:17:23.526125 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 4 01:17:23.527477 systemd[1]: Stopped target paths.target - Path Units.
Mar 4 01:17:23.528764 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 4 01:17:23.533646 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 4 01:17:23.534973 systemd[1]: Stopped target slices.target - Slice Units.
Mar 4 01:17:23.536273 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 4 01:17:23.537836 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 4 01:17:23.537923 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 4 01:17:23.539359 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 4 01:17:23.539426 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 4 01:17:23.540634 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 4 01:17:23.540735 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 4 01:17:23.542008 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 4 01:17:23.542094 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 4 01:17:23.543290 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 4 01:17:23.543360 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 4 01:17:23.545052 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 4 01:17:23.546835 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 4 01:17:23.549756 systemd-networkd[775]: eth0: DHCPv6 lease lost
Mar 4 01:17:23.552519 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 4 01:17:23.554431 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 4 01:17:23.560021 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 4 01:17:23.560257 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 4 01:17:23.565980 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 4 01:17:23.566717 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 4 01:17:23.571713 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 4 01:17:23.572425 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 4 01:17:23.572516 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 4 01:17:23.573421 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 4 01:17:23.573494 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 4 01:17:23.574772 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 4 01:17:23.574840 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 4 01:17:23.576746 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 4 01:17:23.576820 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 4 01:17:23.582436 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 4 01:17:23.603935 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 4 01:17:23.604243 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 4 01:17:23.608812 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 4 01:17:23.608982 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 4 01:17:23.610806 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 4 01:17:23.610891 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 4 01:17:23.612012 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 4 01:17:23.612073 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 4 01:17:23.613499 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 4 01:17:23.613604 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 4 01:17:23.615735 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 4 01:17:23.615804 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 4 01:17:23.617253 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 4 01:17:23.617327 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 4 01:17:23.625805 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 4 01:17:23.626600 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 4 01:17:23.626687 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 4 01:17:23.628412 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 4 01:17:23.628483 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 4 01:17:23.630743 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 4 01:17:23.630821 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 4 01:17:23.631622 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 4 01:17:23.631691 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 01:17:23.651375 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 4 01:17:23.651648 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 4 01:17:23.653962 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 4 01:17:23.660874 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 4 01:17:23.682936 systemd[1]: Switching root.
Mar 4 01:17:23.728306 systemd-journald[202]: Journal stopped
Mar 4 01:17:25.352079 systemd-journald[202]: Received SIGTERM from PID 1 (systemd).
Mar 4 01:17:25.352209 kernel: SELinux: policy capability network_peer_controls=1
Mar 4 01:17:25.352247 kernel: SELinux: policy capability open_perms=1
Mar 4 01:17:25.352269 kernel: SELinux: policy capability extended_socket_class=1
Mar 4 01:17:25.352288 kernel: SELinux: policy capability always_check_network=0
Mar 4 01:17:25.352323 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 4 01:17:25.352359 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 4 01:17:25.352387 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 4 01:17:25.352406 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 4 01:17:25.352425 kernel: audit: type=1403 audit(1772587044.114:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 4 01:17:25.352455 systemd[1]: Successfully loaded SELinux policy in 60.986ms.
Mar 4 01:17:25.352492 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.488ms.
Mar 4 01:17:25.352515 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 4 01:17:25.352576 systemd[1]: Detected virtualization kvm.
Mar 4 01:17:25.352600 systemd[1]: Detected architecture x86-64.
Mar 4 01:17:25.354610 systemd[1]: Detected first boot.
Mar 4 01:17:25.354639 systemd[1]: Hostname set to .
Mar 4 01:17:25.354661 systemd[1]: Initializing machine ID from VM UUID.
Mar 4 01:17:25.354710 zram_generator::config[1057]: No configuration found.
Mar 4 01:17:25.354742 systemd[1]: Populated /etc with preset unit settings.
Mar 4 01:17:25.354764 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 4 01:17:25.354800 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 4 01:17:25.354823 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 4 01:17:25.354845 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 4 01:17:25.354866 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 4 01:17:25.354887 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 4 01:17:25.354908 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 4 01:17:25.354928 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 4 01:17:25.354949 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 4 01:17:25.354985 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 4 01:17:25.355009 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 4 01:17:25.355030 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 4 01:17:25.355062 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 4 01:17:25.355085 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 4 01:17:25.355106 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 4 01:17:25.355197 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 4 01:17:25.355219 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 4 01:17:25.355240 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 4 01:17:25.355277 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 4 01:17:25.355301 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 4 01:17:25.355323 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 4 01:17:25.355353 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 4 01:17:25.355374 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 4 01:17:25.355395 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 4 01:17:25.355429 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 4 01:17:25.355453 systemd[1]: Reached target slices.target - Slice Units.
Mar 4 01:17:25.355487 systemd[1]: Reached target swap.target - Swaps.
Mar 4 01:17:25.355531 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 4 01:17:25.360229 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 4 01:17:25.360265 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 4 01:17:25.360290 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 4 01:17:25.360328 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 4 01:17:25.360352 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 4 01:17:25.360393 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 4 01:17:25.360416 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 4 01:17:25.360437 systemd[1]: Mounting media.mount - External Media Directory...
Mar 4 01:17:25.360457 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:25.360479 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 4 01:17:25.360500 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 4 01:17:25.360557 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 4 01:17:25.360585 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 4 01:17:25.360606 systemd[1]: Reached target machines.target - Containers.
Mar 4 01:17:25.360640 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 4 01:17:25.360660 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 01:17:25.360690 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 4 01:17:25.360712 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 4 01:17:25.360744 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 4 01:17:25.360763 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 4 01:17:25.360796 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 4 01:17:25.360819 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 4 01:17:25.360838 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 4 01:17:25.360858 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 4 01:17:25.360878 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 4 01:17:25.360897 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 4 01:17:25.360917 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 4 01:17:25.360949 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 4 01:17:25.360982 kernel: fuse: init (API version 7.39)
Mar 4 01:17:25.361005 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 4 01:17:25.361049 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 4 01:17:25.361072 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 4 01:17:25.361095 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 4 01:17:25.361115 kernel: loop: module loaded
Mar 4 01:17:25.361135 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 4 01:17:25.361157 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 4 01:17:25.361187 systemd[1]: Stopped verity-setup.service.
Mar 4 01:17:25.361224 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:25.361247 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 4 01:17:25.361308 systemd-journald[1150]: Collecting audit messages is disabled.
Mar 4 01:17:25.361348 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 4 01:17:25.361371 systemd[1]: Mounted media.mount - External Media Directory.
Mar 4 01:17:25.361403 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 4 01:17:25.361439 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 4 01:17:25.361462 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 4 01:17:25.361483 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 4 01:17:25.361505 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 4 01:17:25.361526 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 4 01:17:25.361562 systemd-journald[1150]: Journal started
Mar 4 01:17:25.361644 systemd-journald[1150]: Runtime Journal (/run/log/journal/88995a24e03e4e04a60cd37092ca54a6) is 4.7M, max 38.0M, 33.2M free.
Mar 4 01:17:24.955133 systemd[1]: Queued start job for default target multi-user.target.
Mar 4 01:17:24.978344 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 4 01:17:24.979143 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 4 01:17:25.367575 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 4 01:17:25.367869 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 4 01:17:25.368119 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 4 01:17:25.371815 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 4 01:17:25.373009 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 4 01:17:25.373253 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 4 01:17:25.374414 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 4 01:17:25.374646 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 4 01:17:25.375927 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 4 01:17:25.376163 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 4 01:17:25.377235 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 4 01:17:25.378456 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 4 01:17:25.379838 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 4 01:17:25.395203 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 4 01:17:25.409646 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 4 01:17:25.425629 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 4 01:17:25.426487 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 4 01:17:25.426552 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 4 01:17:25.429276 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 4 01:17:25.436795 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 4 01:17:25.444691 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 4 01:17:25.445619 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 01:17:25.449797 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 4 01:17:25.452659 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 4 01:17:25.453993 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 4 01:17:25.455593 kernel: ACPI: bus type drm_connector registered
Mar 4 01:17:25.462185 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 4 01:17:25.463579 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 4 01:17:25.466811 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 4 01:17:25.476802 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 4 01:17:25.487754 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 4 01:17:25.491099 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 4 01:17:25.492641 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 4 01:17:25.494796 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 4 01:17:25.499281 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 4 01:17:25.501312 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 4 01:17:25.503170 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 4 01:17:25.515400 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 4 01:17:25.543684 systemd-journald[1150]: Time spent on flushing to /var/log/journal/88995a24e03e4e04a60cd37092ca54a6 is 185.223ms for 1146 entries.
Mar 4 01:17:25.543684 systemd-journald[1150]: System Journal (/var/log/journal/88995a24e03e4e04a60cd37092ca54a6) is 8.0M, max 584.8M, 576.8M free.
Mar 4 01:17:25.830798 systemd-journald[1150]: Received client request to flush runtime journal.
Mar 4 01:17:25.830936 kernel: loop0: detected capacity change from 0 to 219192
Mar 4 01:17:25.831043 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 4 01:17:25.831082 kernel: loop1: detected capacity change from 0 to 140768
Mar 4 01:17:25.831129 kernel: loop2: detected capacity change from 0 to 8
Mar 4 01:17:25.538741 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 4 01:17:25.652796 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 4 01:17:25.654141 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 4 01:17:25.700613 systemd-tmpfiles[1190]: ACLs are not supported, ignoring.
Mar 4 01:17:25.700633 systemd-tmpfiles[1190]: ACLs are not supported, ignoring.
Mar 4 01:17:25.731749 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 4 01:17:25.767102 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 4 01:17:25.785911 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 4 01:17:25.857008 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 4 01:17:25.892571 kernel: loop3: detected capacity change from 0 to 142488
Mar 4 01:17:25.933723 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 4 01:17:25.942793 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 4 01:17:25.977584 kernel: loop4: detected capacity change from 0 to 219192
Mar 4 01:17:25.998549 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 4 01:17:26.008822 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 4 01:17:26.012688 kernel: loop5: detected capacity change from 0 to 140768
Mar 4 01:17:26.024197 systemd-tmpfiles[1213]: ACLs are not supported, ignoring.
Mar 4 01:17:26.031284 systemd-tmpfiles[1213]: ACLs are not supported, ignoring.
Mar 4 01:17:26.057080 kernel: loop6: detected capacity change from 0 to 8
Mar 4 01:17:26.072572 kernel: loop7: detected capacity change from 0 to 142488
Mar 4 01:17:26.075623 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 4 01:17:26.098439 (sd-merge)[1216]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 4 01:17:26.099874 (sd-merge)[1216]: Merged extensions into '/usr'.
Mar 4 01:17:26.107491 systemd[1]: Reloading requested from client PID 1189 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 4 01:17:26.107610 udevadm[1218]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 4 01:17:26.107797 systemd[1]: Reloading...
Mar 4 01:17:26.273383 zram_generator::config[1243]: No configuration found.
Mar 4 01:17:26.491115 ldconfig[1184]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 4 01:17:26.576820 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 4 01:17:26.643072 systemd[1]: Reloading finished in 533 ms.
Mar 4 01:17:26.674991 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 4 01:17:26.676727 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 4 01:17:26.690851 systemd[1]: Starting ensure-sysext.service...
Mar 4 01:17:26.700698 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 4 01:17:26.720688 systemd[1]: Reloading requested from client PID 1301 ('systemctl') (unit ensure-sysext.service)...
Mar 4 01:17:26.720712 systemd[1]: Reloading...
Mar 4 01:17:26.825352 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 4 01:17:26.826663 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 4 01:17:26.828274 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 4 01:17:26.828814 systemd-tmpfiles[1302]: ACLs are not supported, ignoring.
Mar 4 01:17:26.829057 systemd-tmpfiles[1302]: ACLs are not supported, ignoring.
Mar 4 01:17:26.838243 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot.
Mar 4 01:17:26.839791 systemd-tmpfiles[1302]: Skipping /boot
Mar 4 01:17:26.865314 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot.
Mar 4 01:17:26.865334 systemd-tmpfiles[1302]: Skipping /boot
Mar 4 01:17:26.929571 zram_generator::config[1325]: No configuration found.
Mar 4 01:17:27.115523 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 4 01:17:27.182604 systemd[1]: Reloading finished in 461 ms.
Mar 4 01:17:27.211287 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 4 01:17:27.217181 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 4 01:17:27.236674 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 4 01:17:27.242775 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 4 01:17:27.251756 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 4 01:17:27.257349 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 4 01:17:27.268848 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 4 01:17:27.277951 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 4 01:17:27.288919 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 4 01:17:27.294340 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:27.294649 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 01:17:27.305960 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 4 01:17:27.316951 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 4 01:17:27.320858 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 4 01:17:27.322754 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 01:17:27.322916 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:27.329321 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:27.330801 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 01:17:27.331057 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 01:17:27.331189 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:27.343385 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 4 01:17:27.349966 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 4 01:17:27.350229 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 4 01:17:27.355476 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:27.356131 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 4 01:17:27.366154 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 4 01:17:27.368161 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 4 01:17:27.368249 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 4 01:17:27.368308 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 4 01:17:27.370018 systemd[1]: Finished ensure-sysext.service.
Mar 4 01:17:27.372354 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 4 01:17:27.381712 augenrules[1414]: No rules
Mar 4 01:17:27.377189 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 4 01:17:27.384372 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 4 01:17:27.385631 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 4 01:17:27.400871 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 4 01:17:27.404429 systemd-udevd[1399]: Using default interface naming scheme 'v255'.
Mar 4 01:17:27.405760 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 4 01:17:27.408418 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 4 01:17:27.409053 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 4 01:17:27.412137 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 4 01:17:27.420112 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 4 01:17:27.420352 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 4 01:17:27.437181 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 4 01:17:27.438885 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 4 01:17:27.450160 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 4 01:17:27.457440 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 4 01:17:27.467799 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 4 01:17:27.477629 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 4 01:17:27.670706 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 4 01:17:27.672807 systemd[1]: Reached target time-set.target - System Time Set.
Mar 4 01:17:27.690439 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 4 01:17:27.693640 systemd-networkd[1434]: lo: Link UP
Mar 4 01:17:27.693652 systemd-networkd[1434]: lo: Gained carrier
Mar 4 01:17:27.695008 systemd-networkd[1434]: Enumeration completed
Mar 4 01:17:27.695132 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 4 01:17:27.701766 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 4 01:17:27.716242 systemd-resolved[1397]: Positive Trust Anchors:
Mar 4 01:17:27.716797 systemd-resolved[1397]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 4 01:17:27.716841 systemd-resolved[1397]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 4 01:17:27.737169 systemd-resolved[1397]: Using system hostname 'srv-8wmcq.gb1.brightbox.com'.
Mar 4 01:17:27.740282 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 4 01:17:27.741578 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (1432)
Mar 4 01:17:27.744167 systemd[1]: Reached target network.target - Network.
Mar 4 01:17:27.745852 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 4 01:17:27.820962 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 4 01:17:27.824566 kernel: mousedev: PS/2 mouse device common for all mice
Mar 4 01:17:27.826236 systemd-networkd[1434]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 01:17:27.826249 systemd-networkd[1434]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 4 01:17:27.835275 systemd-networkd[1434]: eth0: Link UP
Mar 4 01:17:27.835293 systemd-networkd[1434]: eth0: Gained carrier
Mar 4 01:17:27.835368 systemd-networkd[1434]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 4 01:17:27.862642 kernel: ACPI: button: Power Button [PWRF]
Mar 4 01:17:27.893699 systemd-networkd[1434]: eth0: DHCPv4 address 10.243.77.214/30, gateway 10.243.77.213 acquired from 10.243.77.213
Mar 4 01:17:27.895616 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection.
Mar 4 01:17:27.940575 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 4 01:17:27.948636 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 4 01:17:27.949698 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 4 01:17:27.957907 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Mar 4 01:17:27.991326 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 4 01:17:28.003423 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 4 01:17:28.025890 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 4 01:17:28.072278 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 4 01:17:28.253305 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 4 01:17:28.289563 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 4 01:17:28.299193 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 4 01:17:28.318779 lvm[1475]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 4 01:17:28.358598 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 4 01:17:28.359896 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 4 01:17:28.360665 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 4 01:17:28.361540 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 4 01:17:28.362522 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 4 01:17:28.363708 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 4 01:17:28.364612 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 4 01:17:28.365378 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 4 01:17:28.366155 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 4 01:17:28.366208 systemd[1]: Reached target paths.target - Path Units.
Mar 4 01:17:28.366874 systemd[1]: Reached target timers.target - Timer Units.
Mar 4 01:17:28.369533 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 4 01:17:28.377914 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 4 01:17:28.387822 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 4 01:17:28.391041 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 4 01:17:28.392633 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 4 01:17:28.393609 systemd[1]: Reached target sockets.target - Socket Units.
Mar 4 01:17:28.394261 systemd[1]: Reached target basic.target - Basic System.
Mar 4 01:17:28.394993 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 4 01:17:28.395068 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 4 01:17:28.401697 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 4 01:17:28.407781 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 4 01:17:28.410626 lvm[1479]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 4 01:17:28.418829 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 4 01:17:28.421493 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 4 01:17:28.428784 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 4 01:17:28.429620 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 4 01:17:28.438818 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 4 01:17:28.445330 jq[1483]: false
Mar 4 01:17:28.445712 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 4 01:17:28.448773 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 4 01:17:28.454791 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 4 01:17:28.470861 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 4 01:17:28.473770 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 4 01:17:28.475705 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 4 01:17:28.484779 systemd[1]: Starting update-engine.service - Update Engine...
Mar 4 01:17:28.490243 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 4 01:17:28.494230 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 4 01:17:28.505173 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 4 01:17:28.505467 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 4 01:17:28.506035 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 4 01:17:28.506251 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 4 01:17:28.520613 update_engine[1494]: I20260304 01:17:28.519559 1494 main.cc:92] Flatcar Update Engine starting
Mar 4 01:17:28.526457 jq[1495]: true
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found loop4
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found loop5
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found loop6
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found loop7
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda1
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda2
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda3
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found usr
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda4
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda6
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda7
Mar 4 01:17:28.538634 extend-filesystems[1484]: Found vda9
Mar 4 01:17:28.538634 extend-filesystems[1484]: Checking size of /dev/vda9
Mar 4 01:17:28.582084 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks
Mar 4 01:17:28.582599 extend-filesystems[1484]: Resized partition /dev/vda9
Mar 4 01:17:28.583930 extend-filesystems[1515]: resize2fs 1.47.1 (20-May-2024)
Mar 4 01:17:28.585257 jq[1503]: true
Mar 4 01:17:28.586445 systemd[1]: motdgen.service: Deactivated successfully.
Mar 4 01:17:28.586794 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 4 01:17:28.614743 (ntainerd)[1516]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 4 01:17:28.630477 tar[1505]: linux-amd64/LICENSE
Mar 4 01:17:28.630477 tar[1505]: linux-amd64/helm
Mar 4 01:17:28.654332 dbus-daemon[1482]: [system] SELinux support is enabled
Mar 4 01:17:28.654683 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 4 01:17:28.658990 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 4 01:17:28.659036 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 4 01:17:28.660410 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 4 01:17:28.660442 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 4 01:17:28.669309 dbus-daemon[1482]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1434 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Mar 4 01:17:28.674707 update_engine[1494]: I20260304 01:17:28.674070 1494 update_check_scheduler.cc:74] Next update check in 9m4s
Mar 4 01:17:28.675061 systemd[1]: Started update-engine.service - Update Engine.
Mar 4 01:17:28.675103 dbus-daemon[1482]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 4 01:17:28.690768 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Mar 4 01:17:28.701793 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 4 01:17:28.711698 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (1437) Mar 4 01:17:29.038186 systemd-logind[1492]: Watching system buttons on /dev/input/event2 (Power Button) Mar 4 01:17:29.038228 systemd-logind[1492]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 4 01:17:29.046775 bash[1540]: Updated "/home/core/.ssh/authorized_keys" Mar 4 01:17:29.042024 systemd-logind[1492]: New seat seat0. Mar 4 01:17:29.045280 systemd[1]: Started systemd-logind.service - User Login Management. Mar 4 01:17:29.052900 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 4 01:17:29.070152 systemd[1]: Starting sshkeys.service... Mar 4 01:17:29.122732 systemd-networkd[1434]: eth0: Gained IPv6LL Mar 4 01:17:29.132273 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 4 01:17:29.138496 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection. Mar 4 01:17:29.157376 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 4 01:17:29.159568 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 4 01:17:29.163481 systemd[1]: Reached target network-online.target - Network is Online. Mar 4 01:17:29.184769 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Mar 4 01:17:29.184104 dbus-daemon[1482]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 4 01:17:29.181489 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 4 01:17:29.190941 dbus-daemon[1482]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1529 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 4 01:17:29.185509 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 4 01:17:29.188467 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 4 01:17:29.202987 systemd[1]: Starting polkit.service - Authorization Manager... Mar 4 01:17:29.305035 extend-filesystems[1515]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 4 01:17:29.305035 extend-filesystems[1515]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 4 01:17:29.305035 extend-filesystems[1515]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Mar 4 01:17:29.335688 extend-filesystems[1484]: Resized filesystem in /dev/vda9 Mar 4 01:17:29.307781 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 4 01:17:29.308210 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 4 01:17:29.342628 polkitd[1553]: Started polkitd version 121 Mar 4 01:17:29.358497 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 4 01:17:29.374793 polkitd[1553]: Loading rules from directory /etc/polkit-1/rules.d Mar 4 01:17:29.374894 polkitd[1553]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 4 01:17:29.376596 polkitd[1553]: Finished loading, compiling and executing 2 rules Mar 4 01:17:29.378426 dbus-daemon[1482]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 4 01:17:29.378836 systemd[1]: Started polkit.service - Authorization Manager. 
Mar 4 01:17:29.382886 polkitd[1553]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 4 01:17:29.434204 systemd-hostnamed[1529]: Hostname set to (static) Mar 4 01:17:29.490773 locksmithd[1530]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 4 01:17:29.528445 containerd[1516]: time="2026-03-04T01:17:29.526149435Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 4 01:17:29.644649 containerd[1516]: time="2026-03-04T01:17:29.643143937Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.647835 containerd[1516]: time="2026-03-04T01:17:29.647786484Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 4 01:17:29.647967 containerd[1516]: time="2026-03-04T01:17:29.647929789Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 4 01:17:29.648097 containerd[1516]: time="2026-03-04T01:17:29.648072280Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.662909130Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.662995044Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663143033Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663168151Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663517551Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663564789Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663588251Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663605208Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.663777192Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.664242167Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 4 01:17:29.664574 containerd[1516]: time="2026-03-04T01:17:29.664381680Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 4 01:17:29.665110 containerd[1516]: time="2026-03-04T01:17:29.664406569Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 4 01:17:29.669506 containerd[1516]: time="2026-03-04T01:17:29.669101606Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 4 01:17:29.669506 containerd[1516]: time="2026-03-04T01:17:29.669237602Z" level=info msg="metadata content store policy set" policy=shared Mar 4 01:17:29.683076 containerd[1516]: time="2026-03-04T01:17:29.683017911Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 4 01:17:29.683423 containerd[1516]: time="2026-03-04T01:17:29.683371776Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 4 01:17:29.683492 containerd[1516]: time="2026-03-04T01:17:29.683467182Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 4 01:17:29.683573 containerd[1516]: time="2026-03-04T01:17:29.683499830Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 4 01:17:29.683573 containerd[1516]: time="2026-03-04T01:17:29.683523900Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 4 01:17:29.686335 containerd[1516]: time="2026-03-04T01:17:29.685726654Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 4 01:17:29.686335 containerd[1516]: time="2026-03-04T01:17:29.686070444Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." 
type=io.containerd.runtime.v2 Mar 4 01:17:29.686335 containerd[1516]: time="2026-03-04T01:17:29.686266730Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 4 01:17:29.686335 containerd[1516]: time="2026-03-04T01:17:29.686295776Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 4 01:17:29.686335 containerd[1516]: time="2026-03-04T01:17:29.686317739Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 4 01:17:29.686335 containerd[1516]: time="2026-03-04T01:17:29.686338199Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686358566Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686377933Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686400398Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686423537Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686455595Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686482281Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." 
type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686504379Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686566804Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686592899Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686612800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.686643 containerd[1516]: time="2026-03-04T01:17:29.686632436Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686680179Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686707741Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686727120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686746533Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686766713Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686788151Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." 
type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686805878Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686831122Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686850975Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686875322Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686912882Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686942217Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.686961575Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 4 01:17:29.688127 containerd[1516]: time="2026-03-04T01:17:29.687039800Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687073400Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687092473Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." 
type=io.containerd.internal.v1 Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687110290Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687128053Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687153395Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687170444Z" level=info msg="NRI interface is disabled by configuration." Mar 4 01:17:29.689672 containerd[1516]: time="2026-03-04T01:17:29.687206894Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.690177934Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: 
SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.690272218Z" level=info msg="Connect containerd service" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.690327968Z" level=info msg="using legacy CRI server" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.690349258Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.690483996Z" 
level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693611875Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693724442Z" level=info msg="Start subscribing containerd event" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693797849Z" level=info msg="Start recovering state" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693881537Z" level=info msg="Start event monitor" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693906128Z" level=info msg="Start snapshots syncer" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693973564Z" level=info msg="Start cni network conf syncer for default" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.693993780Z" level=info msg="Start streaming server" Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.694280316Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 4 01:17:29.691307 containerd[1516]: time="2026-03-04T01:17:29.694370490Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 4 01:17:29.724890 containerd[1516]: time="2026-03-04T01:17:29.718911453Z" level=info msg="containerd successfully booted in 0.200290s" Mar 4 01:17:29.719080 systemd[1]: Started containerd.service - containerd container runtime. Mar 4 01:17:29.926868 sshd_keygen[1524]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 4 01:17:30.065581 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 4 01:17:30.080409 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Mar 4 01:17:30.116366 systemd[1]: issuegen.service: Deactivated successfully. Mar 4 01:17:30.116716 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 4 01:17:30.128829 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 4 01:17:30.189837 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 4 01:17:30.204240 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 4 01:17:30.221194 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 4 01:17:30.222410 systemd[1]: Reached target getty.target - Login Prompts. Mar 4 01:17:30.636869 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection. Mar 4 01:17:30.643147 systemd-networkd[1434]: eth0: Ignoring DHCPv6 address 2a02:1348:17c:d375:24:19ff:fef3:4dd6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17c:d375:24:19ff:fef3:4dd6/64 assigned by NDisc. Mar 4 01:17:30.643157 systemd-networkd[1434]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 4 01:17:30.702422 tar[1505]: linux-amd64/README.md Mar 4 01:17:30.723834 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 4 01:17:31.384961 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:17:31.400954 (kubelet)[1608]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 01:17:31.682198 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection. 
Mar 4 01:17:32.237276 kubelet[1608]: E0304 01:17:32.237000 1608 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 01:17:32.244714 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 01:17:32.247373 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 4 01:17:32.249879 systemd[1]: kubelet.service: Consumed 1.891s CPU time. Mar 4 01:17:33.248596 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 4 01:17:33.262771 systemd[1]: Started sshd@0-10.243.77.214:22-20.161.92.111:60814.service - OpenSSH per-connection server daemon (20.161.92.111:60814). Mar 4 01:17:33.939428 sshd[1618]: Accepted publickey for core from 20.161.92.111 port 60814 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:33.948085 sshd[1618]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:33.978920 systemd-logind[1492]: New session 1 of user core. Mar 4 01:17:33.986441 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 4 01:17:34.003513 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 4 01:17:34.055848 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 4 01:17:34.073313 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 4 01:17:34.102810 (systemd)[1622]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 4 01:17:34.306814 systemd[1622]: Queued start job for default target default.target. Mar 4 01:17:34.322400 systemd[1622]: Created slice app.slice - User Application Slice. 
Mar 4 01:17:34.322478 systemd[1622]: Reached target paths.target - Paths. Mar 4 01:17:34.322511 systemd[1622]: Reached target timers.target - Timers. Mar 4 01:17:34.328380 systemd[1622]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 4 01:17:34.387922 systemd[1622]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 4 01:17:34.388352 systemd[1622]: Reached target sockets.target - Sockets. Mar 4 01:17:34.388400 systemd[1622]: Reached target basic.target - Basic System. Mar 4 01:17:34.389332 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 4 01:17:34.389392 systemd[1622]: Reached target default.target - Main User Target. Mar 4 01:17:34.389515 systemd[1622]: Startup finished in 270ms. Mar 4 01:17:34.410499 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 4 01:17:34.890694 systemd[1]: Started sshd@1-10.243.77.214:22-20.161.92.111:60830.service - OpenSSH per-connection server daemon (20.161.92.111:60830). Mar 4 01:17:35.293042 login[1596]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 4 01:17:35.295678 login[1597]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 4 01:17:35.306032 systemd-logind[1492]: New session 3 of user core. Mar 4 01:17:35.313885 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 4 01:17:35.319686 systemd-logind[1492]: New session 2 of user core. Mar 4 01:17:35.326002 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 4 01:17:35.507900 sshd[1633]: Accepted publickey for core from 20.161.92.111 port 60830 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:35.510646 sshd[1633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:35.517514 systemd-logind[1492]: New session 4 of user core. Mar 4 01:17:35.528117 systemd[1]: Started session-4.scope - Session 4 of User core. 
Mar 4 01:17:35.927650 sshd[1633]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:35.936090 systemd[1]: sshd@1-10.243.77.214:22-20.161.92.111:60830.service: Deactivated successfully. Mar 4 01:17:35.939715 systemd[1]: session-4.scope: Deactivated successfully. Mar 4 01:17:35.941001 systemd-logind[1492]: Session 4 logged out. Waiting for processes to exit. Mar 4 01:17:35.943639 systemd-logind[1492]: Removed session 4. Mar 4 01:17:36.032072 systemd[1]: Started sshd@2-10.243.77.214:22-20.161.92.111:60832.service - OpenSSH per-connection server daemon (20.161.92.111:60832). Mar 4 01:17:36.147976 coreos-metadata[1481]: Mar 04 01:17:36.147 WARN failed to locate config-drive, using the metadata service API instead Mar 4 01:17:36.173917 coreos-metadata[1481]: Mar 04 01:17:36.173 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 4 01:17:36.194575 coreos-metadata[1481]: Mar 04 01:17:36.194 INFO Fetch failed with 404: resource not found Mar 4 01:17:36.194575 coreos-metadata[1481]: Mar 04 01:17:36.194 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 4 01:17:36.195262 coreos-metadata[1481]: Mar 04 01:17:36.195 INFO Fetch successful Mar 4 01:17:36.195375 coreos-metadata[1481]: Mar 04 01:17:36.195 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 4 01:17:36.250631 coreos-metadata[1481]: Mar 04 01:17:36.250 INFO Fetch successful Mar 4 01:17:36.250960 coreos-metadata[1481]: Mar 04 01:17:36.250 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 4 01:17:36.273516 coreos-metadata[1481]: Mar 04 01:17:36.273 INFO Fetch successful Mar 4 01:17:36.273516 coreos-metadata[1481]: Mar 04 01:17:36.273 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 4 01:17:36.287756 coreos-metadata[1481]: Mar 04 01:17:36.287 INFO Fetch successful Mar 4 01:17:36.287976 coreos-metadata[1481]: Mar 04 01:17:36.287 INFO Fetching 
http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 4 01:17:36.377579 coreos-metadata[1481]: Mar 04 01:17:36.377 INFO Fetch successful Mar 4 01:17:36.417801 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 4 01:17:36.419025 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 4 01:17:36.614330 coreos-metadata[1547]: Mar 04 01:17:36.613 WARN failed to locate config-drive, using the metadata service API instead Mar 4 01:17:36.616568 sshd[1666]: Accepted publickey for core from 20.161.92.111 port 60832 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:36.618714 sshd[1666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:36.629647 systemd-logind[1492]: New session 5 of user core. Mar 4 01:17:36.633896 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 4 01:17:36.645266 coreos-metadata[1547]: Mar 04 01:17:36.645 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 4 01:17:36.670863 coreos-metadata[1547]: Mar 04 01:17:36.670 INFO Fetch successful Mar 4 01:17:36.671250 coreos-metadata[1547]: Mar 04 01:17:36.671 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 4 01:17:36.698249 coreos-metadata[1547]: Mar 04 01:17:36.698 INFO Fetch successful Mar 4 01:17:36.705452 unknown[1547]: wrote ssh authorized keys file for user: core Mar 4 01:17:36.727384 update-ssh-keys[1679]: Updated "/home/core/.ssh/authorized_keys" Mar 4 01:17:36.728415 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 4 01:17:36.731996 systemd[1]: Finished sshkeys.service. Mar 4 01:17:36.735403 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 4 01:17:36.735654 systemd[1]: Startup finished in 1.674s (kernel) + 15.356s (initrd) + 12.675s (userspace) = 29.706s. 
Mar 4 01:17:37.023087 sshd[1666]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:37.029108 systemd-logind[1492]: Session 5 logged out. Waiting for processes to exit. Mar 4 01:17:37.029776 systemd[1]: sshd@2-10.243.77.214:22-20.161.92.111:60832.service: Deactivated successfully. Mar 4 01:17:37.032209 systemd[1]: session-5.scope: Deactivated successfully. Mar 4 01:17:37.033681 systemd-logind[1492]: Removed session 5. Mar 4 01:17:42.477713 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 4 01:17:42.484836 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 01:17:42.829630 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:17:42.836996 (kubelet)[1692]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 01:17:42.909436 kubelet[1692]: E0304 01:17:42.909309 1692 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 01:17:42.913314 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 01:17:42.913626 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 4 01:17:47.141974 systemd[1]: Started sshd@3-10.243.77.214:22-20.161.92.111:33174.service - OpenSSH per-connection server daemon (20.161.92.111:33174). Mar 4 01:17:47.744588 sshd[1701]: Accepted publickey for core from 20.161.92.111 port 33174 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:47.746359 sshd[1701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:47.754623 systemd-logind[1492]: New session 6 of user core. 
Mar 4 01:17:47.763191 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 4 01:17:48.169044 sshd[1701]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:48.175187 systemd[1]: sshd@3-10.243.77.214:22-20.161.92.111:33174.service: Deactivated successfully. Mar 4 01:17:48.175343 systemd-logind[1492]: Session 6 logged out. Waiting for processes to exit. Mar 4 01:17:48.178675 systemd[1]: session-6.scope: Deactivated successfully. Mar 4 01:17:48.181428 systemd-logind[1492]: Removed session 6. Mar 4 01:17:48.276917 systemd[1]: Started sshd@4-10.243.77.214:22-20.161.92.111:33184.service - OpenSSH per-connection server daemon (20.161.92.111:33184). Mar 4 01:17:48.867984 sshd[1708]: Accepted publickey for core from 20.161.92.111 port 33184 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:48.870309 sshd[1708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:48.876904 systemd-logind[1492]: New session 7 of user core. Mar 4 01:17:48.884772 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 4 01:17:49.282396 sshd[1708]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:49.287909 systemd[1]: sshd@4-10.243.77.214:22-20.161.92.111:33184.service: Deactivated successfully. Mar 4 01:17:49.290130 systemd[1]: session-7.scope: Deactivated successfully. Mar 4 01:17:49.291233 systemd-logind[1492]: Session 7 logged out. Waiting for processes to exit. Mar 4 01:17:49.292856 systemd-logind[1492]: Removed session 7. Mar 4 01:17:49.401049 systemd[1]: Started sshd@5-10.243.77.214:22-20.161.92.111:34316.service - OpenSSH per-connection server daemon (20.161.92.111:34316). 
Mar 4 01:17:49.986693 sshd[1715]: Accepted publickey for core from 20.161.92.111 port 34316 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:49.990172 sshd[1715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:49.998487 systemd-logind[1492]: New session 8 of user core. Mar 4 01:17:50.004831 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 4 01:17:50.408447 sshd[1715]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:50.413595 systemd[1]: sshd@5-10.243.77.214:22-20.161.92.111:34316.service: Deactivated successfully. Mar 4 01:17:50.416396 systemd[1]: session-8.scope: Deactivated successfully. Mar 4 01:17:50.418349 systemd-logind[1492]: Session 8 logged out. Waiting for processes to exit. Mar 4 01:17:50.420399 systemd-logind[1492]: Removed session 8. Mar 4 01:17:50.532940 systemd[1]: Started sshd@6-10.243.77.214:22-20.161.92.111:34330.service - OpenSSH per-connection server daemon (20.161.92.111:34330). Mar 4 01:17:51.113870 sshd[1722]: Accepted publickey for core from 20.161.92.111 port 34330 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:51.116125 sshd[1722]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:51.124230 systemd-logind[1492]: New session 9 of user core. Mar 4 01:17:51.133838 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 4 01:17:51.456071 sudo[1725]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 4 01:17:51.456618 sudo[1725]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 01:17:51.477464 sudo[1725]: pam_unix(sudo:session): session closed for user root Mar 4 01:17:51.571641 sshd[1722]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:51.578243 systemd[1]: sshd@6-10.243.77.214:22-20.161.92.111:34330.service: Deactivated successfully. 
Mar 4 01:17:51.580906 systemd[1]: session-9.scope: Deactivated successfully. Mar 4 01:17:51.582036 systemd-logind[1492]: Session 9 logged out. Waiting for processes to exit. Mar 4 01:17:51.583453 systemd-logind[1492]: Removed session 9. Mar 4 01:17:51.673946 systemd[1]: Started sshd@7-10.243.77.214:22-20.161.92.111:34342.service - OpenSSH per-connection server daemon (20.161.92.111:34342). Mar 4 01:17:52.238558 sshd[1730]: Accepted publickey for core from 20.161.92.111 port 34342 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:52.239613 sshd[1730]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:52.248255 systemd-logind[1492]: New session 10 of user core. Mar 4 01:17:52.254779 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 4 01:17:52.634121 sudo[1734]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 4 01:17:52.635582 sudo[1734]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 01:17:52.641734 sudo[1734]: pam_unix(sudo:session): session closed for user root Mar 4 01:17:52.650744 sudo[1733]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 4 01:17:52.651258 sudo[1733]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 01:17:52.674965 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Mar 4 01:17:52.677919 auditctl[1737]: No rules Mar 4 01:17:52.679032 systemd[1]: audit-rules.service: Deactivated successfully. Mar 4 01:17:52.679323 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Mar 4 01:17:52.683832 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 4 01:17:52.732008 augenrules[1755]: No rules Mar 4 01:17:52.733403 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. 
Mar 4 01:17:52.735748 sudo[1733]: pam_unix(sudo:session): session closed for user root Mar 4 01:17:52.827977 sshd[1730]: pam_unix(sshd:session): session closed for user core Mar 4 01:17:52.832518 systemd-logind[1492]: Session 10 logged out. Waiting for processes to exit. Mar 4 01:17:52.833722 systemd[1]: sshd@7-10.243.77.214:22-20.161.92.111:34342.service: Deactivated successfully. Mar 4 01:17:52.836252 systemd[1]: session-10.scope: Deactivated successfully. Mar 4 01:17:52.838323 systemd-logind[1492]: Removed session 10. Mar 4 01:17:52.939696 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 4 01:17:52.945800 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 01:17:52.948865 systemd[1]: Started sshd@8-10.243.77.214:22-20.161.92.111:34356.service - OpenSSH per-connection server daemon (20.161.92.111:34356). Mar 4 01:17:53.286346 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:17:53.296281 (kubelet)[1773]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 01:17:53.373213 kubelet[1773]: E0304 01:17:53.373143 1773 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 01:17:53.376770 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 01:17:53.377087 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 4 01:17:53.569784 sshd[1764]: Accepted publickey for core from 20.161.92.111 port 34356 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:17:53.572213 sshd[1764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:17:53.578954 systemd-logind[1492]: New session 11 of user core. Mar 4 01:17:53.591025 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 4 01:17:53.903320 sudo[1781]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 4 01:17:53.903832 sudo[1781]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 4 01:17:54.485348 (dockerd)[1796]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 4 01:17:54.485508 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 4 01:17:55.107593 dockerd[1796]: time="2026-03-04T01:17:55.106632257Z" level=info msg="Starting up" Mar 4 01:17:55.253635 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport635874902-merged.mount: Deactivated successfully. Mar 4 01:17:55.289209 dockerd[1796]: time="2026-03-04T01:17:55.288769812Z" level=info msg="Loading containers: start." Mar 4 01:17:55.447613 kernel: Initializing XFRM netlink socket Mar 4 01:17:55.485619 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection. Mar 4 01:17:55.552772 systemd-networkd[1434]: docker0: Link UP Mar 4 01:17:55.570635 dockerd[1796]: time="2026-03-04T01:17:55.570533543Z" level=info msg="Loading containers: done." 
Mar 4 01:17:55.596277 dockerd[1796]: time="2026-03-04T01:17:55.595492372Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 4 01:17:55.596277 dockerd[1796]: time="2026-03-04T01:17:55.595692156Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Mar 4 01:17:55.596277 dockerd[1796]: time="2026-03-04T01:17:55.595872380Z" level=info msg="Daemon has completed initialization" Mar 4 01:17:55.653746 dockerd[1796]: time="2026-03-04T01:17:55.653650737Z" level=info msg="API listen on /run/docker.sock" Mar 4 01:17:55.653916 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 4 01:17:56.248716 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck856072875-merged.mount: Deactivated successfully. Mar 4 01:17:56.404614 containerd[1516]: time="2026-03-04T01:17:56.403579451Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\"" Mar 4 01:17:57.091478 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2176226913.mount: Deactivated successfully. Mar 4 01:17:59.482307 systemd-timesyncd[1424]: Contacted time server [2a00:fd80:aaaa:ffff::eeee:ff3]:123 (2.flatcar.pool.ntp.org). Mar 4 01:17:59.482340 systemd-resolved[1397]: Clock change detected. Flushing caches. Mar 4 01:17:59.482442 systemd-timesyncd[1424]: Initial clock synchronization to Wed 2026-03-04 01:17:59.481583 UTC. Mar 4 01:18:01.788857 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 4 01:18:04.562897 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 4 01:18:04.576654 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 4 01:18:04.930157 containerd[1516]: time="2026-03-04T01:18:04.929904822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:04.934280 containerd[1516]: time="2026-03-04T01:18:04.934211682Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=27074505" Mar 4 01:18:04.935698 containerd[1516]: time="2026-03-04T01:18:04.935639370Z" level=info msg="ImageCreate event name:\"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:04.954065 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:18:04.958084 containerd[1516]: time="2026-03-04T01:18:04.957306376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:04.961099 containerd[1516]: time="2026-03-04T01:18:04.959634903Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"27071096\" in 7.47136546s" Mar 4 01:18:04.961099 containerd[1516]: time="2026-03-04T01:18:04.959806662Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\"" Mar 4 01:18:04.964585 containerd[1516]: time="2026-03-04T01:18:04.964553572Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\"" Mar 4 01:18:04.970543 (kubelet)[2006]: kubelet.service: Referenced but unset environment variable 
evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 01:18:05.084475 kubelet[2006]: E0304 01:18:05.084329 2006 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 01:18:05.087840 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 01:18:05.088306 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 4 01:18:07.052230 containerd[1516]: time="2026-03-04T01:18:07.052125108Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:07.054069 containerd[1516]: time="2026-03-04T01:18:07.053893034Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=21165831" Mar 4 01:18:07.055041 containerd[1516]: time="2026-03-04T01:18:07.054969879Z" level=info msg="ImageCreate event name:\"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:07.059429 containerd[1516]: time="2026-03-04T01:18:07.059373575Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:07.062079 containerd[1516]: time="2026-03-04T01:18:07.061308423Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest 
\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"22822771\" in 2.096701142s" Mar 4 01:18:07.062079 containerd[1516]: time="2026-03-04T01:18:07.061378952Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\"" Mar 4 01:18:07.063508 containerd[1516]: time="2026-03-04T01:18:07.063304643Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\"" Mar 4 01:18:08.566648 containerd[1516]: time="2026-03-04T01:18:08.566401717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:08.568823 containerd[1516]: time="2026-03-04T01:18:08.568721431Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=15729832" Mar 4 01:18:08.570642 containerd[1516]: time="2026-03-04T01:18:08.570578153Z" level=info msg="ImageCreate event name:\"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:08.575671 containerd[1516]: time="2026-03-04T01:18:08.575590964Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:08.578712 containerd[1516]: time="2026-03-04T01:18:08.578642343Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"17386790\" in 1.515295155s" Mar 4 01:18:08.578712 containerd[1516]: 
time="2026-03-04T01:18:08.578707903Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\"" Mar 4 01:18:08.579940 containerd[1516]: time="2026-03-04T01:18:08.579880820Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\"" Mar 4 01:18:10.226938 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3224382523.mount: Deactivated successfully. Mar 4 01:18:10.838867 containerd[1516]: time="2026-03-04T01:18:10.838155798Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:10.839709 containerd[1516]: time="2026-03-04T01:18:10.839670780Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=25861778" Mar 4 01:18:10.840547 containerd[1516]: time="2026-03-04T01:18:10.840499329Z" level=info msg="ImageCreate event name:\"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:10.843730 containerd[1516]: time="2026-03-04T01:18:10.843523980Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:10.845493 containerd[1516]: time="2026-03-04T01:18:10.844646818Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"25860789\" in 2.264410257s" Mar 4 01:18:10.845493 containerd[1516]: time="2026-03-04T01:18:10.844749621Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" 
returns image reference \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\"" Mar 4 01:18:10.846918 containerd[1516]: time="2026-03-04T01:18:10.846834761Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Mar 4 01:18:11.498630 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1637561184.mount: Deactivated successfully. Mar 4 01:18:13.404088 containerd[1516]: time="2026-03-04T01:18:13.403987618Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:13.407885 containerd[1516]: time="2026-03-04T01:18:13.407408534Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388015" Mar 4 01:18:13.408875 containerd[1516]: time="2026-03-04T01:18:13.408833803Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:13.415788 containerd[1516]: time="2026-03-04T01:18:13.415723030Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:13.417597 containerd[1516]: time="2026-03-04T01:18:13.417551561Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 2.570443585s" Mar 4 01:18:13.417762 containerd[1516]: time="2026-03-04T01:18:13.417733812Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference 
\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Mar 4 01:18:13.420290 containerd[1516]: time="2026-03-04T01:18:13.420218952Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 4 01:18:13.939793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4133313953.mount: Deactivated successfully. Mar 4 01:18:13.944447 containerd[1516]: time="2026-03-04T01:18:13.944371592Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:13.947499 containerd[1516]: time="2026-03-04T01:18:13.947394939Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321226" Mar 4 01:18:13.951195 containerd[1516]: time="2026-03-04T01:18:13.951127230Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:13.953080 containerd[1516]: time="2026-03-04T01:18:13.952947810Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:13.955364 containerd[1516]: time="2026-03-04T01:18:13.954148315Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 532.060953ms" Mar 4 01:18:13.955364 containerd[1516]: time="2026-03-04T01:18:13.954193432Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Mar 4 01:18:13.955622 containerd[1516]: time="2026-03-04T01:18:13.955596738Z" 
level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\"" Mar 4 01:18:14.492531 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount750831097.mount: Deactivated successfully. Mar 4 01:18:15.127581 update_engine[1494]: I20260304 01:18:15.127404 1494 update_attempter.cc:509] Updating boot flags... Mar 4 01:18:15.148291 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 4 01:18:15.162626 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 01:18:15.255086 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2140) Mar 4 01:18:15.408250 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2139) Mar 4 01:18:15.520197 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2139) Mar 4 01:18:15.701290 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:18:15.702407 (kubelet)[2157]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 4 01:18:15.798094 kubelet[2157]: E0304 01:18:15.796691 2157 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 4 01:18:15.799390 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 4 01:18:15.799655 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 4 01:18:16.409669 containerd[1516]: time="2026-03-04T01:18:16.409606130Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:16.411304 containerd[1516]: time="2026-03-04T01:18:16.411261683Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22860682" Mar 4 01:18:16.412087 containerd[1516]: time="2026-03-04T01:18:16.411742313Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:16.416098 containerd[1516]: time="2026-03-04T01:18:16.416064900Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:18:16.418293 containerd[1516]: time="2026-03-04T01:18:16.417745325Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 2.462114797s" Mar 4 01:18:16.418293 containerd[1516]: time="2026-03-04T01:18:16.417793428Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\"" Mar 4 01:18:20.845337 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:18:20.857576 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 01:18:20.908655 systemd[1]: Reloading requested from client PID 2205 ('systemctl') (unit session-11.scope)... Mar 4 01:18:20.908798 systemd[1]: Reloading... 
Mar 4 01:18:21.098096 zram_generator::config[2241]: No configuration found. Mar 4 01:18:21.320034 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 4 01:18:21.430202 systemd[1]: Reloading finished in 520 ms. Mar 4 01:18:21.517507 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 01:18:21.522983 systemd[1]: kubelet.service: Deactivated successfully. Mar 4 01:18:21.523365 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:18:21.529429 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 4 01:18:21.712066 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 4 01:18:21.728749 (kubelet)[2313]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 4 01:18:21.875084 kubelet[2313]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 4 01:18:21.875084 kubelet[2313]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 4 01:18:21.875084 kubelet[2313]: I0304 01:18:21.874339 2313 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 4 01:18:22.746078 kubelet[2313]: I0304 01:18:22.744682 2313 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 4 01:18:22.746078 kubelet[2313]: I0304 01:18:22.744736 2313 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 4 01:18:22.746078 kubelet[2313]: I0304 01:18:22.744832 2313 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 4 01:18:22.746078 kubelet[2313]: I0304 01:18:22.744860 2313 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 4 01:18:22.746078 kubelet[2313]: I0304 01:18:22.745369 2313 server.go:956] "Client rotation is on, will bootstrap in background" Mar 4 01:18:22.766708 kubelet[2313]: I0304 01:18:22.766668 2313 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 4 01:18:22.770426 kubelet[2313]: E0304 01:18:22.770385 2313 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.243.77.214:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 4 01:18:22.776153 kubelet[2313]: E0304 01:18:22.776111 2313 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 4 01:18:22.776272 kubelet[2313]: I0304 01:18:22.776181 2313 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." 
Mar 4 01:18:22.784972 kubelet[2313]: I0304 01:18:22.784917 2313 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 4 01:18:22.798108 kubelet[2313]: I0304 01:18:22.797988 2313 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 4 01:18:22.798406 kubelet[2313]: I0304 01:18:22.798094 2313 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-8wmcq.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 4 01:18:22.798763 kubelet[2313]: I0304 01:18:22.798465 2313 topology_manager.go:138] "Creating topology manager with none policy"
Mar 4 01:18:22.798763 kubelet[2313]: I0304 01:18:22.798486 2313 container_manager_linux.go:306] "Creating device plugin manager"
Mar 4 01:18:22.798908 kubelet[2313]: I0304 01:18:22.798778 2313 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 4 01:18:22.800885 kubelet[2313]: I0304 01:18:22.800832 2313 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 01:18:22.801419 kubelet[2313]: I0304 01:18:22.801394 2313 kubelet.go:475] "Attempting to sync node with API server"
Mar 4 01:18:22.801496 kubelet[2313]: I0304 01:18:22.801458 2313 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 4 01:18:22.801614 kubelet[2313]: I0304 01:18:22.801596 2313 kubelet.go:387] "Adding apiserver pod source"
Mar 4 01:18:22.802392 kubelet[2313]: I0304 01:18:22.801684 2313 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 4 01:18:22.806304 kubelet[2313]: E0304 01:18:22.805592 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.243.77.214:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 4 01:18:22.806304 kubelet[2313]: E0304 01:18:22.806260 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.243.77.214:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-8wmcq.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 4 01:18:22.806679 kubelet[2313]: I0304 01:18:22.806654 2313 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 4 01:18:22.807614 kubelet[2313]: I0304 01:18:22.807589 2313 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 4 01:18:22.807786 kubelet[2313]: I0304 01:18:22.807765 2313 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 4 01:18:22.808141 kubelet[2313]: W0304 01:18:22.808120 2313 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 4 01:18:22.814321 kubelet[2313]: I0304 01:18:22.814299 2313 server.go:1262] "Started kubelet"
Mar 4 01:18:22.818117 kubelet[2313]: I0304 01:18:22.817944 2313 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 4 01:18:22.823077 kubelet[2313]: E0304 01:18:22.819942 2313 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.243.77.214:6443/api/v1/namespaces/default/events\": dial tcp 10.243.77.214:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-8wmcq.gb1.brightbox.com.18997e9414ae9941 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-8wmcq.gb1.brightbox.com,UID:srv-8wmcq.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-8wmcq.gb1.brightbox.com,},FirstTimestamp:2026-03-04 01:18:22.814247233 +0000 UTC m=+1.019338243,LastTimestamp:2026-03-04 01:18:22.814247233 +0000 UTC m=+1.019338243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-8wmcq.gb1.brightbox.com,}"
Mar 4 01:18:22.823077 kubelet[2313]: I0304 01:18:22.822204 2313 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 4 01:18:22.826391 kubelet[2313]: I0304 01:18:22.826366 2313 server.go:310] "Adding debug handlers to kubelet server"
Mar 4 01:18:22.826772 kubelet[2313]: I0304 01:18:22.826744 2313 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 4 01:18:22.828170 kubelet[2313]: I0304 01:18:22.828129 2313 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 4 01:18:22.828535 kubelet[2313]: I0304 01:18:22.828507 2313 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 4 01:18:22.829508 kubelet[2313]: I0304 01:18:22.829485 2313 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 4 01:18:22.830937 kubelet[2313]: I0304 01:18:22.830829 2313 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 4 01:18:22.835694 kubelet[2313]: E0304 01:18:22.835660 2313 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"srv-8wmcq.gb1.brightbox.com\" not found"
Mar 4 01:18:22.838739 kubelet[2313]: I0304 01:18:22.838695 2313 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 4 01:18:22.838850 kubelet[2313]: I0304 01:18:22.838822 2313 reconciler.go:29] "Reconciler: start to sync state"
Mar 4 01:18:22.840078 kubelet[2313]: E0304 01:18:22.839949 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.77.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-8wmcq.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.77.214:6443: connect: connection refused" interval="200ms"
Mar 4 01:18:22.842073 kubelet[2313]: E0304 01:18:22.841121 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.243.77.214:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 4 01:18:22.842073 kubelet[2313]: I0304 01:18:22.841746 2313 factory.go:223] Registration of the systemd container factory successfully
Mar 4 01:18:22.842073 kubelet[2313]: I0304 01:18:22.841867 2313 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 4 01:18:22.844179 kubelet[2313]: E0304 01:18:22.844152 2313 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 4 01:18:22.844596 kubelet[2313]: I0304 01:18:22.844573 2313 factory.go:223] Registration of the containerd container factory successfully
Mar 4 01:18:22.866588 kubelet[2313]: I0304 01:18:22.866528 2313 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 4 01:18:22.880800 kubelet[2313]: I0304 01:18:22.880761 2313 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 4 01:18:22.883078 kubelet[2313]: I0304 01:18:22.883010 2313 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 4 01:18:22.883220 kubelet[2313]: I0304 01:18:22.883134 2313 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 4 01:18:22.883290 kubelet[2313]: E0304 01:18:22.883250 2313 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 4 01:18:22.885073 kubelet[2313]: E0304 01:18:22.885022 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.243.77.214:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 4 01:18:22.897114 kubelet[2313]: I0304 01:18:22.896695 2313 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 4 01:18:22.897114 kubelet[2313]: I0304 01:18:22.896723 2313 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 4 01:18:22.897114 kubelet[2313]: I0304 01:18:22.896763 2313 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 01:18:22.899645 kubelet[2313]: I0304 01:18:22.899612 2313 policy_none.go:49] "None policy: Start"
Mar 4 01:18:22.900426 kubelet[2313]: I0304 01:18:22.899930 2313 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 4 01:18:22.900426 kubelet[2313]: I0304 01:18:22.899980 2313 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 4 01:18:22.901648 kubelet[2313]: I0304 01:18:22.901627 2313 policy_none.go:47] "Start"
Mar 4 01:18:22.915750 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 4 01:18:22.931612 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 4 01:18:22.936532 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 4 01:18:22.936828 kubelet[2313]: E0304 01:18:22.936645 2313 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"srv-8wmcq.gb1.brightbox.com\" not found"
Mar 4 01:18:22.946416 kubelet[2313]: E0304 01:18:22.946388 2313 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 4 01:18:22.947742 kubelet[2313]: I0304 01:18:22.946905 2313 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 4 01:18:22.947742 kubelet[2313]: I0304 01:18:22.946948 2313 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 4 01:18:22.947742 kubelet[2313]: I0304 01:18:22.947429 2313 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 4 01:18:22.950731 kubelet[2313]: E0304 01:18:22.950605 2313 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 4 01:18:22.950731 kubelet[2313]: E0304 01:18:22.950683 2313 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-8wmcq.gb1.brightbox.com\" not found"
Mar 4 01:18:23.002631 systemd[1]: Created slice kubepods-burstable-pod84ee84b9cbe9ea994f2b8d6c0ece6142.slice - libcontainer container kubepods-burstable-pod84ee84b9cbe9ea994f2b8d6c0ece6142.slice.
Mar 4 01:18:23.015566 kubelet[2313]: E0304 01:18:23.015484 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.018352 systemd[1]: Created slice kubepods-burstable-pod09fe44c20517f10f1a7f0b75277410d1.slice - libcontainer container kubepods-burstable-pod09fe44c20517f10f1a7f0b75277410d1.slice.
Mar 4 01:18:23.026863 kubelet[2313]: E0304 01:18:23.026820 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.031249 systemd[1]: Created slice kubepods-burstable-pod19a595fcc2933c514b3c99b8ea3d852e.slice - libcontainer container kubepods-burstable-pod19a595fcc2933c514b3c99b8ea3d852e.slice.
Mar 4 01:18:23.035809 kubelet[2313]: E0304 01:18:23.035489 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.041447 kubelet[2313]: E0304 01:18:23.041404 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.77.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-8wmcq.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.77.214:6443: connect: connection refused" interval="400ms"
Mar 4 01:18:23.046017 kubelet[2313]: E0304 01:18:23.045865 2313 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.243.77.214:6443/api/v1/namespaces/default/events\": dial tcp 10.243.77.214:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-8wmcq.gb1.brightbox.com.18997e9414ae9941 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-8wmcq.gb1.brightbox.com,UID:srv-8wmcq.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-8wmcq.gb1.brightbox.com,},FirstTimestamp:2026-03-04 01:18:22.814247233 +0000 UTC m=+1.019338243,LastTimestamp:2026-03-04 01:18:22.814247233 +0000 UTC m=+1.019338243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-8wmcq.gb1.brightbox.com,}"
Mar 4 01:18:23.051096 kubelet[2313]: I0304 01:18:23.050798 2313 kubelet_node_status.go:75] "Attempting to register node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.051299 kubelet[2313]: E0304 01:18:23.051266 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.77.214:6443/api/v1/nodes\": dial tcp 10.243.77.214:6443: connect: connection refused" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140033 kubelet[2313]: I0304 01:18:23.139936 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/84ee84b9cbe9ea994f2b8d6c0ece6142-kubeconfig\") pod \"kube-scheduler-srv-8wmcq.gb1.brightbox.com\" (UID: \"84ee84b9cbe9ea994f2b8d6c0ece6142\") " pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140033 kubelet[2313]: I0304 01:18:23.140015 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/09fe44c20517f10f1a7f0b75277410d1-ca-certs\") pod \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" (UID: \"09fe44c20517f10f1a7f0b75277410d1\") " pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140340 kubelet[2313]: I0304 01:18:23.140083 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-ca-certs\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140340 kubelet[2313]: I0304 01:18:23.140169 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-flexvolume-dir\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140340 kubelet[2313]: I0304 01:18:23.140255 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-k8s-certs\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140514 kubelet[2313]: I0304 01:18:23.140360 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-kubeconfig\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140514 kubelet[2313]: I0304 01:18:23.140394 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/09fe44c20517f10f1a7f0b75277410d1-k8s-certs\") pod \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" (UID: \"09fe44c20517f10f1a7f0b75277410d1\") " pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140514 kubelet[2313]: I0304 01:18:23.140424 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/09fe44c20517f10f1a7f0b75277410d1-usr-share-ca-certificates\") pod \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" (UID: \"09fe44c20517f10f1a7f0b75277410d1\") " pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.140514 kubelet[2313]: I0304 01:18:23.140455 2313 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.254801 kubelet[2313]: I0304 01:18:23.254633 2313 kubelet_node_status.go:75] "Attempting to register node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.255310 kubelet[2313]: E0304 01:18:23.255182 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.77.214:6443/api/v1/nodes\": dial tcp 10.243.77.214:6443: connect: connection refused" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.321303 containerd[1516]: time="2026-03-04T01:18:23.320723684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-8wmcq.gb1.brightbox.com,Uid:84ee84b9cbe9ea994f2b8d6c0ece6142,Namespace:kube-system,Attempt:0,}"
Mar 4 01:18:23.331864 containerd[1516]: time="2026-03-04T01:18:23.331805928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-8wmcq.gb1.brightbox.com,Uid:09fe44c20517f10f1a7f0b75277410d1,Namespace:kube-system,Attempt:0,}"
Mar 4 01:18:23.338104 containerd[1516]: time="2026-03-04T01:18:23.337695769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-8wmcq.gb1.brightbox.com,Uid:19a595fcc2933c514b3c99b8ea3d852e,Namespace:kube-system,Attempt:0,}"
Mar 4 01:18:23.442427 kubelet[2313]: E0304 01:18:23.442367 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.77.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-8wmcq.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.77.214:6443: connect: connection refused" interval="800ms"
Mar 4 01:18:23.658380 kubelet[2313]: I0304 01:18:23.658300 2313 kubelet_node_status.go:75] "Attempting to register node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.658827 kubelet[2313]: E0304 01:18:23.658782 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.77.214:6443/api/v1/nodes\": dial tcp 10.243.77.214:6443: connect: connection refused" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:23.982866 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1503146409.mount: Deactivated successfully.
Mar 4 01:18:23.988808 kubelet[2313]: E0304 01:18:23.988696 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.243.77.214:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 4 01:18:23.995076 containerd[1516]: time="2026-03-04T01:18:23.993838937Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 01:18:23.995659 containerd[1516]: time="2026-03-04T01:18:23.995611508Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 01:18:23.996771 containerd[1516]: time="2026-03-04T01:18:23.996701734Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 4 01:18:23.996855 containerd[1516]: time="2026-03-04T01:18:23.996771671Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Mar 4 01:18:23.998731 containerd[1516]: time="2026-03-04T01:18:23.998670071Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 4 01:18:23.998821 containerd[1516]: time="2026-03-04T01:18:23.998775994Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 01:18:24.003575 containerd[1516]: time="2026-03-04T01:18:24.003538320Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 01:18:24.006594 containerd[1516]: time="2026-03-04T01:18:24.006551311Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 674.634472ms"
Mar 4 01:18:24.008665 containerd[1516]: time="2026-03-04T01:18:24.008586855Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 4 01:18:24.011662 containerd[1516]: time="2026-03-04T01:18:24.011231225Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 690.283893ms"
Mar 4 01:18:24.013340 containerd[1516]: time="2026-03-04T01:18:24.012589724Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 674.789151ms"
Mar 4 01:18:24.064614 kubelet[2313]: E0304 01:18:24.064562 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.243.77.214:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-8wmcq.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 4 01:18:24.244130 kubelet[2313]: E0304 01:18:24.243629 2313 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.77.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-8wmcq.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.77.214:6443: connect: connection refused" interval="1.6s"
Mar 4 01:18:24.283194 kubelet[2313]: E0304 01:18:24.283102 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.243.77.214:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 4 01:18:24.341530 containerd[1516]: time="2026-03-04T01:18:24.341380490Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 4 01:18:24.342605 containerd[1516]: time="2026-03-04T01:18:24.342402722Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 4 01:18:24.342998 containerd[1516]: time="2026-03-04T01:18:24.342572631Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:24.344146 containerd[1516]: time="2026-03-04T01:18:24.342856571Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:24.345079 containerd[1516]: time="2026-03-04T01:18:24.344944217Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 4 01:18:24.345941 containerd[1516]: time="2026-03-04T01:18:24.345810431Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 4 01:18:24.348291 containerd[1516]: time="2026-03-04T01:18:24.347295611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:24.348291 containerd[1516]: time="2026-03-04T01:18:24.347437373Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:24.355940 containerd[1516]: time="2026-03-04T01:18:24.355825137Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 4 01:18:24.355940 containerd[1516]: time="2026-03-04T01:18:24.355890069Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 4 01:18:24.356362 containerd[1516]: time="2026-03-04T01:18:24.355906831Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:24.356362 containerd[1516]: time="2026-03-04T01:18:24.356011015Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:24.392376 systemd[1]: Started cri-containerd-8b4ad46918d56b5fde491f35a60bdf0969604abf599d493b07311c9e4ff79281.scope - libcontainer container 8b4ad46918d56b5fde491f35a60bdf0969604abf599d493b07311c9e4ff79281.
Mar 4 01:18:24.405230 systemd[1]: Started cri-containerd-f96652309224ac97e76d00ce9dfa69d74987f05eebb75e57d8834e7222534efd.scope - libcontainer container f96652309224ac97e76d00ce9dfa69d74987f05eebb75e57d8834e7222534efd.
Mar 4 01:18:24.426358 systemd[1]: Started cri-containerd-c1d1b1e9f3aedf0942920fa6e7421dbef188feb475321d89471e6244acb5e27b.scope - libcontainer container c1d1b1e9f3aedf0942920fa6e7421dbef188feb475321d89471e6244acb5e27b.
Mar 4 01:18:24.428734 kubelet[2313]: E0304 01:18:24.428423 2313 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.243.77.214:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 4 01:18:24.469965 kubelet[2313]: I0304 01:18:24.469889 2313 kubelet_node_status.go:75] "Attempting to register node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:24.474520 kubelet[2313]: E0304 01:18:24.474369 2313 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.77.214:6443/api/v1/nodes\": dial tcp 10.243.77.214:6443: connect: connection refused" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:24.534765 containerd[1516]: time="2026-03-04T01:18:24.534537881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-8wmcq.gb1.brightbox.com,Uid:19a595fcc2933c514b3c99b8ea3d852e,Namespace:kube-system,Attempt:0,} returns sandbox id \"8b4ad46918d56b5fde491f35a60bdf0969604abf599d493b07311c9e4ff79281\""
Mar 4 01:18:24.551753 containerd[1516]: time="2026-03-04T01:18:24.551408199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-8wmcq.gb1.brightbox.com,Uid:09fe44c20517f10f1a7f0b75277410d1,Namespace:kube-system,Attempt:0,} returns sandbox id \"f96652309224ac97e76d00ce9dfa69d74987f05eebb75e57d8834e7222534efd\""
Mar 4 01:18:24.554818 containerd[1516]: time="2026-03-04T01:18:24.554031503Z" level=info msg="CreateContainer within sandbox \"8b4ad46918d56b5fde491f35a60bdf0969604abf599d493b07311c9e4ff79281\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 4 01:18:24.561149 containerd[1516]: time="2026-03-04T01:18:24.561097817Z" level=info msg="CreateContainer within sandbox \"f96652309224ac97e76d00ce9dfa69d74987f05eebb75e57d8834e7222534efd\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 4 01:18:24.567861 containerd[1516]: time="2026-03-04T01:18:24.567812578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-8wmcq.gb1.brightbox.com,Uid:84ee84b9cbe9ea994f2b8d6c0ece6142,Namespace:kube-system,Attempt:0,} returns sandbox id \"c1d1b1e9f3aedf0942920fa6e7421dbef188feb475321d89471e6244acb5e27b\""
Mar 4 01:18:24.574823 containerd[1516]: time="2026-03-04T01:18:24.574756405Z" level=info msg="CreateContainer within sandbox \"c1d1b1e9f3aedf0942920fa6e7421dbef188feb475321d89471e6244acb5e27b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 4 01:18:24.588693 containerd[1516]: time="2026-03-04T01:18:24.588637694Z" level=info msg="CreateContainer within sandbox \"8b4ad46918d56b5fde491f35a60bdf0969604abf599d493b07311c9e4ff79281\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b59cfdc6b55413a97f874643e606cdd991e344371354b2525fdfec9780d71be9\""
Mar 4 01:18:24.591432 containerd[1516]: time="2026-03-04T01:18:24.591384906Z" level=info msg="StartContainer for \"b59cfdc6b55413a97f874643e606cdd991e344371354b2525fdfec9780d71be9\""
Mar 4 01:18:24.599126 containerd[1516]: time="2026-03-04T01:18:24.598756253Z" level=info msg="CreateContainer within sandbox \"f96652309224ac97e76d00ce9dfa69d74987f05eebb75e57d8834e7222534efd\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e0b6f2d5363096d59de6197e743b270824eb96a3357c67009fd7884fd4500623\""
Mar 4 01:18:24.603070 containerd[1516]: time="2026-03-04T01:18:24.602926842Z" level=info msg="CreateContainer within sandbox \"c1d1b1e9f3aedf0942920fa6e7421dbef188feb475321d89471e6244acb5e27b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c1f5e32312b8df665a0d1794b4851a33c1dad0ac686004fd5de425df932c1d6b\""
Mar 4 01:18:24.603189 containerd[1516]: time="2026-03-04T01:18:24.603099944Z" level=info msg="StartContainer for \"e0b6f2d5363096d59de6197e743b270824eb96a3357c67009fd7884fd4500623\""
Mar 4 01:18:24.605113 containerd[1516]: time="2026-03-04T01:18:24.603936345Z" level=info msg="StartContainer for \"c1f5e32312b8df665a0d1794b4851a33c1dad0ac686004fd5de425df932c1d6b\""
Mar 4 01:18:24.687564 systemd[1]: Started cri-containerd-b59cfdc6b55413a97f874643e606cdd991e344371354b2525fdfec9780d71be9.scope - libcontainer container b59cfdc6b55413a97f874643e606cdd991e344371354b2525fdfec9780d71be9.
Mar 4 01:18:24.690888 systemd[1]: Started cri-containerd-c1f5e32312b8df665a0d1794b4851a33c1dad0ac686004fd5de425df932c1d6b.scope - libcontainer container c1f5e32312b8df665a0d1794b4851a33c1dad0ac686004fd5de425df932c1d6b.
Mar 4 01:18:24.717609 systemd[1]: Started cri-containerd-e0b6f2d5363096d59de6197e743b270824eb96a3357c67009fd7884fd4500623.scope - libcontainer container e0b6f2d5363096d59de6197e743b270824eb96a3357c67009fd7884fd4500623.
Mar 4 01:18:24.786783 containerd[1516]: time="2026-03-04T01:18:24.786511695Z" level=info msg="StartContainer for \"c1f5e32312b8df665a0d1794b4851a33c1dad0ac686004fd5de425df932c1d6b\" returns successfully"
Mar 4 01:18:24.844113 containerd[1516]: time="2026-03-04T01:18:24.843937570Z" level=info msg="StartContainer for \"e0b6f2d5363096d59de6197e743b270824eb96a3357c67009fd7884fd4500623\" returns successfully"
Mar 4 01:18:24.844113 containerd[1516]: time="2026-03-04T01:18:24.843937592Z" level=info msg="StartContainer for \"b59cfdc6b55413a97f874643e606cdd991e344371354b2525fdfec9780d71be9\" returns successfully"
Mar 4 01:18:24.904729 kubelet[2313]: E0304 01:18:24.904441 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:24.906205 kubelet[2313]: E0304 01:18:24.906157 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:24.908546 kubelet[2313]: E0304 01:18:24.908491 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:24.933354 kubelet[2313]: E0304 01:18:24.933271 2313 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.243.77.214:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.243.77.214:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 4 01:18:25.911889 kubelet[2313]: E0304 01:18:25.911848 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:25.912426 kubelet[2313]: E0304 01:18:25.912377 2313 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:26.079862 kubelet[2313]: I0304 01:18:26.079823 2313 kubelet_node_status.go:75] "Attempting to register node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:27.756969 kubelet[2313]: E0304 01:18:27.756790 2313 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-8wmcq.gb1.brightbox.com\" not found" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:27.807339 kubelet[2313]: I0304 01:18:27.807292 2313 apiserver.go:52] "Watching apiserver"
Mar 4 01:18:27.839347 kubelet[2313]: I0304 01:18:27.839289 2313 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 4 01:18:27.993250 kubelet[2313]: I0304 01:18:27.993169 2313 kubelet_node_status.go:78] "Successfully registered node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:27.993250 kubelet[2313]: E0304 01:18:27.993260 2313 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"srv-8wmcq.gb1.brightbox.com\": node \"srv-8wmcq.gb1.brightbox.com\" not found"
Mar 4 01:18:28.039799 kubelet[2313]: I0304 01:18:28.039523 2313 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:28.059488 kubelet[2313]: E0304 01:18:28.057824 2313 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-8wmcq.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:28.059488 kubelet[2313]: I0304 01:18:28.057919 2313 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:28.064663 kubelet[2313]: E0304 01:18:28.064362 2313 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:28.064663 kubelet[2313]: I0304 01:18:28.064403 2313 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:28.067422 kubelet[2313]: E0304 01:18:28.067377 2313 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:29.643093 kubelet[2313]: I0304 01:18:29.642653 2313 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:29.657508 kubelet[2313]: I0304 01:18:29.655698 2313 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 4 01:18:30.811945 systemd[1]: Reloading requested from client PID 2598 ('systemctl') (unit session-11.scope)...
Mar 4 01:18:30.812084 systemd[1]: Reloading...
Mar 4 01:18:30.964100 zram_generator::config[2638]: No configuration found.
Mar 4 01:18:31.154236 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 4 01:18:31.280785 kubelet[2313]: I0304 01:18:31.280724 2313 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:31.288678 systemd[1]: Reloading finished in 475 ms.
Mar 4 01:18:31.291747 kubelet[2313]: I0304 01:18:31.291686 2313 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 4 01:18:31.359641 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 01:18:31.375127 systemd[1]: kubelet.service: Deactivated successfully.
Mar 4 01:18:31.375582 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 01:18:31.375726 systemd[1]: kubelet.service: Consumed 1.624s CPU time, 123.0M memory peak, 0B memory swap peak.
Mar 4 01:18:31.386568 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 4 01:18:31.736137 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 4 01:18:31.747572 (kubelet)[2701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 4 01:18:31.855082 kubelet[2701]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 4 01:18:31.855082 kubelet[2701]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 4 01:18:31.855082 kubelet[2701]: I0304 01:18:31.854374 2701 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 4 01:18:31.872820 kubelet[2701]: I0304 01:18:31.872751 2701 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 4 01:18:31.872820 kubelet[2701]: I0304 01:18:31.872787 2701 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 4 01:18:31.873042 kubelet[2701]: I0304 01:18:31.872843 2701 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 4 01:18:31.873042 kubelet[2701]: I0304 01:18:31.872861 2701 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 4 01:18:31.875889 kubelet[2701]: I0304 01:18:31.873581 2701 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 4 01:18:31.879281 kubelet[2701]: I0304 01:18:31.878581 2701 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 4 01:18:31.890305 kubelet[2701]: I0304 01:18:31.889283 2701 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 4 01:18:31.895110 kubelet[2701]: E0304 01:18:31.895010 2701 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 4 01:18:31.895269 kubelet[2701]: I0304 01:18:31.895118 2701 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 4 01:18:31.901710 kubelet[2701]: I0304 01:18:31.901393 2701 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 4 01:18:31.902287 kubelet[2701]: I0304 01:18:31.902160 2701 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 4 01:18:31.903321 kubelet[2701]: I0304 01:18:31.902256 2701 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-8wmcq.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 4 01:18:31.903321 kubelet[2701]: I0304 01:18:31.902577 2701 topology_manager.go:138] "Creating topology manager with none policy"
Mar 4 01:18:31.903321 kubelet[2701]: I0304 01:18:31.902594 2701 container_manager_linux.go:306] "Creating device plugin manager"
Mar 4 01:18:31.903321 kubelet[2701]: I0304 01:18:31.902658 2701 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 4 01:18:31.903321 kubelet[2701]: I0304 01:18:31.903097 2701 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 01:18:31.903764 kubelet[2701]: I0304 01:18:31.903446 2701 kubelet.go:475] "Attempting to sync node with API server"
Mar 4 01:18:31.904347 kubelet[2701]: I0304 01:18:31.904311 2701 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 4 01:18:31.904421 kubelet[2701]: I0304 01:18:31.904393 2701 kubelet.go:387] "Adding apiserver pod source"
Mar 4 01:18:31.904481 kubelet[2701]: I0304 01:18:31.904422 2701 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 4 01:18:31.914106 kubelet[2701]: I0304 01:18:31.911397 2701 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 4 01:18:31.919090 kubelet[2701]: I0304 01:18:31.916344 2701 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 4 01:18:31.919090 kubelet[2701]: I0304 01:18:31.916398 2701 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 4 01:18:31.950225 kubelet[2701]: I0304 01:18:31.949773 2701 server.go:1262] "Started kubelet"
Mar 4 01:18:31.958318 kubelet[2701]: I0304 01:18:31.957695 2701 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 4 01:18:31.966908 kubelet[2701]: I0304 01:18:31.966686 2701 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 4 01:18:31.969189 kubelet[2701]: I0304 01:18:31.968153 2701 server.go:310] "Adding debug handlers to kubelet server"
Mar 4 01:18:31.982860 kubelet[2701]: I0304 01:18:31.981899 2701 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 4 01:18:31.986335 kubelet[2701]: I0304 01:18:31.968464 2701 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 4 01:18:31.986335 kubelet[2701]: I0304 01:18:31.986272 2701 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 4 01:18:31.986735 kubelet[2701]: I0304 01:18:31.986628 2701 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 4 01:18:31.989071 kubelet[2701]: I0304 01:18:31.988779 2701 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 4 01:18:31.997662 kubelet[2701]: I0304 01:18:31.997457 2701 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 4 01:18:32.007072 kubelet[2701]: I0304 01:18:32.005392 2701 reconciler.go:29] "Reconciler: start to sync state"
Mar 4 01:18:32.007950 kubelet[2701]: I0304 01:18:32.007713 2701 factory.go:223] Registration of the systemd container factory successfully
Mar 4 01:18:32.009070 kubelet[2701]: I0304 01:18:32.008202 2701 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 4 01:18:32.021421 kubelet[2701]: E0304 01:18:32.021293 2701 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 4 01:18:32.026952 kubelet[2701]: I0304 01:18:32.025036 2701 factory.go:223] Registration of the containerd container factory successfully
Mar 4 01:18:32.033452 kubelet[2701]: I0304 01:18:32.033402 2701 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 4 01:18:32.038593 kubelet[2701]: I0304 01:18:32.038560 2701 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 4 01:18:32.038792 kubelet[2701]: I0304 01:18:32.038771 2701 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 4 01:18:32.038922 kubelet[2701]: I0304 01:18:32.038903 2701 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 4 01:18:32.039124 kubelet[2701]: E0304 01:18:32.039095 2701 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 4 01:18:32.139744 kubelet[2701]: E0304 01:18:32.139365 2701 kubelet.go:2452] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.187817 2701 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.187853 2701 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.187887 2701 state_mem.go:36] "Initialized new in-memory state store"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188164 2701 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188183 2701 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188222 2701 policy_none.go:49] "None policy: Start"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188271 2701 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188305 2701 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188491 2701 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 4 01:18:32.188610 kubelet[2701]: I0304 01:18:32.188514 2701 policy_none.go:47] "Start"
Mar 4 01:18:32.220036 kubelet[2701]: E0304 01:18:32.215559 2701 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 4 01:18:32.220036 kubelet[2701]: I0304 01:18:32.215876 2701 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 4 01:18:32.220036 kubelet[2701]: I0304 01:18:32.215905 2701 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 4 01:18:32.220036 kubelet[2701]: I0304 01:18:32.216722 2701 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 4 01:18:32.228885 kubelet[2701]: E0304 01:18:32.228843 2701 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 4 01:18:32.341239 kubelet[2701]: I0304 01:18:32.341158 2701 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.342539 kubelet[2701]: I0304 01:18:32.341794 2701 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.347333 kubelet[2701]: I0304 01:18:32.347310 2701 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.354189 kubelet[2701]: I0304 01:18:32.354161 2701 kubelet_node_status.go:75] "Attempting to register node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.362897 kubelet[2701]: I0304 01:18:32.362751 2701 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 4 01:18:32.366860 kubelet[2701]: I0304 01:18:32.366535 2701 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 4 01:18:32.366860 kubelet[2701]: E0304 01:18:32.366595 2701 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-8wmcq.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.368474 kubelet[2701]: I0304 01:18:32.368125 2701 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 4 01:18:32.368474 kubelet[2701]: E0304 01:18:32.368293 2701 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" already exists" pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.374073 kubelet[2701]: I0304 01:18:32.373475 2701 kubelet_node_status.go:124] "Node was previously registered" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.374073 kubelet[2701]: I0304 01:18:32.373583 2701 kubelet_node_status.go:78] "Successfully registered node" node="srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.410848 kubelet[2701]: I0304 01:18:32.410790 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/09fe44c20517f10f1a7f0b75277410d1-ca-certs\") pod \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" (UID: \"09fe44c20517f10f1a7f0b75277410d1\") " pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.411333 kubelet[2701]: I0304 01:18:32.411117 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/09fe44c20517f10f1a7f0b75277410d1-usr-share-ca-certificates\") pod \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" (UID: \"09fe44c20517f10f1a7f0b75277410d1\") " pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.411333 kubelet[2701]: I0304 01:18:32.411204 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-flexvolume-dir\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.411333 kubelet[2701]: I0304 01:18:32.411237 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-k8s-certs\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.412260 kubelet[2701]: I0304 01:18:32.411307 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/84ee84b9cbe9ea994f2b8d6c0ece6142-kubeconfig\") pod \"kube-scheduler-srv-8wmcq.gb1.brightbox.com\" (UID: \"84ee84b9cbe9ea994f2b8d6c0ece6142\") " pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.412260 kubelet[2701]: I0304 01:18:32.412118 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/09fe44c20517f10f1a7f0b75277410d1-k8s-certs\") pod \"kube-apiserver-srv-8wmcq.gb1.brightbox.com\" (UID: \"09fe44c20517f10f1a7f0b75277410d1\") " pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.412260 kubelet[2701]: I0304 01:18:32.412206 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-ca-certs\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.414170 kubelet[2701]: I0304 01:18:32.413953 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-kubeconfig\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.414170 kubelet[2701]: I0304 01:18:32.414118 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19a595fcc2933c514b3c99b8ea3d852e-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-8wmcq.gb1.brightbox.com\" (UID: \"19a595fcc2933c514b3c99b8ea3d852e\") " pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com"
Mar 4 01:18:32.906472 kubelet[2701]: I0304 01:18:32.906401 2701 apiserver.go:52] "Watching apiserver"
Mar 4 01:18:32.998841 kubelet[2701]: I0304 01:18:32.998791 2701 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 4 01:18:33.028852 kubelet[2701]: I0304 01:18:33.028701 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-8wmcq.gb1.brightbox.com" podStartSLOduration=4.028587653 podStartE2EDuration="4.028587653s" podCreationTimestamp="2026-03-04 01:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 01:18:33.010214426 +0000 UTC m=+1.245474659" watchObservedRunningTime="2026-03-04 01:18:33.028587653 +0000 UTC m=+1.263847875"
Mar 4 01:18:33.041704 kubelet[2701]: I0304 01:18:33.041619 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-8wmcq.gb1.brightbox.com" podStartSLOduration=1.041247012 podStartE2EDuration="1.041247012s" podCreationTimestamp="2026-03-04 01:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 01:18:33.030275675 +0000 UTC m=+1.265535897" watchObservedRunningTime="2026-03-04 01:18:33.041247012 +0000 UTC m=+1.276507230"
Mar 4 01:18:33.058639 kubelet[2701]: I0304 01:18:33.058561 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-8wmcq.gb1.brightbox.com" podStartSLOduration=2.058535877 podStartE2EDuration="2.058535877s" podCreationTimestamp="2026-03-04 01:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 01:18:33.042504848 +0000 UTC m=+1.277765062" watchObservedRunningTime="2026-03-04 01:18:33.058535877 +0000 UTC m=+1.293796099"
Mar 4 01:18:36.490285 kubelet[2701]: I0304 01:18:36.490228 2701 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 4 01:18:36.492445 containerd[1516]: time="2026-03-04T01:18:36.492356706Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 4 01:18:36.494176 kubelet[2701]: I0304 01:18:36.493266 2701 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 4 01:18:37.509999 systemd[1]: Created slice kubepods-besteffort-pod8302587f_5694_4a22_b3b6_9945162998df.slice - libcontainer container kubepods-besteffort-pod8302587f_5694_4a22_b3b6_9945162998df.slice.
Mar 4 01:18:37.540667 kubelet[2701]: I0304 01:18:37.540598 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twdn8\" (UniqueName: \"kubernetes.io/projected/8302587f-5694-4a22-b3b6-9945162998df-kube-api-access-twdn8\") pod \"kube-proxy-tjpv4\" (UID: \"8302587f-5694-4a22-b3b6-9945162998df\") " pod="kube-system/kube-proxy-tjpv4"
Mar 4 01:18:37.540667 kubelet[2701]: I0304 01:18:37.540669 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8302587f-5694-4a22-b3b6-9945162998df-kube-proxy\") pod \"kube-proxy-tjpv4\" (UID: \"8302587f-5694-4a22-b3b6-9945162998df\") " pod="kube-system/kube-proxy-tjpv4"
Mar 4 01:18:37.542003 kubelet[2701]: I0304 01:18:37.540719 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8302587f-5694-4a22-b3b6-9945162998df-xtables-lock\") pod \"kube-proxy-tjpv4\" (UID: \"8302587f-5694-4a22-b3b6-9945162998df\") " pod="kube-system/kube-proxy-tjpv4"
Mar 4 01:18:37.542003 kubelet[2701]: I0304 01:18:37.540759 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8302587f-5694-4a22-b3b6-9945162998df-lib-modules\") pod \"kube-proxy-tjpv4\" (UID: \"8302587f-5694-4a22-b3b6-9945162998df\") " pod="kube-system/kube-proxy-tjpv4"
Mar 4 01:18:37.770854 systemd[1]: Created slice kubepods-besteffort-podfe1c0a7d_cd11_49ec_820d_4477ef3ef2f4.slice - libcontainer container kubepods-besteffort-podfe1c0a7d_cd11_49ec_820d_4477ef3ef2f4.slice.
Mar 4 01:18:37.833760 containerd[1516]: time="2026-03-04T01:18:37.833643606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-tjpv4,Uid:8302587f-5694-4a22-b3b6-9945162998df,Namespace:kube-system,Attempt:0,}"
Mar 4 01:18:37.842851 kubelet[2701]: I0304 01:18:37.842796 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8k25\" (UniqueName: \"kubernetes.io/projected/fe1c0a7d-cd11-49ec-820d-4477ef3ef2f4-kube-api-access-r8k25\") pod \"tigera-operator-5588576f44-wmflv\" (UID: \"fe1c0a7d-cd11-49ec-820d-4477ef3ef2f4\") " pod="tigera-operator/tigera-operator-5588576f44-wmflv"
Mar 4 01:18:37.842980 kubelet[2701]: I0304 01:18:37.842864 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fe1c0a7d-cd11-49ec-820d-4477ef3ef2f4-var-lib-calico\") pod \"tigera-operator-5588576f44-wmflv\" (UID: \"fe1c0a7d-cd11-49ec-820d-4477ef3ef2f4\") " pod="tigera-operator/tigera-operator-5588576f44-wmflv"
Mar 4 01:18:37.879350 containerd[1516]: time="2026-03-04T01:18:37.878681432Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 4 01:18:37.879350 containerd[1516]: time="2026-03-04T01:18:37.878864963Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 4 01:18:37.879350 containerd[1516]: time="2026-03-04T01:18:37.878884089Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:37.879350 containerd[1516]: time="2026-03-04T01:18:37.879112425Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:37.923320 systemd[1]: Started cri-containerd-ecf90699b175729696e5137e800f75d6bc371bcd7894d77ca5d766a4862b0617.scope - libcontainer container ecf90699b175729696e5137e800f75d6bc371bcd7894d77ca5d766a4862b0617.
Mar 4 01:18:37.981085 containerd[1516]: time="2026-03-04T01:18:37.980901209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-tjpv4,Uid:8302587f-5694-4a22-b3b6-9945162998df,Namespace:kube-system,Attempt:0,} returns sandbox id \"ecf90699b175729696e5137e800f75d6bc371bcd7894d77ca5d766a4862b0617\""
Mar 4 01:18:37.992623 containerd[1516]: time="2026-03-04T01:18:37.992356979Z" level=info msg="CreateContainer within sandbox \"ecf90699b175729696e5137e800f75d6bc371bcd7894d77ca5d766a4862b0617\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 4 01:18:38.014575 containerd[1516]: time="2026-03-04T01:18:38.014507476Z" level=info msg="CreateContainer within sandbox \"ecf90699b175729696e5137e800f75d6bc371bcd7894d77ca5d766a4862b0617\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ade77f482f1eb7c7bfd59e498b2abb935e785b51ce77a8d70a3be0a10f1e2eeb\""
Mar 4 01:18:38.016085 containerd[1516]: time="2026-03-04T01:18:38.015630519Z" level=info msg="StartContainer for \"ade77f482f1eb7c7bfd59e498b2abb935e785b51ce77a8d70a3be0a10f1e2eeb\""
Mar 4 01:18:38.059372 systemd[1]: Started cri-containerd-ade77f482f1eb7c7bfd59e498b2abb935e785b51ce77a8d70a3be0a10f1e2eeb.scope - libcontainer container ade77f482f1eb7c7bfd59e498b2abb935e785b51ce77a8d70a3be0a10f1e2eeb.
Mar 4 01:18:38.080579 containerd[1516]: time="2026-03-04T01:18:38.080190288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-wmflv,Uid:fe1c0a7d-cd11-49ec-820d-4477ef3ef2f4,Namespace:tigera-operator,Attempt:0,}"
Mar 4 01:18:38.123174 containerd[1516]: time="2026-03-04T01:18:38.123121049Z" level=info msg="StartContainer for \"ade77f482f1eb7c7bfd59e498b2abb935e785b51ce77a8d70a3be0a10f1e2eeb\" returns successfully"
Mar 4 01:18:38.136897 containerd[1516]: time="2026-03-04T01:18:38.135403187Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 4 01:18:38.136897 containerd[1516]: time="2026-03-04T01:18:38.135533221Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 4 01:18:38.136897 containerd[1516]: time="2026-03-04T01:18:38.135553259Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:38.136897 containerd[1516]: time="2026-03-04T01:18:38.135705001Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:38.198299 systemd[1]: Started cri-containerd-9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b.scope - libcontainer container 9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b.
Mar 4 01:18:38.220733 kubelet[2701]: I0304 01:18:38.220471 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-tjpv4" podStartSLOduration=1.219659257 podStartE2EDuration="1.219659257s" podCreationTimestamp="2026-03-04 01:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 01:18:38.199256686 +0000 UTC m=+6.434516923" watchObservedRunningTime="2026-03-04 01:18:38.219659257 +0000 UTC m=+6.454919478"
Mar 4 01:18:38.295103 containerd[1516]: time="2026-03-04T01:18:38.294318878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-wmflv,Uid:fe1c0a7d-cd11-49ec-820d-4477ef3ef2f4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b\""
Mar 4 01:18:38.300067 containerd[1516]: time="2026-03-04T01:18:38.300013211Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 4 01:18:41.266651 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1806358986.mount: Deactivated successfully.
Mar 4 01:18:43.527889 containerd[1516]: time="2026-03-04T01:18:43.526070964Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:18:43.527889 containerd[1516]: time="2026-03-04T01:18:43.527412137Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156"
Mar 4 01:18:43.527889 containerd[1516]: time="2026-03-04T01:18:43.527797902Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:18:43.534991 containerd[1516]: time="2026-03-04T01:18:43.534910570Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:18:43.537519 containerd[1516]: time="2026-03-04T01:18:43.537028725Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 5.236776158s"
Mar 4 01:18:43.537519 containerd[1516]: time="2026-03-04T01:18:43.537201566Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\""
Mar 4 01:18:43.552787 containerd[1516]: time="2026-03-04T01:18:43.552736861Z" level=info msg="CreateContainer within sandbox \"9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 4 01:18:43.571196 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2878341638.mount: Deactivated successfully.
Mar 4 01:18:43.573278 containerd[1516]: time="2026-03-04T01:18:43.573113197Z" level=info msg="CreateContainer within sandbox \"9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b\""
Mar 4 01:18:43.574258 containerd[1516]: time="2026-03-04T01:18:43.574191384Z" level=info msg="StartContainer for \"ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b\""
Mar 4 01:18:43.632815 systemd[1]: run-containerd-runc-k8s.io-ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b-runc.C1XfQz.mount: Deactivated successfully.
Mar 4 01:18:43.649472 systemd[1]: Started cri-containerd-ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b.scope - libcontainer container ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b.
Mar 4 01:18:43.699899 containerd[1516]: time="2026-03-04T01:18:43.699854105Z" level=info msg="StartContainer for \"ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b\" returns successfully"
Mar 4 01:18:44.221509 kubelet[2701]: I0304 01:18:44.221082 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-wmflv" podStartSLOduration=1.9657661100000001 podStartE2EDuration="7.208083282s" podCreationTimestamp="2026-03-04 01:18:37 +0000 UTC" firstStartedPulling="2026-03-04 01:18:38.298542988 +0000 UTC m=+6.533803194" lastFinishedPulling="2026-03-04 01:18:43.540860152 +0000 UTC m=+11.776120366" observedRunningTime="2026-03-04 01:18:44.207429198 +0000 UTC m=+12.442689450" watchObservedRunningTime="2026-03-04 01:18:44.208083282 +0000 UTC m=+12.443343519"
Mar 4 01:18:47.820512 systemd[1]: cri-containerd-ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b.scope: Deactivated successfully.
Mar 4 01:18:47.872126 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b-rootfs.mount: Deactivated successfully.
Mar 4 01:18:48.108188 containerd[1516]: time="2026-03-04T01:18:48.094528114Z" level=info msg="shim disconnected" id=ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b namespace=k8s.io
Mar 4 01:18:48.108188 containerd[1516]: time="2026-03-04T01:18:48.107783149Z" level=warning msg="cleaning up after shim disconnected" id=ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b namespace=k8s.io
Mar 4 01:18:48.109809 containerd[1516]: time="2026-03-04T01:18:48.109347757Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 4 01:18:48.207406 kubelet[2701]: I0304 01:18:48.207341 2701 scope.go:117] "RemoveContainer" containerID="ed9d1c3733b9b732e4422ca13ee19f161e8a8fb03e2ce130aa1de0722147c07b"
Mar 4 01:18:48.213341 containerd[1516]: time="2026-03-04T01:18:48.213010793Z" level=info msg="CreateContainer within sandbox \"9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 4 01:18:48.251549 containerd[1516]: time="2026-03-04T01:18:48.251488387Z" level=info msg="CreateContainer within sandbox \"9b7e96f8b8dae6be7c4c75bce561ba62dbd1d0c0150e8609c863ab736ae0330b\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"754b857c02a96746bd38012d796fead37d779b9f4be1ad3a503406c287ce0b40\""
Mar 4 01:18:48.252513 containerd[1516]: time="2026-03-04T01:18:48.252474266Z" level=info msg="StartContainer for \"754b857c02a96746bd38012d796fead37d779b9f4be1ad3a503406c287ce0b40\""
Mar 4 01:18:48.321256 systemd[1]: Started cri-containerd-754b857c02a96746bd38012d796fead37d779b9f4be1ad3a503406c287ce0b40.scope - libcontainer container 754b857c02a96746bd38012d796fead37d779b9f4be1ad3a503406c287ce0b40.
Mar 4 01:18:48.482100 containerd[1516]: time="2026-03-04T01:18:48.480472268Z" level=info msg="StartContainer for \"754b857c02a96746bd38012d796fead37d779b9f4be1ad3a503406c287ce0b40\" returns successfully"
Mar 4 01:18:51.353198 sudo[1781]: pam_unix(sudo:session): session closed for user root
Mar 4 01:18:51.453556 sshd[1764]: pam_unix(sshd:session): session closed for user core
Mar 4 01:18:51.462894 systemd[1]: sshd@8-10.243.77.214:22-20.161.92.111:34356.service: Deactivated successfully.
Mar 4 01:18:51.467690 systemd[1]: session-11.scope: Deactivated successfully.
Mar 4 01:18:51.468103 systemd[1]: session-11.scope: Consumed 7.684s CPU time, 158.3M memory peak, 0B memory swap peak.
Mar 4 01:18:51.469310 systemd-logind[1492]: Session 11 logged out. Waiting for processes to exit.
Mar 4 01:18:51.472014 systemd-logind[1492]: Removed session 11.
Mar 4 01:18:56.938173 systemd[1]: Created slice kubepods-besteffort-pod9b8cb62c_6c94_477a_8e1b_7ec0c712260c.slice - libcontainer container kubepods-besteffort-pod9b8cb62c_6c94_477a_8e1b_7ec0c712260c.slice.
Mar 4 01:18:56.993393 kubelet[2701]: I0304 01:18:56.993015 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9b8cb62c-6c94-477a-8e1b-7ec0c712260c-typha-certs\") pod \"calico-typha-6b7566967d-gwrf7\" (UID: \"9b8cb62c-6c94-477a-8e1b-7ec0c712260c\") " pod="calico-system/calico-typha-6b7566967d-gwrf7"
Mar 4 01:18:56.993393 kubelet[2701]: I0304 01:18:56.993145 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdh7h\" (UniqueName: \"kubernetes.io/projected/9b8cb62c-6c94-477a-8e1b-7ec0c712260c-kube-api-access-qdh7h\") pod \"calico-typha-6b7566967d-gwrf7\" (UID: \"9b8cb62c-6c94-477a-8e1b-7ec0c712260c\") " pod="calico-system/calico-typha-6b7566967d-gwrf7"
Mar 4 01:18:56.993393 kubelet[2701]: I0304 01:18:56.993203 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8cb62c-6c94-477a-8e1b-7ec0c712260c-tigera-ca-bundle\") pod \"calico-typha-6b7566967d-gwrf7\" (UID: \"9b8cb62c-6c94-477a-8e1b-7ec0c712260c\") " pod="calico-system/calico-typha-6b7566967d-gwrf7"
Mar 4 01:18:57.115894 systemd[1]: Created slice kubepods-besteffort-pod656dc4cf_f7f0_4acf_a15b_36d0d2711510.slice - libcontainer container kubepods-besteffort-pod656dc4cf_f7f0_4acf_a15b_36d0d2711510.slice.
Mar 4 01:18:57.197100 kubelet[2701]: I0304 01:18:57.195921 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-flexvol-driver-host\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197100 kubelet[2701]: I0304 01:18:57.196003 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-xtables-lock\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197100 kubelet[2701]: I0304 01:18:57.196068 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-bpffs\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197100 kubelet[2701]: I0304 01:18:57.196122 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-cni-bin-dir\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197100 kubelet[2701]: I0304 01:18:57.196160 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-lib-modules\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197505 kubelet[2701]: I0304 01:18:57.196195 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-cni-net-dir\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197505 kubelet[2701]: I0304 01:18:57.196235 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-var-lib-calico\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197505 kubelet[2701]: I0304 01:18:57.196277 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-nodeproc\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197505 kubelet[2701]: I0304 01:18:57.196309 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-sys-fs\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197505 kubelet[2701]: I0304 01:18:57.196352 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-var-run-calico\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197778 kubelet[2701]: I0304 01:18:57.196459 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-policysync\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197778 kubelet[2701]: I0304 01:18:57.196522 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/656dc4cf-f7f0-4acf-a15b-36d0d2711510-cni-log-dir\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197778 kubelet[2701]: I0304 01:18:57.196551 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/656dc4cf-f7f0-4acf-a15b-36d0d2711510-tigera-ca-bundle\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197778 kubelet[2701]: I0304 01:18:57.196629 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5j85\" (UniqueName: \"kubernetes.io/projected/656dc4cf-f7f0-4acf-a15b-36d0d2711510-kube-api-access-n5j85\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.197778 kubelet[2701]: I0304 01:18:57.196685 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/656dc4cf-f7f0-4acf-a15b-36d0d2711510-node-certs\") pod \"calico-node-5rp7s\" (UID: \"656dc4cf-f7f0-4acf-a15b-36d0d2711510\") " pod="calico-system/calico-node-5rp7s"
Mar 4 01:18:57.264653 containerd[1516]: time="2026-03-04T01:18:57.263169932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b7566967d-gwrf7,Uid:9b8cb62c-6c94-477a-8e1b-7ec0c712260c,Namespace:calico-system,Attempt:0,}"
Mar 4 01:18:57.279866 kubelet[2701]: E0304 01:18:57.279481 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:18:57.317082 kubelet[2701]: E0304 01:18:57.316737 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:18:57.323186 kubelet[2701]: W0304 01:18:57.322859 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:18:57.324758 kubelet[2701]: E0304 01:18:57.323904 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:18:57.336278 kubelet[2701]: E0304 01:18:57.336165 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:18:57.336278 kubelet[2701]: W0304 01:18:57.336190 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:18:57.336278 kubelet[2701]: E0304 01:18:57.336216 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 4 01:18:57.380397 containerd[1516]: time="2026-03-04T01:18:57.376021238Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 4 01:18:57.380397 containerd[1516]: time="2026-03-04T01:18:57.378943919Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 4 01:18:57.380397 containerd[1516]: time="2026-03-04T01:18:57.379121801Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:57.380646 containerd[1516]: time="2026-03-04T01:18:57.380346251Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 4 01:18:57.405890 kubelet[2701]: I0304 01:18:57.405878 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/621d6fb0-0e39-428b-8e9a-8e8c65b0d05c-kubelet-dir\") pod \"csi-node-driver-fnbg8\" (UID: \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\") " pod="calico-system/csi-node-driver-fnbg8"
Mar 4 01:18:57.409989 kubelet[2701]: I0304 01:18:57.408377 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/621d6fb0-0e39-428b-8e9a-8e8c65b0d05c-socket-dir\") pod \"csi-node-driver-fnbg8\" (UID: \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\") " pod="calico-system/csi-node-driver-fnbg8"
Mar 4 01:18:57.412165 kubelet[2701]: I0304 01:18:57.411465 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/621d6fb0-0e39-428b-8e9a-8e8c65b0d05c-varrun\") pod \"csi-node-driver-fnbg8\" (UID: \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\") " pod="calico-system/csi-node-driver-fnbg8"
Mar 4 01:18:57.414086 kubelet[2701]: E0304 01:18:57.414022 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:18:57.414566 kubelet[2701]: W0304 01:18:57.414042 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:18:57.414566 kubelet[2701]: E0304 01:18:57.414209 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 4 01:18:57.414870 kubelet[2701]: E0304 01:18:57.414849 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.414870 kubelet[2701]: W0304 01:18:57.414868 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.415168 kubelet[2701]: E0304 01:18:57.414884 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.415492 kubelet[2701]: I0304 01:18:57.415231 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzmnf\" (UniqueName: \"kubernetes.io/projected/621d6fb0-0e39-428b-8e9a-8e8c65b0d05c-kube-api-access-nzmnf\") pod \"csi-node-driver-fnbg8\" (UID: \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\") " pod="calico-system/csi-node-driver-fnbg8" Mar 4 01:18:57.415836 kubelet[2701]: E0304 01:18:57.415804 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.415836 kubelet[2701]: W0304 01:18:57.415823 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.415836 kubelet[2701]: E0304 01:18:57.415840 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.417069 kubelet[2701]: E0304 01:18:57.417017 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.417658 kubelet[2701]: W0304 01:18:57.417527 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.417737 kubelet[2701]: E0304 01:18:57.417557 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.419558 kubelet[2701]: E0304 01:18:57.419468 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.419558 kubelet[2701]: W0304 01:18:57.419494 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.419558 kubelet[2701]: E0304 01:18:57.419514 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.420686 kubelet[2701]: E0304 01:18:57.420651 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.420686 kubelet[2701]: W0304 01:18:57.420682 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.421010 kubelet[2701]: E0304 01:18:57.420698 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.421977 kubelet[2701]: E0304 01:18:57.421955 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.421977 kubelet[2701]: W0304 01:18:57.421982 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.422140 kubelet[2701]: E0304 01:18:57.421999 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.423087 kubelet[2701]: E0304 01:18:57.423026 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.423087 kubelet[2701]: W0304 01:18:57.423078 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.423227 kubelet[2701]: E0304 01:18:57.423098 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.426177 kubelet[2701]: E0304 01:18:57.426139 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.426177 kubelet[2701]: W0304 01:18:57.426166 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.426307 kubelet[2701]: E0304 01:18:57.426183 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.426829 kubelet[2701]: I0304 01:18:57.426798 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/621d6fb0-0e39-428b-8e9a-8e8c65b0d05c-registration-dir\") pod \"csi-node-driver-fnbg8\" (UID: \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\") " pod="calico-system/csi-node-driver-fnbg8" Mar 4 01:18:57.428779 kubelet[2701]: E0304 01:18:57.428747 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.429370 kubelet[2701]: W0304 01:18:57.428783 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.429370 kubelet[2701]: E0304 01:18:57.428803 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.429370 kubelet[2701]: E0304 01:18:57.429185 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.429370 kubelet[2701]: W0304 01:18:57.429205 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.429370 kubelet[2701]: E0304 01:18:57.429221 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.435545 containerd[1516]: time="2026-03-04T01:18:57.434434201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5rp7s,Uid:656dc4cf-f7f0-4acf-a15b-36d0d2711510,Namespace:calico-system,Attempt:0,}" Mar 4 01:18:57.479382 systemd[1]: Started cri-containerd-7b62584eb835da2f5a1f398dc80b0e1a5cf9f31332063e638ad2451f58519b46.scope - libcontainer container 7b62584eb835da2f5a1f398dc80b0e1a5cf9f31332063e638ad2451f58519b46. Mar 4 01:18:57.519784 containerd[1516]: time="2026-03-04T01:18:57.519612463Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:18:57.522282 containerd[1516]: time="2026-03-04T01:18:57.522180134Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:18:57.522412 containerd[1516]: time="2026-03-04T01:18:57.522261875Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:18:57.522714 containerd[1516]: time="2026-03-04T01:18:57.522654268Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:18:57.529089 kubelet[2701]: E0304 01:18:57.528693 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.529089 kubelet[2701]: W0304 01:18:57.528743 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.529089 kubelet[2701]: E0304 01:18:57.528784 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.531268 kubelet[2701]: E0304 01:18:57.529971 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.531268 kubelet[2701]: W0304 01:18:57.530009 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.531268 kubelet[2701]: E0304 01:18:57.530026 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.531839 kubelet[2701]: E0304 01:18:57.530951 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.531839 kubelet[2701]: W0304 01:18:57.531729 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.531839 kubelet[2701]: E0304 01:18:57.531749 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.533351 kubelet[2701]: E0304 01:18:57.532301 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.533351 kubelet[2701]: W0304 01:18:57.532316 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.533351 kubelet[2701]: E0304 01:18:57.532674 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.533724 kubelet[2701]: E0304 01:18:57.533434 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.533724 kubelet[2701]: W0304 01:18:57.533448 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.533724 kubelet[2701]: E0304 01:18:57.533464 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.537008 kubelet[2701]: E0304 01:18:57.534589 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.537008 kubelet[2701]: W0304 01:18:57.534634 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.537008 kubelet[2701]: E0304 01:18:57.534651 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.537008 kubelet[2701]: E0304 01:18:57.534978 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.537008 kubelet[2701]: W0304 01:18:57.534992 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.537008 kubelet[2701]: E0304 01:18:57.535007 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.537008 kubelet[2701]: E0304 01:18:57.535842 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.537008 kubelet[2701]: W0304 01:18:57.535865 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.537008 kubelet[2701]: E0304 01:18:57.536390 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.538946 kubelet[2701]: E0304 01:18:57.538924 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.539200 kubelet[2701]: W0304 01:18:57.539094 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.539200 kubelet[2701]: E0304 01:18:57.539121 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.540826 kubelet[2701]: E0304 01:18:57.540643 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.540826 kubelet[2701]: W0304 01:18:57.540661 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.540826 kubelet[2701]: E0304 01:18:57.540678 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.544334 kubelet[2701]: E0304 01:18:57.544163 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.544334 kubelet[2701]: W0304 01:18:57.544182 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.544334 kubelet[2701]: E0304 01:18:57.544199 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.546151 kubelet[2701]: E0304 01:18:57.546131 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.546405 kubelet[2701]: W0304 01:18:57.546250 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.547074 kubelet[2701]: E0304 01:18:57.546503 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.548206 kubelet[2701]: E0304 01:18:57.548187 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.549088 kubelet[2701]: W0304 01:18:57.548296 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.549088 kubelet[2701]: E0304 01:18:57.548320 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.549863 kubelet[2701]: E0304 01:18:57.549651 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.549863 kubelet[2701]: W0304 01:18:57.549667 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.549863 kubelet[2701]: E0304 01:18:57.549695 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.552288 kubelet[2701]: E0304 01:18:57.552165 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.552288 kubelet[2701]: W0304 01:18:57.552185 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.552288 kubelet[2701]: E0304 01:18:57.552200 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.552923 kubelet[2701]: E0304 01:18:57.552750 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.552923 kubelet[2701]: W0304 01:18:57.552767 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.552923 kubelet[2701]: E0304 01:18:57.552782 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.557209 kubelet[2701]: E0304 01:18:57.556953 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.557209 kubelet[2701]: W0304 01:18:57.556972 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.557209 kubelet[2701]: E0304 01:18:57.556989 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.558897 kubelet[2701]: E0304 01:18:57.558499 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.558897 kubelet[2701]: W0304 01:18:57.558518 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.558897 kubelet[2701]: E0304 01:18:57.558535 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.560628 kubelet[2701]: E0304 01:18:57.560168 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.560628 kubelet[2701]: W0304 01:18:57.560188 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.560628 kubelet[2701]: E0304 01:18:57.560204 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.562295 kubelet[2701]: E0304 01:18:57.562275 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.562562 kubelet[2701]: W0304 01:18:57.562423 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.562562 kubelet[2701]: E0304 01:18:57.562449 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.564293 kubelet[2701]: E0304 01:18:57.564273 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.564436 kubelet[2701]: W0304 01:18:57.564416 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.564643 kubelet[2701]: E0304 01:18:57.564570 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.566178 kubelet[2701]: E0304 01:18:57.566158 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.566306 kubelet[2701]: W0304 01:18:57.566282 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.566432 kubelet[2701]: E0304 01:18:57.566411 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.567530 kubelet[2701]: E0304 01:18:57.567510 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.567652 kubelet[2701]: W0304 01:18:57.567632 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.567759 kubelet[2701]: E0304 01:18:57.567738 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:18:57.568491 kubelet[2701]: E0304 01:18:57.568381 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.569412 kubelet[2701]: W0304 01:18:57.568649 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.569412 kubelet[2701]: E0304 01:18:57.568675 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:18:57.570269 systemd[1]: Started cri-containerd-ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6.scope - libcontainer container ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6. Mar 4 01:18:57.571116 kubelet[2701]: E0304 01:18:57.571096 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:18:57.571278 kubelet[2701]: W0304 01:18:57.571223 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:18:57.571613 kubelet[2701]: E0304 01:18:57.571589 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 4 01:18:57.604002 kubelet[2701]: E0304 01:18:57.603843 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:18:57.604002 kubelet[2701]: W0304 01:18:57.603894 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:18:57.604002 kubelet[2701]: E0304 01:18:57.603939 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:18:57.631423 containerd[1516]: time="2026-03-04T01:18:57.631317430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b7566967d-gwrf7,Uid:9b8cb62c-6c94-477a-8e1b-7ec0c712260c,Namespace:calico-system,Attempt:0,} returns sandbox id \"7b62584eb835da2f5a1f398dc80b0e1a5cf9f31332063e638ad2451f58519b46\""
Mar 4 01:18:57.637295 containerd[1516]: time="2026-03-04T01:18:57.637253998Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 4 01:18:57.648959 containerd[1516]: time="2026-03-04T01:18:57.648896985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5rp7s,Uid:656dc4cf-f7f0-4acf-a15b-36d0d2711510,Namespace:calico-system,Attempt:0,} returns sandbox id \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\""
Mar 4 01:18:59.041838 kubelet[2701]: E0304 01:18:59.040955 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:18:59.235423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2673907564.mount: Deactivated successfully.
Mar 4 01:19:01.040015 kubelet[2701]: E0304 01:19:01.039873 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:01.524726 containerd[1516]: time="2026-03-04T01:19:01.524618525Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:01.527193 containerd[1516]: time="2026-03-04T01:19:01.527094457Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Mar 4 01:19:01.529157 containerd[1516]: time="2026-03-04T01:19:01.528450650Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:01.535792 containerd[1516]: time="2026-03-04T01:19:01.535745594Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.898431763s"
Mar 4 01:19:01.536096 containerd[1516]: time="2026-03-04T01:19:01.535977796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:01.537162 containerd[1516]: time="2026-03-04T01:19:01.537109291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Mar 4 01:19:01.540037 containerd[1516]: time="2026-03-04T01:19:01.540006494Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 4 01:19:01.593225 containerd[1516]: time="2026-03-04T01:19:01.593173646Z" level=info msg="CreateContainer within sandbox \"7b62584eb835da2f5a1f398dc80b0e1a5cf9f31332063e638ad2451f58519b46\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 4 01:19:01.620801 containerd[1516]: time="2026-03-04T01:19:01.620740472Z" level=info msg="CreateContainer within sandbox \"7b62584eb835da2f5a1f398dc80b0e1a5cf9f31332063e638ad2451f58519b46\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c668c8ff85774069e43cd8d4f9653e8f98d20292f4ba6662dcb0b1caa86a6d85\""
Mar 4 01:19:01.624109 containerd[1516]: time="2026-03-04T01:19:01.624036204Z" level=info msg="StartContainer for \"c668c8ff85774069e43cd8d4f9653e8f98d20292f4ba6662dcb0b1caa86a6d85\""
Mar 4 01:19:01.776553 systemd[1]: Started cri-containerd-c668c8ff85774069e43cd8d4f9653e8f98d20292f4ba6662dcb0b1caa86a6d85.scope - libcontainer container c668c8ff85774069e43cd8d4f9653e8f98d20292f4ba6662dcb0b1caa86a6d85.
Mar 4 01:19:01.945708 containerd[1516]: time="2026-03-04T01:19:01.945384726Z" level=info msg="StartContainer for \"c668c8ff85774069e43cd8d4f9653e8f98d20292f4ba6662dcb0b1caa86a6d85\" returns successfully"
Mar 4 01:19:02.313019 kubelet[2701]: I0304 01:19:02.311581 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b7566967d-gwrf7" podStartSLOduration=2.40884214 podStartE2EDuration="6.31152901s" podCreationTimestamp="2026-03-04 01:18:56 +0000 UTC" firstStartedPulling="2026-03-04 01:18:57.636515386 +0000 UTC m=+25.871775600" lastFinishedPulling="2026-03-04 01:19:01.53920225 +0000 UTC m=+29.774462470" observedRunningTime="2026-03-04 01:19:02.309415362 +0000 UTC m=+30.544675584" watchObservedRunningTime="2026-03-04 01:19:02.31152901 +0000 UTC m=+30.546789236"
Mar 4 01:19:02.337685 kubelet[2701]: E0304 01:19:02.337332 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.337685 kubelet[2701]: W0304 01:19:02.337375 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.337685 kubelet[2701]: E0304 01:19:02.337443 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 4 01:19:02.339304 kubelet[2701]: E0304 01:19:02.339124 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.339304 kubelet[2701]: W0304 01:19:02.339143 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.339304 kubelet[2701]: E0304 01:19:02.339173 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.339991 kubelet[2701]: E0304 01:19:02.339688 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.339991 kubelet[2701]: W0304 01:19:02.339706 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.339991 kubelet[2701]: E0304 01:19:02.339722 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.340295 kubelet[2701]: E0304 01:19:02.340276 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.340404 kubelet[2701]: W0304 01:19:02.340384 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.341084 kubelet[2701]: E0304 01:19:02.340495 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.341542 kubelet[2701]: E0304 01:19:02.341451 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.341542 kubelet[2701]: W0304 01:19:02.341469 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.341542 kubelet[2701]: E0304 01:19:02.341484 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.342487 kubelet[2701]: E0304 01:19:02.341934 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.342487 kubelet[2701]: W0304 01:19:02.341948 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.342487 kubelet[2701]: E0304 01:19:02.341998 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.344196 kubelet[2701]: E0304 01:19:02.343111 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.344196 kubelet[2701]: W0304 01:19:02.343132 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.344196 kubelet[2701]: E0304 01:19:02.343163 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.344771 kubelet[2701]: E0304 01:19:02.344585 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.344771 kubelet[2701]: W0304 01:19:02.344603 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.344771 kubelet[2701]: E0304 01:19:02.344618 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.345142 kubelet[2701]: E0304 01:19:02.345027 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.345142 kubelet[2701]: W0304 01:19:02.345067 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.345142 kubelet[2701]: E0304 01:19:02.345085 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.346101 kubelet[2701]: E0304 01:19:02.345940 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.346101 kubelet[2701]: W0304 01:19:02.345959 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.346101 kubelet[2701]: E0304 01:19:02.345987 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.347713 kubelet[2701]: E0304 01:19:02.346864 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.347713 kubelet[2701]: W0304 01:19:02.346881 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.347713 kubelet[2701]: E0304 01:19:02.346898 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.348016 kubelet[2701]: E0304 01:19:02.347996 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.348267 kubelet[2701]: W0304 01:19:02.348145 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.348267 kubelet[2701]: E0304 01:19:02.348172 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.349246 kubelet[2701]: E0304 01:19:02.349217 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.349481 kubelet[2701]: W0304 01:19:02.349345 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.349481 kubelet[2701]: E0304 01:19:02.349402 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.351187 kubelet[2701]: E0304 01:19:02.350990 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.351187 kubelet[2701]: W0304 01:19:02.351010 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.351187 kubelet[2701]: E0304 01:19:02.351026 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.351603 kubelet[2701]: E0304 01:19:02.351486 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.351603 kubelet[2701]: W0304 01:19:02.351504 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.351603 kubelet[2701]: E0304 01:19:02.351520 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.387010 kubelet[2701]: E0304 01:19:02.386767 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.387010 kubelet[2701]: W0304 01:19:02.386799 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.387010 kubelet[2701]: E0304 01:19:02.386824 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.387592 kubelet[2701]: E0304 01:19:02.387504 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.387592 kubelet[2701]: W0304 01:19:02.387522 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.387592 kubelet[2701]: E0304 01:19:02.387538 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 4 01:19:02.387930 kubelet[2701]: E0304 01:19:02.387908 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.387930 kubelet[2701]: W0304 01:19:02.387926 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.388041 kubelet[2701]: E0304 01:19:02.387944 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.391147 kubelet[2701]: E0304 01:19:02.390775 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.391147 kubelet[2701]: W0304 01:19:02.390795 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.391147 kubelet[2701]: E0304 01:19:02.390812 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.391374 kubelet[2701]: E0304 01:19:02.391216 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.391374 kubelet[2701]: W0304 01:19:02.391231 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.391374 kubelet[2701]: E0304 01:19:02.391247 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.392348 kubelet[2701]: E0304 01:19:02.391862 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.392348 kubelet[2701]: W0304 01:19:02.391877 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.392348 kubelet[2701]: E0304 01:19:02.391892 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.392348 kubelet[2701]: E0304 01:19:02.392255 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.392348 kubelet[2701]: W0304 01:19:02.392269 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.393524 kubelet[2701]: E0304 01:19:02.392346 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.394252 kubelet[2701]: E0304 01:19:02.394227 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.394252 kubelet[2701]: W0304 01:19:02.394247 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.394252 kubelet[2701]: E0304 01:19:02.394263 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.395327 kubelet[2701]: E0304 01:19:02.395305 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.395442 kubelet[2701]: W0304 01:19:02.395371 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.395442 kubelet[2701]: E0304 01:19:02.395392 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.395854 kubelet[2701]: E0304 01:19:02.395807 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.395854 kubelet[2701]: W0304 01:19:02.395826 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.395854 kubelet[2701]: E0304 01:19:02.395842 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.398261 kubelet[2701]: E0304 01:19:02.398223 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.398261 kubelet[2701]: W0304 01:19:02.398242 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.398881 kubelet[2701]: E0304 01:19:02.398275 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.399561 kubelet[2701]: E0304 01:19:02.399537 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.399561 kubelet[2701]: W0304 01:19:02.399558 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.399696 kubelet[2701]: E0304 01:19:02.399574 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.401450 kubelet[2701]: E0304 01:19:02.401378 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.401450 kubelet[2701]: W0304 01:19:02.401394 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.401450 kubelet[2701]: E0304 01:19:02.401410 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.402487 kubelet[2701]: E0304 01:19:02.402024 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.402487 kubelet[2701]: W0304 01:19:02.402038 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.402487 kubelet[2701]: E0304 01:19:02.402080 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.403162 kubelet[2701]: E0304 01:19:02.403138 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.403162 kubelet[2701]: W0304 01:19:02.403157 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.403283 kubelet[2701]: E0304 01:19:02.403173 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.403963 kubelet[2701]: E0304 01:19:02.403941 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.405980 kubelet[2701]: W0304 01:19:02.404315 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.405980 kubelet[2701]: E0304 01:19:02.404340 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.406761 kubelet[2701]: E0304 01:19:02.406733 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.406761 kubelet[2701]: W0304 01:19:02.406757 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.406944 kubelet[2701]: E0304 01:19:02.406776 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:02.408123 kubelet[2701]: E0304 01:19:02.408020 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:02.408123 kubelet[2701]: W0304 01:19:02.408040 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:02.408123 kubelet[2701]: E0304 01:19:02.408077 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 4 01:19:03.039660 kubelet[2701]: E0304 01:19:03.039563 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:03.305264 containerd[1516]: time="2026-03-04T01:19:03.305083604Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:03.306550 containerd[1516]: time="2026-03-04T01:19:03.306437661Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250"
Mar 4 01:19:03.307340 containerd[1516]: time="2026-03-04T01:19:03.307302541Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:03.312330 containerd[1516]: time="2026-03-04T01:19:03.312237590Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:03.317324 containerd[1516]: time="2026-03-04T01:19:03.316798931Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.776605161s"
Mar 4 01:19:03.317324 containerd[1516]: time="2026-03-04T01:19:03.316864771Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\""
Mar 4 01:19:03.323583 containerd[1516]: time="2026-03-04T01:19:03.323546427Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 4 01:19:03.358250 kubelet[2701]: E0304 01:19:03.358099 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.358250 kubelet[2701]: W0304 01:19:03.358159 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.358250 kubelet[2701]: E0304 01:19:03.358209 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.360938 kubelet[2701]: E0304 01:19:03.359126 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.360938 kubelet[2701]: W0304 01:19:03.359141 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.360938 kubelet[2701]: E0304 01:19:03.359157 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 4 01:19:03.360938 kubelet[2701]: E0304 01:19:03.360180 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.360938 kubelet[2701]: W0304 01:19:03.360195 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.360938 kubelet[2701]: E0304 01:19:03.360210 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.363986 kubelet[2701]: E0304 01:19:03.361532 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.363986 kubelet[2701]: W0304 01:19:03.361668 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.363986 kubelet[2701]: E0304 01:19:03.361687 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.363986 kubelet[2701]: E0304 01:19:03.362701 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.363986 kubelet[2701]: W0304 01:19:03.362716 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.363986 kubelet[2701]: E0304 01:19:03.362731 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.363986 kubelet[2701]: E0304 01:19:03.363654 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.363986 kubelet[2701]: W0304 01:19:03.363670 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.363986 kubelet[2701]: E0304 01:19:03.363685 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.364495 kubelet[2701]: E0304 01:19:03.364434 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.364495 kubelet[2701]: W0304 01:19:03.364449 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.364495 kubelet[2701]: E0304 01:19:03.364464 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.365351 kubelet[2701]: E0304 01:19:03.365262 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.365351 kubelet[2701]: W0304 01:19:03.365282 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.365351 kubelet[2701]: E0304 01:19:03.365298 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.366490 kubelet[2701]: E0304 01:19:03.366465 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.366490 kubelet[2701]: W0304 01:19:03.366486 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.366640 kubelet[2701]: E0304 01:19:03.366540 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.367450 kubelet[2701]: E0304 01:19:03.367425 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.367450 kubelet[2701]: W0304 01:19:03.367446 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.368027 kubelet[2701]: E0304 01:19:03.367935 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.368522 kubelet[2701]: E0304 01:19:03.368496 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.368522 kubelet[2701]: W0304 01:19:03.368516 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.368682 kubelet[2701]: E0304 01:19:03.368532 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 4 01:19:03.370284 kubelet[2701]: E0304 01:19:03.370004 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 4 01:19:03.370284 kubelet[2701]: W0304 01:19:03.370023 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 4 01:19:03.370284 kubelet[2701]: E0304 01:19:03.370041 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.372434 kubelet[2701]: E0304 01:19:03.370889 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.372434 kubelet[2701]: W0304 01:19:03.370909 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.372434 kubelet[2701]: E0304 01:19:03.371075 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.376575 kubelet[2701]: E0304 01:19:03.374916 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.376575 kubelet[2701]: W0304 01:19:03.374940 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.376575 kubelet[2701]: E0304 01:19:03.375080 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.378732 kubelet[2701]: E0304 01:19:03.378386 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.378732 kubelet[2701]: W0304 01:19:03.378410 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.378732 kubelet[2701]: E0304 01:19:03.378427 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.402384 kubelet[2701]: E0304 01:19:03.402330 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.402384 kubelet[2701]: W0304 01:19:03.402373 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.402649 kubelet[2701]: E0304 01:19:03.402399 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.402796 kubelet[2701]: E0304 01:19:03.402748 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.402796 kubelet[2701]: W0304 01:19:03.402795 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.402940 kubelet[2701]: E0304 01:19:03.402816 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.403167 kubelet[2701]: E0304 01:19:03.403138 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.403167 kubelet[2701]: W0304 01:19:03.403158 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.403291 kubelet[2701]: E0304 01:19:03.403174 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.403491 kubelet[2701]: E0304 01:19:03.403470 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.403491 kubelet[2701]: W0304 01:19:03.403489 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.403625 kubelet[2701]: E0304 01:19:03.403504 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.403814 kubelet[2701]: E0304 01:19:03.403794 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.403814 kubelet[2701]: W0304 01:19:03.403814 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.404890 kubelet[2701]: E0304 01:19:03.403829 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.404890 kubelet[2701]: E0304 01:19:03.404479 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.404890 kubelet[2701]: W0304 01:19:03.404502 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.404890 kubelet[2701]: E0304 01:19:03.404518 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.404890 kubelet[2701]: E0304 01:19:03.404806 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.404890 kubelet[2701]: W0304 01:19:03.404830 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.404890 kubelet[2701]: E0304 01:19:03.404845 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.405233 kubelet[2701]: E0304 01:19:03.405176 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.405233 kubelet[2701]: W0304 01:19:03.405192 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.405233 kubelet[2701]: E0304 01:19:03.405207 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.405599 kubelet[2701]: E0304 01:19:03.405572 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.405599 kubelet[2701]: W0304 01:19:03.405587 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.405726 kubelet[2701]: E0304 01:19:03.405602 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.406363 kubelet[2701]: E0304 01:19:03.406326 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.406363 kubelet[2701]: W0304 01:19:03.406360 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.406524 kubelet[2701]: E0304 01:19:03.406376 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.406748 kubelet[2701]: E0304 01:19:03.406728 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.406849 kubelet[2701]: W0304 01:19:03.406770 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.406849 kubelet[2701]: E0304 01:19:03.406788 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.407295 kubelet[2701]: E0304 01:19:03.407254 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.407361 kubelet[2701]: W0304 01:19:03.407297 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.407361 kubelet[2701]: E0304 01:19:03.407315 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.407767 kubelet[2701]: E0304 01:19:03.407746 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.407837 kubelet[2701]: W0304 01:19:03.407790 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.407837 kubelet[2701]: E0304 01:19:03.407810 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.408230 kubelet[2701]: E0304 01:19:03.408205 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.408230 kubelet[2701]: W0304 01:19:03.408228 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.408338 kubelet[2701]: E0304 01:19:03.408244 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.408648 kubelet[2701]: E0304 01:19:03.408628 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.408648 kubelet[2701]: W0304 01:19:03.408646 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.408762 kubelet[2701]: E0304 01:19:03.408662 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.409330 kubelet[2701]: E0304 01:19:03.409307 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.409330 kubelet[2701]: W0304 01:19:03.409327 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.421907 kubelet[2701]: E0304 01:19:03.409367 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.421907 kubelet[2701]: E0304 01:19:03.409716 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.421907 kubelet[2701]: W0304 01:19:03.409730 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.421907 kubelet[2701]: E0304 01:19:03.409744 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 4 01:19:03.421907 kubelet[2701]: E0304 01:19:03.410750 2701 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 4 01:19:03.421907 kubelet[2701]: W0304 01:19:03.410790 2701 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 4 01:19:03.421907 kubelet[2701]: E0304 01:19:03.410813 2701 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 4 01:19:03.437293 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3807313811.mount: Deactivated successfully. Mar 4 01:19:03.440146 containerd[1516]: time="2026-03-04T01:19:03.440082540Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c\"" Mar 4 01:19:03.443010 containerd[1516]: time="2026-03-04T01:19:03.442397865Z" level=info msg="StartContainer for \"02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c\"" Mar 4 01:19:03.503491 systemd[1]: Started cri-containerd-02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c.scope - libcontainer container 02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c. Mar 4 01:19:03.549933 containerd[1516]: time="2026-03-04T01:19:03.549662284Z" level=info msg="StartContainer for \"02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c\" returns successfully" Mar 4 01:19:03.577270 systemd[1]: cri-containerd-02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c.scope: Deactivated successfully. 
Mar 4 01:19:03.612073 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c-rootfs.mount: Deactivated successfully.
Mar 4 01:19:03.685569 containerd[1516]: time="2026-03-04T01:19:03.685337325Z" level=info msg="shim disconnected" id=02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c namespace=k8s.io
Mar 4 01:19:03.685569 containerd[1516]: time="2026-03-04T01:19:03.685486641Z" level=warning msg="cleaning up after shim disconnected" id=02cc47ed104914f65741f23a0d8c9d6dcaba920be38c0f58b7e0b932a97c365c namespace=k8s.io
Mar 4 01:19:03.685569 containerd[1516]: time="2026-03-04T01:19:03.685512720Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 4 01:19:04.296271 containerd[1516]: time="2026-03-04T01:19:04.296212715Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Mar 4 01:19:05.041566 kubelet[2701]: E0304 01:19:05.040668 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:07.040158 kubelet[2701]: E0304 01:19:07.039788 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:09.041481 kubelet[2701]: E0304 01:19:09.041120 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:11.044337 kubelet[2701]: E0304 01:19:11.044164 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:13.039936 kubelet[2701]: E0304 01:19:13.039688 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:15.040428 kubelet[2701]: E0304 01:19:15.039638 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:16.890460 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3673795966.mount: Deactivated successfully.
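The mount unit names in these entries (`var-lib-containerd-tmpmounts-containerd\x2dmount…`) show systemd's path-to-unit-name escaping: the leading slash is dropped, literal dashes in path components become `\x2d`, and the remaining `/` separators become `-`. A simplified sketch, sufficient for the paths in this log (real `systemd-escape` also encodes dots and other non-alphanumeric bytes, which this ignores):

```go
package main

import (
	"fmt"
	"strings"
)

// systemdEscapePath is a simplified sketch of systemd's path escaping:
// strip the leading "/", escape literal "-" as `\x2d`, then map the
// remaining "/" separators to "-". Not a full systemd-escape implementation.
func systemdEscapePath(path string) string {
	s := strings.TrimPrefix(path, "/")
	s = strings.ReplaceAll(s, "-", `\x2d`)
	return strings.ReplaceAll(s, "/", "-")
}

func main() {
	// Reconstructs the mount unit name seen in the log above.
	fmt.Println(systemdEscapePath("/var/lib/containerd/tmpmounts/containerd-mount3673795966") + ".mount")
	// var-lib-containerd-tmpmounts-containerd\x2dmount3673795966.mount
}
```

The escaping is what makes the unit name reversible: without `\x2d`, a dash inside a filename would be indistinguishable from a path separator.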
Mar 4 01:19:16.954221 containerd[1516]: time="2026-03-04T01:19:16.942949742Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:16.957848 containerd[1516]: time="2026-03-04T01:19:16.955816363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564"
Mar 4 01:19:16.957974 containerd[1516]: time="2026-03-04T01:19:16.956655136Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:16.961174 containerd[1516]: time="2026-03-04T01:19:16.961066408Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:16.962579 containerd[1516]: time="2026-03-04T01:19:16.961827151Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 12.665542104s"
Mar 4 01:19:16.962579 containerd[1516]: time="2026-03-04T01:19:16.961883431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\""
Mar 4 01:19:16.968696 containerd[1516]: time="2026-03-04T01:19:16.968563537Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 4 01:19:17.003411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1028237283.mount: Deactivated successfully.
Mar 4 01:19:17.010165 containerd[1516]: time="2026-03-04T01:19:17.009847837Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b\""
Mar 4 01:19:17.012777 containerd[1516]: time="2026-03-04T01:19:17.012515151Z" level=info msg="StartContainer for \"8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b\""
Mar 4 01:19:17.041345 kubelet[2701]: E0304 01:19:17.041257 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:17.110618 systemd[1]: Started cri-containerd-8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b.scope - libcontainer container 8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b.
Mar 4 01:19:17.203847 containerd[1516]: time="2026-03-04T01:19:17.203601671Z" level=info msg="StartContainer for \"8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b\" returns successfully"
Mar 4 01:19:17.269862 systemd[1]: cri-containerd-8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b.scope: Deactivated successfully.
Mar 4 01:19:17.307810 containerd[1516]: time="2026-03-04T01:19:17.307680880Z" level=info msg="shim disconnected" id=8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b namespace=k8s.io
Mar 4 01:19:17.314010 containerd[1516]: time="2026-03-04T01:19:17.308153537Z" level=warning msg="cleaning up after shim disconnected" id=8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b namespace=k8s.io
Mar 4 01:19:17.314010 containerd[1516]: time="2026-03-04T01:19:17.308214655Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 4 01:19:17.339814 containerd[1516]: time="2026-03-04T01:19:17.339758975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 4 01:19:17.885225 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a403daada01e20b4c924dc6efce177c03ea2f0cf5315dfded13a0231b14f56b-rootfs.mount: Deactivated successfully.
Mar 4 01:19:19.040482 kubelet[2701]: E0304 01:19:19.040129 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:21.042694 kubelet[2701]: E0304 01:19:21.040120 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:22.243571 containerd[1516]: time="2026-03-04T01:19:22.243472217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:22.246625 containerd[1516]: time="2026-03-04T01:19:22.245912332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671"
Mar 4 01:19:22.249554 containerd[1516]: time="2026-03-04T01:19:22.249479492Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:22.255782 containerd[1516]: time="2026-03-04T01:19:22.255722289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.915896543s"
Mar 4 01:19:22.256444 containerd[1516]: time="2026-03-04T01:19:22.256115497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\""
Mar 4 01:19:22.271428 containerd[1516]: time="2026-03-04T01:19:22.271373995Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 4 01:19:22.276705 containerd[1516]: time="2026-03-04T01:19:22.275646155Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 4 01:19:22.311928 containerd[1516]: time="2026-03-04T01:19:22.311878076Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2\""
Mar 4 01:19:22.313504 containerd[1516]: time="2026-03-04T01:19:22.313460940Z" level=info msg="StartContainer for \"94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2\""
Mar 4 01:19:22.398308 systemd[1]: Started cri-containerd-94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2.scope - libcontainer container 94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2.
Mar 4 01:19:22.453242 containerd[1516]: time="2026-03-04T01:19:22.453172640Z" level=info msg="StartContainer for \"94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2\" returns successfully"
Mar 4 01:19:23.039986 kubelet[2701]: E0304 01:19:23.039863 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c"
Mar 4 01:19:23.775412 systemd[1]: cri-containerd-94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2.scope: Deactivated successfully.
Mar 4 01:19:23.820295 kubelet[2701]: I0304 01:19:23.819656 2701 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Mar 4 01:19:23.829477 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2-rootfs.mount: Deactivated successfully.
Mar 4 01:19:23.834268 containerd[1516]: time="2026-03-04T01:19:23.834092773Z" level=info msg="shim disconnected" id=94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2 namespace=k8s.io
Mar 4 01:19:23.834820 containerd[1516]: time="2026-03-04T01:19:23.834269932Z" level=warning msg="cleaning up after shim disconnected" id=94856ab3e07e4925167851e04e369e208f6835d6a5f99ebd5216ad63bae680f2 namespace=k8s.io
Mar 4 01:19:23.834820 containerd[1516]: time="2026-03-04T01:19:23.834294943Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 4 01:19:23.928541 systemd[1]: Created slice kubepods-burstable-pod49b3e387_9ccb_4b0f_9de1_bd709b96a755.slice - libcontainer container kubepods-burstable-pod49b3e387_9ccb_4b0f_9de1_bd709b96a755.slice.
Mar 4 01:19:23.977580 systemd[1]: Created slice kubepods-besteffort-pod11453fdd_1482_4ede_8950_e97c22d85781.slice - libcontainer container kubepods-besteffort-pod11453fdd_1482_4ede_8950_e97c22d85781.slice.
Mar 4 01:19:23.997952 systemd[1]: Created slice kubepods-burstable-pod58ea317b_5eaa_44c7_a296_9b69e1cee2ab.slice - libcontainer container kubepods-burstable-pod58ea317b_5eaa_44c7_a296_9b69e1cee2ab.slice.
Mar 4 01:19:24.011618 systemd[1]: Created slice kubepods-besteffort-pod0181c27b_f567_403e_991b_d2716b48e52b.slice - libcontainer container kubepods-besteffort-pod0181c27b_f567_403e_991b_d2716b48e52b.slice.
Mar 4 01:19:24.024735 systemd[1]: Created slice kubepods-besteffort-pod0aba09a3_09ff_4e29_a406_0f9932fc94f6.slice - libcontainer container kubepods-besteffort-pod0aba09a3_09ff_4e29_a406_0f9932fc94f6.slice.
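The slice names above follow a simple convention: `kubepods-<qos>-pod<uid>.slice`, with the dashes in the pod UID mapped to underscores so they survive systemd's unit-name rules. A hypothetical helper reproducing the naming seen in the log (not the kubelet's actual cgroup-manager code):

```go
package main

import (
	"fmt"
	"strings"
)

// podSliceName reconstructs the cgroup slice names visible in the log:
// kubepods-<qos>-pod<uid>.slice, with "-" in the UID replaced by "_".
// Illustrative only; the kubelet derives these via its cgroup manager.
func podSliceName(qos, uid string) string {
	return "kubepods-" + qos + "-pod" + strings.ReplaceAll(uid, "-", "_") + ".slice"
}

func main() {
	// Matches the first "Created slice" entry above for the coredns pod.
	fmt.Println(podSliceName("burstable", "49b3e387-9ccb-4b0f-9de1-bd709b96a755"))
	// kubepods-burstable-pod49b3e387_9ccb_4b0f_9de1_bd709b96a755.slice
}
```

Burstable pods land under the `kubepods-burstable-…` branch and best-effort pods under `kubepods-besteffort-…`, which is why the two QoS classes appear as distinct slice prefixes in these entries.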
Mar 4 01:19:24.033170 kubelet[2701]: I0304 01:19:24.031110 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdrs6\" (UniqueName: \"kubernetes.io/projected/0aba09a3-09ff-4e29-a406-0f9932fc94f6-kube-api-access-qdrs6\") pod \"calico-apiserver-5d996b5b5-c257w\" (UID: \"0aba09a3-09ff-4e29-a406-0f9932fc94f6\") " pod="calico-system/calico-apiserver-5d996b5b5-c257w"
Mar 4 01:19:24.033408 kubelet[2701]: I0304 01:19:24.033383 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11453fdd-1482-4ede-8950-e97c22d85781-tigera-ca-bundle\") pod \"calico-kube-controllers-58bc6545dc-jbw6r\" (UID: \"11453fdd-1482-4ede-8950-e97c22d85781\") " pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r"
Mar 4 01:19:24.033592 kubelet[2701]: I0304 01:19:24.033563 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49b3e387-9ccb-4b0f-9de1-bd709b96a755-config-volume\") pod \"coredns-66bc5c9577-ghd5d\" (UID: \"49b3e387-9ccb-4b0f-9de1-bd709b96a755\") " pod="kube-system/coredns-66bc5c9577-ghd5d"
Mar 4 01:19:24.033749 kubelet[2701]: I0304 01:19:24.033724 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxsf\" (UniqueName: \"kubernetes.io/projected/11453fdd-1482-4ede-8950-e97c22d85781-kube-api-access-tmxsf\") pod \"calico-kube-controllers-58bc6545dc-jbw6r\" (UID: \"11453fdd-1482-4ede-8950-e97c22d85781\") " pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r"
Mar 4 01:19:24.033901 kubelet[2701]: I0304 01:19:24.033873 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0aba09a3-09ff-4e29-a406-0f9932fc94f6-calico-apiserver-certs\") pod \"calico-apiserver-5d996b5b5-c257w\" (UID: \"0aba09a3-09ff-4e29-a406-0f9932fc94f6\") " pod="calico-system/calico-apiserver-5d996b5b5-c257w"
Mar 4 01:19:24.034146 kubelet[2701]: I0304 01:19:24.034113 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gczms\" (UniqueName: \"kubernetes.io/projected/58ea317b-5eaa-44c7-a296-9b69e1cee2ab-kube-api-access-gczms\") pod \"coredns-66bc5c9577-dg6p5\" (UID: \"58ea317b-5eaa-44c7-a296-9b69e1cee2ab\") " pod="kube-system/coredns-66bc5c9577-dg6p5"
Mar 4 01:19:24.035343 kubelet[2701]: I0304 01:19:24.035144 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtn2\" (UniqueName: \"kubernetes.io/projected/49b3e387-9ccb-4b0f-9de1-bd709b96a755-kube-api-access-rxtn2\") pod \"coredns-66bc5c9577-ghd5d\" (UID: \"49b3e387-9ccb-4b0f-9de1-bd709b96a755\") " pod="kube-system/coredns-66bc5c9577-ghd5d"
Mar 4 01:19:24.035343 kubelet[2701]: I0304 01:19:24.035204 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58ea317b-5eaa-44c7-a296-9b69e1cee2ab-config-volume\") pod \"coredns-66bc5c9577-dg6p5\" (UID: \"58ea317b-5eaa-44c7-a296-9b69e1cee2ab\") " pod="kube-system/coredns-66bc5c9577-dg6p5"
Mar 4 01:19:24.037509 systemd[1]: Created slice kubepods-besteffort-pode815ce3e_bc88_40d3_b47c_e2c0c6843ef4.slice - libcontainer container kubepods-besteffort-pode815ce3e_bc88_40d3_b47c_e2c0c6843ef4.slice.
Mar 4 01:19:24.055000 systemd[1]: Created slice kubepods-besteffort-pod0036b8ad_7c49_4b71_addf_d1386c2532e8.slice - libcontainer container kubepods-besteffort-pod0036b8ad_7c49_4b71_addf_d1386c2532e8.slice.
Mar 4 01:19:24.140079 kubelet[2701]: I0304 01:19:24.136478 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/0036b8ad-7c49-4b71-addf-d1386c2532e8-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-hwgzw\" (UID: \"0036b8ad-7c49-4b71-addf-d1386c2532e8\") " pod="calico-system/goldmane-cccfbd5cf-hwgzw"
Mar 4 01:19:24.140079 kubelet[2701]: I0304 01:19:24.136619 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0181c27b-f567-403e-991b-d2716b48e52b-whisker-backend-key-pair\") pod \"whisker-64f8464b7b-m2c64\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " pod="calico-system/whisker-64f8464b7b-m2c64"
Mar 4 01:19:24.140079 kubelet[2701]: I0304 01:19:24.136698 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e815ce3e-bc88-40d3-b47c-e2c0c6843ef4-calico-apiserver-certs\") pod \"calico-apiserver-5d996b5b5-4hclm\" (UID: \"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4\") " pod="calico-system/calico-apiserver-5d996b5b5-4hclm"
Mar 4 01:19:24.140079 kubelet[2701]: I0304 01:19:24.136732 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bd5\" (UniqueName: \"kubernetes.io/projected/0181c27b-f567-403e-991b-d2716b48e52b-kube-api-access-f6bd5\") pod \"whisker-64f8464b7b-m2c64\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " pod="calico-system/whisker-64f8464b7b-m2c64"
Mar 4 01:19:24.140079 kubelet[2701]: I0304 01:19:24.136761 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmvf\" (UniqueName: \"kubernetes.io/projected/e815ce3e-bc88-40d3-b47c-e2c0c6843ef4-kube-api-access-zzmvf\") pod \"calico-apiserver-5d996b5b5-4hclm\" (UID: 
\"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4\") " pod="calico-system/calico-apiserver-5d996b5b5-4hclm" Mar 4 01:19:24.141401 kubelet[2701]: I0304 01:19:24.136819 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0036b8ad-7c49-4b71-addf-d1386c2532e8-config\") pod \"goldmane-cccfbd5cf-hwgzw\" (UID: \"0036b8ad-7c49-4b71-addf-d1386c2532e8\") " pod="calico-system/goldmane-cccfbd5cf-hwgzw" Mar 4 01:19:24.141401 kubelet[2701]: I0304 01:19:24.136849 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-nginx-config\") pod \"whisker-64f8464b7b-m2c64\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " pod="calico-system/whisker-64f8464b7b-m2c64" Mar 4 01:19:24.141401 kubelet[2701]: I0304 01:19:24.136878 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5ssh\" (UniqueName: \"kubernetes.io/projected/0036b8ad-7c49-4b71-addf-d1386c2532e8-kube-api-access-b5ssh\") pod \"goldmane-cccfbd5cf-hwgzw\" (UID: \"0036b8ad-7c49-4b71-addf-d1386c2532e8\") " pod="calico-system/goldmane-cccfbd5cf-hwgzw" Mar 4 01:19:24.141401 kubelet[2701]: I0304 01:19:24.136923 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0036b8ad-7c49-4b71-addf-d1386c2532e8-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-hwgzw\" (UID: \"0036b8ad-7c49-4b71-addf-d1386c2532e8\") " pod="calico-system/goldmane-cccfbd5cf-hwgzw" Mar 4 01:19:24.141401 kubelet[2701]: I0304 01:19:24.136952 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-whisker-ca-bundle\") pod 
\"whisker-64f8464b7b-m2c64\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " pod="calico-system/whisker-64f8464b7b-m2c64" Mar 4 01:19:24.276197 containerd[1516]: time="2026-03-04T01:19:24.276145390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-ghd5d,Uid:49b3e387-9ccb-4b0f-9de1-bd709b96a755,Namespace:kube-system,Attempt:0,}" Mar 4 01:19:24.293174 containerd[1516]: time="2026-03-04T01:19:24.292676322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58bc6545dc-jbw6r,Uid:11453fdd-1482-4ede-8950-e97c22d85781,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:24.307903 containerd[1516]: time="2026-03-04T01:19:24.307856775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-dg6p5,Uid:58ea317b-5eaa-44c7-a296-9b69e1cee2ab,Namespace:kube-system,Attempt:0,}" Mar 4 01:19:24.359925 containerd[1516]: time="2026-03-04T01:19:24.359520017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-4hclm,Uid:e815ce3e-bc88-40d3-b47c-e2c0c6843ef4,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:24.360818 containerd[1516]: time="2026-03-04T01:19:24.360780761Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64f8464b7b-m2c64,Uid:0181c27b-f567-403e-991b-d2716b48e52b,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:24.362914 containerd[1516]: time="2026-03-04T01:19:24.362881257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-c257w,Uid:0aba09a3-09ff-4e29-a406-0f9932fc94f6,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:24.370620 containerd[1516]: time="2026-03-04T01:19:24.370509220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-hwgzw,Uid:0036b8ad-7c49-4b71-addf-d1386c2532e8,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:24.494329 containerd[1516]: time="2026-03-04T01:19:24.492884075Z" level=info msg="CreateContainer within sandbox 
\"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 4 01:19:24.589626 containerd[1516]: time="2026-03-04T01:19:24.589557815Z" level=info msg="CreateContainer within sandbox \"ee6a614508325bf6bf8d5a0f596adccec957adae26c2ca167d2a0af973974aa6\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"be999855bfe87efaa77f75574df620deaf88cfd313f649212b51ed0583175c27\"" Mar 4 01:19:24.592443 containerd[1516]: time="2026-03-04T01:19:24.592406344Z" level=info msg="StartContainer for \"be999855bfe87efaa77f75574df620deaf88cfd313f649212b51ed0583175c27\"" Mar 4 01:19:24.726264 systemd[1]: Started cri-containerd-be999855bfe87efaa77f75574df620deaf88cfd313f649212b51ed0583175c27.scope - libcontainer container be999855bfe87efaa77f75574df620deaf88cfd313f649212b51ed0583175c27. Mar 4 01:19:24.911549 containerd[1516]: time="2026-03-04T01:19:24.909185808Z" level=error msg="Failed to destroy network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.917001 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b-shm.mount: Deactivated successfully. 
Mar 4 01:19:24.926542 containerd[1516]: time="2026-03-04T01:19:24.926360230Z" level=info msg="StartContainer for \"be999855bfe87efaa77f75574df620deaf88cfd313f649212b51ed0583175c27\" returns successfully" Mar 4 01:19:24.946636 containerd[1516]: time="2026-03-04T01:19:24.946216052Z" level=error msg="encountered an error cleaning up failed sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.946636 containerd[1516]: time="2026-03-04T01:19:24.946370500Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58bc6545dc-jbw6r,Uid:11453fdd-1482-4ede-8950-e97c22d85781,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.947475 kubelet[2701]: E0304 01:19:24.946928 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.960747 kubelet[2701]: E0304 01:19:24.960672 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r" Mar 4 01:19:24.960943 kubelet[2701]: E0304 01:19:24.960768 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r" Mar 4 01:19:24.960943 kubelet[2701]: E0304 01:19:24.960878 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58bc6545dc-jbw6r_calico-system(11453fdd-1482-4ede-8950-e97c22d85781)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58bc6545dc-jbw6r_calico-system(11453fdd-1482-4ede-8950-e97c22d85781)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r" podUID="11453fdd-1482-4ede-8950-e97c22d85781" Mar 4 01:19:24.970538 containerd[1516]: time="2026-03-04T01:19:24.970406065Z" level=error msg="Failed to destroy network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.971414 containerd[1516]: time="2026-03-04T01:19:24.971378225Z" level=error 
msg="encountered an error cleaning up failed sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.977389 containerd[1516]: time="2026-03-04T01:19:24.974156902Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64f8464b7b-m2c64,Uid:0181c27b-f567-403e-991b-d2716b48e52b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.984151 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471-shm.mount: Deactivated successfully. 
Mar 4 01:19:24.990334 kubelet[2701]: E0304 01:19:24.990278 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:24.992391 kubelet[2701]: E0304 01:19:24.992171 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-64f8464b7b-m2c64" Mar 4 01:19:24.992391 kubelet[2701]: E0304 01:19:24.992220 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-64f8464b7b-m2c64" Mar 4 01:19:24.992391 kubelet[2701]: E0304 01:19:24.992326 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-64f8464b7b-m2c64_calico-system(0181c27b-f567-403e-991b-d2716b48e52b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-64f8464b7b-m2c64_calico-system(0181c27b-f567-403e-991b-d2716b48e52b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-64f8464b7b-m2c64" podUID="0181c27b-f567-403e-991b-d2716b48e52b" Mar 4 01:19:25.032339 containerd[1516]: time="2026-03-04T01:19:25.031040670Z" level=error msg="Failed to destroy network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.043080 containerd[1516]: time="2026-03-04T01:19:25.036479992Z" level=error msg="encountered an error cleaning up failed sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.043080 containerd[1516]: time="2026-03-04T01:19:25.036611091Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-dg6p5,Uid:58ea317b-5eaa-44c7-a296-9b69e1cee2ab,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.043080 containerd[1516]: time="2026-03-04T01:19:25.037246684Z" level=error msg="Failed to destroy network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 
01:19:25.043080 containerd[1516]: time="2026-03-04T01:19:25.038957717Z" level=error msg="encountered an error cleaning up failed sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.043080 containerd[1516]: time="2026-03-04T01:19:25.039036836Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-hwgzw,Uid:0036b8ad-7c49-4b71-addf-d1386c2532e8,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.042463 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1-shm.mount: Deactivated successfully. 
Mar 4 01:19:25.043620 kubelet[2701]: E0304 01:19:25.039322 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.043620 kubelet[2701]: E0304 01:19:25.039770 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-dg6p5" Mar 4 01:19:25.043620 kubelet[2701]: E0304 01:19:25.039808 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-dg6p5" Mar 4 01:19:25.043620 kubelet[2701]: E0304 01:19:25.040443 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.053251 kubelet[2701]: E0304 01:19:25.040485 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-hwgzw" Mar 4 01:19:25.053251 kubelet[2701]: E0304 01:19:25.040541 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-hwgzw" Mar 4 01:19:25.053251 kubelet[2701]: E0304 01:19:25.040948 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-dg6p5_kube-system(58ea317b-5eaa-44c7-a296-9b69e1cee2ab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-dg6p5_kube-system(58ea317b-5eaa-44c7-a296-9b69e1cee2ab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-dg6p5" podUID="58ea317b-5eaa-44c7-a296-9b69e1cee2ab" Mar 4 01:19:25.052948 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a-shm.mount: Deactivated successfully. 
Mar 4 01:19:25.053658 containerd[1516]: time="2026-03-04T01:19:25.045796552Z" level=error msg="Failed to destroy network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.053658 containerd[1516]: time="2026-03-04T01:19:25.049084358Z" level=error msg="encountered an error cleaning up failed sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.053658 containerd[1516]: time="2026-03-04T01:19:25.049754816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-4hclm,Uid:e815ce3e-bc88-40d3-b47c-e2c0c6843ef4,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.053861 kubelet[2701]: E0304 01:19:25.040632 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-hwgzw_calico-system(0036b8ad-7c49-4b71-addf-d1386c2532e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-hwgzw_calico-system(0036b8ad-7c49-4b71-addf-d1386c2532e8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-hwgzw" podUID="0036b8ad-7c49-4b71-addf-d1386c2532e8" Mar 4 01:19:25.053861 kubelet[2701]: E0304 01:19:25.052649 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.053861 kubelet[2701]: E0304 01:19:25.053126 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5d996b5b5-4hclm" Mar 4 01:19:25.054766 kubelet[2701]: E0304 01:19:25.053294 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5d996b5b5-4hclm" Mar 4 01:19:25.054766 kubelet[2701]: E0304 01:19:25.054641 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d996b5b5-4hclm_calico-system(e815ce3e-bc88-40d3-b47c-e2c0c6843ef4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-5d996b5b5-4hclm_calico-system(e815ce3e-bc88-40d3-b47c-e2c0c6843ef4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5d996b5b5-4hclm" podUID="e815ce3e-bc88-40d3-b47c-e2c0c6843ef4" Mar 4 01:19:25.054411 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e-shm.mount: Deactivated successfully. Mar 4 01:19:25.075369 containerd[1516]: time="2026-03-04T01:19:25.074310133Z" level=error msg="Failed to destroy network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.077223 systemd[1]: Created slice kubepods-besteffort-pod621d6fb0_0e39_428b_8e9a_8e8c65b0d05c.slice - libcontainer container kubepods-besteffort-pod621d6fb0_0e39_428b_8e9a_8e8c65b0d05c.slice. 
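The sandbox failures repeated above all trace to one condition: the Calico CNI plugin cannot `stat /var/lib/calico/nodename`, because pod sandboxes are being set up before the freshly started calico-node container (`be999855…`) has written that file. Below is a minimal, hypothetical sketch of the existence check the plugin's error message describes — the path comes from the log text; the function name and demonstration values are illustrative, not part of Calico:

```shell
# Hypothetical sketch (not Calico code): reproduce the existence check the
# CNI plugin reports failing in the log above. The path /var/lib/calico/nodename
# is taken from the error text; names below are illustrative.
check_nodename() {
    if [ -f "$1" ]; then
        echo "nodename present: $(cat "$1")"
    else
        # Mirrors the diagnostic the plugin emits on failure.
        echo "stat $1: no such file or directory" >&2
        echo "check that the calico/node container is running and has mounted /var/lib/calico/" >&2
        return 1
    fi
}

# Demonstrate against a temporary file rather than the real path, which only
# exists once calico-node is up on the host.
tmp=$(mktemp)
printf 'example-node' > "$tmp"          # placeholder node name
check_nodename "$tmp"
check_nodename "$tmp.missing" || echo "sandbox setup would fail here"
rm -f "$tmp"
```

Once calico-node becomes ready and writes the nodename file, kubelet's retries of `RunPodSandbox` for these pods succeed without intervention.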
Mar 4 01:19:25.079127 containerd[1516]: time="2026-03-04T01:19:25.078831812Z" level=error msg="encountered an error cleaning up failed sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.079127 containerd[1516]: time="2026-03-04T01:19:25.078947335Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-c257w,Uid:0aba09a3-09ff-4e29-a406-0f9932fc94f6,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.081023 kubelet[2701]: E0304 01:19:25.079798 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.082558 kubelet[2701]: E0304 01:19:25.081225 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5d996b5b5-c257w" Mar 4 01:19:25.082558 kubelet[2701]: E0304 01:19:25.081278 2701 
kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-5d996b5b5-c257w" Mar 4 01:19:25.082558 kubelet[2701]: E0304 01:19:25.081391 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d996b5b5-c257w_calico-system(0aba09a3-09ff-4e29-a406-0f9932fc94f6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d996b5b5-c257w_calico-system(0aba09a3-09ff-4e29-a406-0f9932fc94f6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5d996b5b5-c257w" podUID="0aba09a3-09ff-4e29-a406-0f9932fc94f6" Mar 4 01:19:25.084323 containerd[1516]: time="2026-03-04T01:19:25.084131565Z" level=error msg="Failed to destroy network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.085627 containerd[1516]: time="2026-03-04T01:19:25.085544457Z" level=error msg="encountered an error cleaning up failed sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.085838 containerd[1516]: time="2026-03-04T01:19:25.085668108Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-ghd5d,Uid:49b3e387-9ccb-4b0f-9de1-bd709b96a755,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.086300 kubelet[2701]: E0304 01:19:25.086033 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.086300 kubelet[2701]: E0304 01:19:25.086123 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-ghd5d" Mar 4 01:19:25.086300 kubelet[2701]: E0304 01:19:25.086150 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-66bc5c9577-ghd5d" Mar 4 01:19:25.086469 kubelet[2701]: E0304 01:19:25.086236 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-ghd5d_kube-system(49b3e387-9ccb-4b0f-9de1-bd709b96a755)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-ghd5d_kube-system(49b3e387-9ccb-4b0f-9de1-bd709b96a755)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-ghd5d" podUID="49b3e387-9ccb-4b0f-9de1-bd709b96a755" Mar 4 01:19:25.092323 containerd[1516]: time="2026-03-04T01:19:25.089244675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fnbg8,Uid:621d6fb0-0e39-428b-8e9a-8e8c65b0d05c,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:25.195171 containerd[1516]: time="2026-03-04T01:19:25.194850413Z" level=error msg="Failed to destroy network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.195799 containerd[1516]: time="2026-03-04T01:19:25.195633655Z" level=error msg="encountered an error cleaning up failed sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.195799 containerd[1516]: 
time="2026-03-04T01:19:25.195729454Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fnbg8,Uid:621d6fb0-0e39-428b-8e9a-8e8c65b0d05c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.196175 kubelet[2701]: E0304 01:19:25.196121 2701 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.198214 kubelet[2701]: E0304 01:19:25.196198 2701 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fnbg8" Mar 4 01:19:25.198214 kubelet[2701]: E0304 01:19:25.196240 2701 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fnbg8" Mar 4 01:19:25.198214 kubelet[2701]: E0304 01:19:25.196382 2701 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fnbg8_calico-system(621d6fb0-0e39-428b-8e9a-8e8c65b0d05c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fnbg8_calico-system(621d6fb0-0e39-428b-8e9a-8e8c65b0d05c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fnbg8" podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c" Mar 4 01:19:25.477315 kubelet[2701]: I0304 01:19:25.463660 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:19:25.480171 kubelet[2701]: I0304 01:19:25.479511 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:25.486569 containerd[1516]: time="2026-03-04T01:19:25.483832374Z" level=info msg="StopPodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\"" Mar 4 01:19:25.486569 containerd[1516]: time="2026-03-04T01:19:25.484958720Z" level=info msg="StopPodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\"" Mar 4 01:19:25.486985 containerd[1516]: time="2026-03-04T01:19:25.486924605Z" level=info msg="Ensure that sandbox b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4 in task-service has been cleanup successfully" Mar 4 01:19:25.494029 containerd[1516]: time="2026-03-04T01:19:25.493963169Z" level=info msg="Ensure that sandbox dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471 in task-service has been cleanup successfully" Mar 4 01:19:25.495536 
kubelet[2701]: I0304 01:19:25.495490 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:19:25.501724 containerd[1516]: time="2026-03-04T01:19:25.501670210Z" level=info msg="StopPodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\"" Mar 4 01:19:25.505969 containerd[1516]: time="2026-03-04T01:19:25.505344108Z" level=info msg="Ensure that sandbox 312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a in task-service has been cleanup successfully" Mar 4 01:19:25.512074 kubelet[2701]: I0304 01:19:25.510787 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:19:25.512242 containerd[1516]: time="2026-03-04T01:19:25.511888155Z" level=info msg="StopPodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\"" Mar 4 01:19:25.515721 containerd[1516]: time="2026-03-04T01:19:25.515671039Z" level=info msg="Ensure that sandbox 999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1 in task-service has been cleanup successfully" Mar 4 01:19:25.516253 kubelet[2701]: I0304 01:19:25.516223 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:19:25.517481 containerd[1516]: time="2026-03-04T01:19:25.517323977Z" level=info msg="StopPodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\"" Mar 4 01:19:25.532116 containerd[1516]: time="2026-03-04T01:19:25.531713426Z" level=info msg="Ensure that sandbox 7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a in task-service has been cleanup successfully" Mar 4 01:19:25.543276 kubelet[2701]: I0304 01:19:25.542994 2701 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:19:25.553995 containerd[1516]: time="2026-03-04T01:19:25.550848073Z" level=info msg="StopPodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\"" Mar 4 01:19:25.558154 containerd[1516]: time="2026-03-04T01:19:25.556576285Z" level=info msg="Ensure that sandbox 5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b in task-service has been cleanup successfully" Mar 4 01:19:25.626794 kubelet[2701]: I0304 01:19:25.626588 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:19:25.639092 containerd[1516]: time="2026-03-04T01:19:25.638341810Z" level=info msg="StopPodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\"" Mar 4 01:19:25.639092 containerd[1516]: time="2026-03-04T01:19:25.638726793Z" level=info msg="Ensure that sandbox b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7 in task-service has been cleanup successfully" Mar 4 01:19:25.661675 kubelet[2701]: I0304 01:19:25.661627 2701 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Mar 4 01:19:25.679383 containerd[1516]: time="2026-03-04T01:19:25.679295833Z" level=info msg="StopPodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\"" Mar 4 01:19:25.681557 containerd[1516]: time="2026-03-04T01:19:25.681521213Z" level=info msg="Ensure that sandbox 495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e in task-service has been cleanup successfully" Mar 4 01:19:25.837704 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4-shm.mount: Deactivated successfully. 
Mar 4 01:19:25.837875 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a-shm.mount: Deactivated successfully. Mar 4 01:19:25.934004 containerd[1516]: time="2026-03-04T01:19:25.932007122Z" level=error msg="StopPodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" failed" error="failed to destroy network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.934004 containerd[1516]: time="2026-03-04T01:19:25.932490521Z" level=error msg="StopPodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" failed" error="failed to destroy network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:25.934824 kubelet[2701]: E0304 01:19:25.932683 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:19:25.938719 kubelet[2701]: E0304 01:19:25.935890 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:19:25.938719 kubelet[2701]: E0304 01:19:25.936620 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b"} Mar 4 01:19:25.938719 kubelet[2701]: E0304 01:19:25.936740 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4"} Mar 4 01:19:25.938719 kubelet[2701]: E0304 01:19:25.936822 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0aba09a3-09ff-4e29-a406-0f9932fc94f6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:25.938719 kubelet[2701]: E0304 01:19:25.936880 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0aba09a3-09ff-4e29-a406-0f9932fc94f6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5d996b5b5-c257w" podUID="0aba09a3-09ff-4e29-a406-0f9932fc94f6" Mar 4 01:19:25.942059 kubelet[2701]: E0304 01:19:25.936746 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult 
failed" err="failed to \"KillPodSandbox\" for \"11453fdd-1482-4ede-8950-e97c22d85781\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:25.942059 kubelet[2701]: E0304 01:19:25.936949 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"11453fdd-1482-4ede-8950-e97c22d85781\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r" podUID="11453fdd-1482-4ede-8950-e97c22d85781" Mar 4 01:19:26.041148 containerd[1516]: time="2026-03-04T01:19:26.037443518Z" level=error msg="StopPodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" failed" error="failed to destroy network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:26.041148 containerd[1516]: time="2026-03-04T01:19:26.038936010Z" level=error msg="StopPodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" failed" error="failed to destroy network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:26.041596 kubelet[2701]: E0304 01:19:26.040327 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:19:26.041596 kubelet[2701]: E0304 01:19:26.040456 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7"} Mar 4 01:19:26.041596 kubelet[2701]: E0304 01:19:26.040559 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:26.041596 kubelet[2701]: E0304 01:19:26.040617 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fnbg8" 
podUID="621d6fb0-0e39-428b-8e9a-8e8c65b0d05c" Mar 4 01:19:26.042026 kubelet[2701]: E0304 01:19:26.040716 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:19:26.042026 kubelet[2701]: E0304 01:19:26.040762 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1"} Mar 4 01:19:26.042026 kubelet[2701]: E0304 01:19:26.040813 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"58ea317b-5eaa-44c7-a296-9b69e1cee2ab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:26.042026 kubelet[2701]: E0304 01:19:26.040863 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"58ea317b-5eaa-44c7-a296-9b69e1cee2ab\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-dg6p5" 
podUID="58ea317b-5eaa-44c7-a296-9b69e1cee2ab" Mar 4 01:19:26.047112 containerd[1516]: time="2026-03-04T01:19:26.045341755Z" level=error msg="StopPodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" failed" error="failed to destroy network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:26.047214 kubelet[2701]: E0304 01:19:26.045999 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:26.047214 kubelet[2701]: E0304 01:19:26.046073 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471"} Mar 4 01:19:26.047214 kubelet[2701]: E0304 01:19:26.046129 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0181c27b-f567-403e-991b-d2716b48e52b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:26.047214 kubelet[2701]: E0304 01:19:26.046181 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"0181c27b-f567-403e-991b-d2716b48e52b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-64f8464b7b-m2c64" podUID="0181c27b-f567-403e-991b-d2716b48e52b" Mar 4 01:19:26.085704 containerd[1516]: time="2026-03-04T01:19:26.084266239Z" level=error msg="StopPodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" failed" error="failed to destroy network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:26.085955 kubelet[2701]: E0304 01:19:26.084724 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:19:26.085955 kubelet[2701]: E0304 01:19:26.084808 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a"} Mar 4 01:19:26.085955 kubelet[2701]: E0304 01:19:26.084860 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"49b3e387-9ccb-4b0f-9de1-bd709b96a755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed 
to destroy network for sandbox \\\"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:26.085955 kubelet[2701]: E0304 01:19:26.084918 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"49b3e387-9ccb-4b0f-9de1-bd709b96a755\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-ghd5d" podUID="49b3e387-9ccb-4b0f-9de1-bd709b96a755" Mar 4 01:19:26.103194 containerd[1516]: time="2026-03-04T01:19:26.100305913Z" level=error msg="StopPodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" failed" error="failed to destroy network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:26.103455 kubelet[2701]: E0304 01:19:26.100880 2701 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:19:26.103455 kubelet[2701]: E0304 
01:19:26.100955 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a"} Mar 4 01:19:26.103455 kubelet[2701]: E0304 01:19:26.100998 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:26.103455 kubelet[2701]: E0304 01:19:26.101035 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-5d996b5b5-4hclm" podUID="e815ce3e-bc88-40d3-b47c-e2c0c6843ef4" Mar 4 01:19:26.120658 containerd[1516]: time="2026-03-04T01:19:26.119204337Z" level=error msg="StopPodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" failed" error="failed to destroy network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 4 01:19:26.120886 kubelet[2701]: E0304 01:19:26.120333 2701 log.go:32] "StopPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Mar 4 01:19:26.120886 kubelet[2701]: E0304 01:19:26.120435 2701 kuberuntime_manager.go:1665] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"} Mar 4 01:19:26.120886 kubelet[2701]: E0304 01:19:26.120487 2701 kuberuntime_manager.go:1233] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0036b8ad-7c49-4b71-addf-d1386c2532e8\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 4 01:19:26.120886 kubelet[2701]: E0304 01:19:26.120573 2701 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0036b8ad-7c49-4b71-addf-d1386c2532e8\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-hwgzw" podUID="0036b8ad-7c49-4b71-addf-d1386c2532e8" Mar 4 01:19:26.245929 kubelet[2701]: I0304 01:19:26.242032 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="calico-system/calico-node-5rp7s" podStartSLOduration=4.628518432 podStartE2EDuration="29.237771112s" podCreationTimestamp="2026-03-04 01:18:57 +0000 UTC" firstStartedPulling="2026-03-04 01:18:57.650914827 +0000 UTC m=+25.886175040" lastFinishedPulling="2026-03-04 01:19:22.260167514 +0000 UTC m=+50.495427720" observedRunningTime="2026-03-04 01:19:25.750452886 +0000 UTC m=+53.985713105" watchObservedRunningTime="2026-03-04 01:19:26.237771112 +0000 UTC m=+54.473031330" Mar 4 01:19:26.667357 containerd[1516]: time="2026-03-04T01:19:26.667295926Z" level=info msg="StopPodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\"" Mar 4 01:19:26.722203 systemd[1]: run-containerd-runc-k8s.io-be999855bfe87efaa77f75574df620deaf88cfd313f649212b51ed0583175c27-runc.NexNtA.mount: Deactivated successfully. Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.854 [INFO][4061] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.855 [INFO][4061] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" iface="eth0" netns="/var/run/netns/cni-32ae3347-6cd5-30e1-21b3-528a102a768f" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.856 [INFO][4061] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" iface="eth0" netns="/var/run/netns/cni-32ae3347-6cd5-30e1-21b3-528a102a768f" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.857 [INFO][4061] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" iface="eth0" netns="/var/run/netns/cni-32ae3347-6cd5-30e1-21b3-528a102a768f" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.857 [INFO][4061] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.857 [INFO][4061] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.928 [INFO][4098] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.929 [INFO][4098] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.929 [INFO][4098] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.940 [WARNING][4098] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.940 [INFO][4098] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.942 [INFO][4098] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:26.949565 containerd[1516]: 2026-03-04 01:19:26.946 [INFO][4061] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:26.954019 containerd[1516]: time="2026-03-04T01:19:26.950136272Z" level=info msg="TearDown network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" successfully" Mar 4 01:19:26.954019 containerd[1516]: time="2026-03-04T01:19:26.950228037Z" level=info msg="StopPodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" returns successfully" Mar 4 01:19:26.955458 systemd[1]: run-netns-cni\x2d32ae3347\x2d6cd5\x2d30e1\x2d21b3\x2d528a102a768f.mount: Deactivated successfully. 
Mar 4 01:19:27.064615 kubelet[2701]: I0304 01:19:27.063625 2701 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-whisker-ca-bundle\") pod \"0181c27b-f567-403e-991b-d2716b48e52b\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " Mar 4 01:19:27.064615 kubelet[2701]: I0304 01:19:27.063699 2701 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-nginx-config\") pod \"0181c27b-f567-403e-991b-d2716b48e52b\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " Mar 4 01:19:27.064615 kubelet[2701]: I0304 01:19:27.063810 2701 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0181c27b-f567-403e-991b-d2716b48e52b-whisker-backend-key-pair\") pod \"0181c27b-f567-403e-991b-d2716b48e52b\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " Mar 4 01:19:27.064615 kubelet[2701]: I0304 01:19:27.063896 2701 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6bd5\" (UniqueName: \"kubernetes.io/projected/0181c27b-f567-403e-991b-d2716b48e52b-kube-api-access-f6bd5\") pod \"0181c27b-f567-403e-991b-d2716b48e52b\" (UID: \"0181c27b-f567-403e-991b-d2716b48e52b\") " Mar 4 01:19:27.074986 systemd[1]: var-lib-kubelet-pods-0181c27b\x2df567\x2d403e\x2d991b\x2dd2716b48e52b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2df6bd5.mount: Deactivated successfully. Mar 4 01:19:27.078962 kubelet[2701]: I0304 01:19:27.072390 2701 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "0181c27b-f567-403e-991b-d2716b48e52b" (UID: "0181c27b-f567-403e-991b-d2716b48e52b"). 
InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 4 01:19:27.080069 kubelet[2701]: I0304 01:19:27.079925 2701 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0181c27b-f567-403e-991b-d2716b48e52b-kube-api-access-f6bd5" (OuterVolumeSpecName: "kube-api-access-f6bd5") pod "0181c27b-f567-403e-991b-d2716b48e52b" (UID: "0181c27b-f567-403e-991b-d2716b48e52b"). InnerVolumeSpecName "kube-api-access-f6bd5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 4 01:19:27.080069 kubelet[2701]: I0304 01:19:27.072344 2701 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "0181c27b-f567-403e-991b-d2716b48e52b" (UID: "0181c27b-f567-403e-991b-d2716b48e52b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 4 01:19:27.083716 kubelet[2701]: I0304 01:19:27.083667 2701 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0181c27b-f567-403e-991b-d2716b48e52b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "0181c27b-f567-403e-991b-d2716b48e52b" (UID: "0181c27b-f567-403e-991b-d2716b48e52b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 4 01:19:27.088770 systemd[1]: var-lib-kubelet-pods-0181c27b\x2df567\x2d403e\x2d991b\x2dd2716b48e52b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 4 01:19:27.165387 kubelet[2701]: I0304 01:19:27.164739 2701 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6bd5\" (UniqueName: \"kubernetes.io/projected/0181c27b-f567-403e-991b-d2716b48e52b-kube-api-access-f6bd5\") on node \"srv-8wmcq.gb1.brightbox.com\" DevicePath \"\"" Mar 4 01:19:27.165754 kubelet[2701]: I0304 01:19:27.165651 2701 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-whisker-ca-bundle\") on node \"srv-8wmcq.gb1.brightbox.com\" DevicePath \"\"" Mar 4 01:19:27.165754 kubelet[2701]: I0304 01:19:27.165684 2701 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/0181c27b-f567-403e-991b-d2716b48e52b-nginx-config\") on node \"srv-8wmcq.gb1.brightbox.com\" DevicePath \"\"" Mar 4 01:19:27.165754 kubelet[2701]: I0304 01:19:27.165703 2701 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0181c27b-f567-403e-991b-d2716b48e52b-whisker-backend-key-pair\") on node \"srv-8wmcq.gb1.brightbox.com\" DevicePath \"\"" Mar 4 01:19:27.688411 systemd[1]: Removed slice kubepods-besteffort-pod0181c27b_f567_403e_991b_d2716b48e52b.slice - libcontainer container kubepods-besteffort-pod0181c27b_f567_403e_991b_d2716b48e52b.slice. Mar 4 01:19:27.844015 systemd[1]: Created slice kubepods-besteffort-pod5fe3b977_6f7e_4715_bb07_29355c94a222.slice - libcontainer container kubepods-besteffort-pod5fe3b977_6f7e_4715_bb07_29355c94a222.slice. 
Mar 4 01:19:27.982470 kubelet[2701]: I0304 01:19:27.981660 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5fe3b977-6f7e-4715-bb07-29355c94a222-whisker-backend-key-pair\") pod \"whisker-5cdbc7c47c-brr6w\" (UID: \"5fe3b977-6f7e-4715-bb07-29355c94a222\") " pod="calico-system/whisker-5cdbc7c47c-brr6w" Mar 4 01:19:27.982470 kubelet[2701]: I0304 01:19:27.981721 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fe3b977-6f7e-4715-bb07-29355c94a222-whisker-ca-bundle\") pod \"whisker-5cdbc7c47c-brr6w\" (UID: \"5fe3b977-6f7e-4715-bb07-29355c94a222\") " pod="calico-system/whisker-5cdbc7c47c-brr6w" Mar 4 01:19:27.982470 kubelet[2701]: I0304 01:19:27.981767 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zqp\" (UniqueName: \"kubernetes.io/projected/5fe3b977-6f7e-4715-bb07-29355c94a222-kube-api-access-q6zqp\") pod \"whisker-5cdbc7c47c-brr6w\" (UID: \"5fe3b977-6f7e-4715-bb07-29355c94a222\") " pod="calico-system/whisker-5cdbc7c47c-brr6w" Mar 4 01:19:27.982470 kubelet[2701]: I0304 01:19:27.981814 2701 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5fe3b977-6f7e-4715-bb07-29355c94a222-nginx-config\") pod \"whisker-5cdbc7c47c-brr6w\" (UID: \"5fe3b977-6f7e-4715-bb07-29355c94a222\") " pod="calico-system/whisker-5cdbc7c47c-brr6w" Mar 4 01:19:28.052640 kubelet[2701]: I0304 01:19:28.052271 2701 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0181c27b-f567-403e-991b-d2716b48e52b" path="/var/lib/kubelet/pods/0181c27b-f567-403e-991b-d2716b48e52b/volumes" Mar 4 01:19:28.171851 containerd[1516]: time="2026-03-04T01:19:28.171794421Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-5cdbc7c47c-brr6w,Uid:5fe3b977-6f7e-4715-bb07-29355c94a222,Namespace:calico-system,Attempt:0,}" Mar 4 01:19:28.547103 systemd-networkd[1434]: caliebceb332f22: Link UP Mar 4 01:19:28.549184 systemd-networkd[1434]: caliebceb332f22: Gained carrier Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.274 [ERROR][4221] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.304 [INFO][4221] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0 whisker-5cdbc7c47c- calico-system 5fe3b977-6f7e-4715-bb07-29355c94a222 946 0 2026-03-04 01:19:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5cdbc7c47c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com whisker-5cdbc7c47c-brr6w eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliebceb332f22 [] [] }} ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.304 [INFO][4221] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.408 [INFO][4234] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" HandleID="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.431 [INFO][4234] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" HandleID="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00038e090), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"whisker-5cdbc7c47c-brr6w", "timestamp":"2026-03-04 01:19:28.408988705 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003989a0)} Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.431 [INFO][4234] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.431 [INFO][4234] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.432 [INFO][4234] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.438 [INFO][4234] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.456 [INFO][4234] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.474 [INFO][4234] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.478 [INFO][4234] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.482 [INFO][4234] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.482 [INFO][4234] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.484 [INFO][4234] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1 Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.494 [INFO][4234] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.504 [INFO][4234] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.193/26] block=192.168.45.192/26 handle="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.504 [INFO][4234] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.193/26] handle="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.504 [INFO][4234] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:28.618167 containerd[1516]: 2026-03-04 01:19:28.504 [INFO][4234] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.193/26] IPv6=[] ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" HandleID="k8s-pod-network.330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.624699 containerd[1516]: 2026-03-04 01:19:28.509 [INFO][4221] cni-plugin/k8s.go 418: Populated endpoint ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0", GenerateName:"whisker-5cdbc7c47c-", Namespace:"calico-system", SelfLink:"", UID:"5fe3b977-6f7e-4715-bb07-29355c94a222", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cdbc7c47c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"whisker-5cdbc7c47c-brr6w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.45.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliebceb332f22", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:28.624699 containerd[1516]: 2026-03-04 01:19:28.509 [INFO][4221] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.193/32] ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.624699 containerd[1516]: 2026-03-04 01:19:28.509 [INFO][4221] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliebceb332f22 ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.624699 containerd[1516]: 2026-03-04 01:19:28.553 [INFO][4221] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.624699 containerd[1516]: 2026-03-04 01:19:28.572 [INFO][4221] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0", GenerateName:"whisker-5cdbc7c47c-", Namespace:"calico-system", SelfLink:"", UID:"5fe3b977-6f7e-4715-bb07-29355c94a222", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 19, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5cdbc7c47c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1", Pod:"whisker-5cdbc7c47c-brr6w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.45.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliebceb332f22", MAC:"32:17:91:cf:eb:a4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:28.624699 containerd[1516]: 2026-03-04 01:19:28.599 [INFO][4221] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1" Namespace="calico-system" Pod="whisker-5cdbc7c47c-brr6w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--5cdbc7c47c--brr6w-eth0" Mar 4 01:19:28.709101 containerd[1516]: time="2026-03-04T01:19:28.707998702Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:28.709101 containerd[1516]: time="2026-03-04T01:19:28.708154719Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:28.709101 containerd[1516]: time="2026-03-04T01:19:28.708178247Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:28.709101 containerd[1516]: time="2026-03-04T01:19:28.708359268Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:28.763479 systemd[1]: Started cri-containerd-330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1.scope - libcontainer container 330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1. 
Mar 4 01:19:28.880482 containerd[1516]: time="2026-03-04T01:19:28.877644327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5cdbc7c47c-brr6w,Uid:5fe3b977-6f7e-4715-bb07-29355c94a222,Namespace:calico-system,Attempt:0,} returns sandbox id \"330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1\"" Mar 4 01:19:28.895447 containerd[1516]: time="2026-03-04T01:19:28.895386748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 4 01:19:28.978396 kernel: calico-node[4162]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 4 01:19:29.779682 systemd-networkd[1434]: caliebceb332f22: Gained IPv6LL Mar 4 01:19:30.032358 systemd-networkd[1434]: vxlan.calico: Link UP Mar 4 01:19:30.032382 systemd-networkd[1434]: vxlan.calico: Gained carrier Mar 4 01:19:31.327471 containerd[1516]: time="2026-03-04T01:19:31.326328196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 4 01:19:31.363908 containerd[1516]: time="2026-03-04T01:19:31.363224354Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 2.467747335s" Mar 4 01:19:31.363908 containerd[1516]: time="2026-03-04T01:19:31.363345907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 4 01:19:31.392561 containerd[1516]: time="2026-03-04T01:19:31.391939273Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:31.394251 containerd[1516]: time="2026-03-04T01:19:31.393379474Z" 
level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:31.395499 containerd[1516]: time="2026-03-04T01:19:31.394460779Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:31.412605 containerd[1516]: time="2026-03-04T01:19:31.412561040Z" level=info msg="CreateContainer within sandbox \"330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 4 01:19:31.501633 containerd[1516]: time="2026-03-04T01:19:31.501480416Z" level=info msg="CreateContainer within sandbox \"330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"2f132502eaea49ab0f8e1236d4269db0805852b737fad51c0ec9b6375498c997\"" Mar 4 01:19:31.503237 containerd[1516]: time="2026-03-04T01:19:31.503188259Z" level=info msg="StartContainer for \"2f132502eaea49ab0f8e1236d4269db0805852b737fad51c0ec9b6375498c997\"" Mar 4 01:19:31.660278 systemd[1]: Started cri-containerd-2f132502eaea49ab0f8e1236d4269db0805852b737fad51c0ec9b6375498c997.scope - libcontainer container 2f132502eaea49ab0f8e1236d4269db0805852b737fad51c0ec9b6375498c997. 
Mar 4 01:19:31.738975 containerd[1516]: time="2026-03-04T01:19:31.738922938Z" level=info msg="StartContainer for \"2f132502eaea49ab0f8e1236d4269db0805852b737fad51c0ec9b6375498c997\" returns successfully" Mar 4 01:19:31.741296 systemd-networkd[1434]: vxlan.calico: Gained IPv6LL Mar 4 01:19:31.744971 containerd[1516]: time="2026-03-04T01:19:31.744929459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 4 01:19:32.031691 containerd[1516]: time="2026-03-04T01:19:32.030119052Z" level=info msg="StopPodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\"" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.179 [WARNING][4490] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.181 [INFO][4490] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.181 [INFO][4490] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" iface="eth0" netns="" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.181 [INFO][4490] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.181 [INFO][4490] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.345 [INFO][4499] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.346 [INFO][4499] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.346 [INFO][4499] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.356 [WARNING][4499] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.356 [INFO][4499] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.359 [INFO][4499] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:32.364172 containerd[1516]: 2026-03-04 01:19:32.361 [INFO][4490] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.366376 containerd[1516]: time="2026-03-04T01:19:32.364187508Z" level=info msg="TearDown network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" successfully" Mar 4 01:19:32.366376 containerd[1516]: time="2026-03-04T01:19:32.364230770Z" level=info msg="StopPodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" returns successfully" Mar 4 01:19:32.366376 containerd[1516]: time="2026-03-04T01:19:32.365487535Z" level=info msg="RemovePodSandbox for \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\"" Mar 4 01:19:32.369120 containerd[1516]: time="2026-03-04T01:19:32.369067935Z" level=info msg="Forcibly stopping sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\"" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.420 [WARNING][4513] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.421 [INFO][4513] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.421 [INFO][4513] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" iface="eth0" netns="" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.421 [INFO][4513] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.421 [INFO][4513] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.457 [INFO][4520] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.457 [INFO][4520] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.457 [INFO][4520] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.468 [WARNING][4520] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.468 [INFO][4520] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" HandleID="k8s-pod-network.dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Workload="srv--8wmcq.gb1.brightbox.com-k8s-whisker--64f8464b7b--m2c64-eth0" Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.470 [INFO][4520] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:32.475670 containerd[1516]: 2026-03-04 01:19:32.473 [INFO][4513] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471" Mar 4 01:19:32.475670 containerd[1516]: time="2026-03-04T01:19:32.475516904Z" level=info msg="TearDown network for sandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" successfully" Mar 4 01:19:32.483667 containerd[1516]: time="2026-03-04T01:19:32.483589153Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 4 01:19:32.483859 containerd[1516]: time="2026-03-04T01:19:32.483695763Z" level=info msg="RemovePodSandbox \"dfd41188f52d6b09301f7a0852658778177a2a7932bbe8350b718bda13580471\" returns successfully" Mar 4 01:19:34.029495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2119390200.mount: Deactivated successfully. 
Mar 4 01:19:34.053658 containerd[1516]: time="2026-03-04T01:19:34.052120670Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:34.054430 containerd[1516]: time="2026-03-04T01:19:34.054357595Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 4 01:19:34.058394 containerd[1516]: time="2026-03-04T01:19:34.058359182Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:34.064672 containerd[1516]: time="2026-03-04T01:19:34.064624584Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:34.066448 containerd[1516]: time="2026-03-04T01:19:34.066402500Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.321402088s" Mar 4 01:19:34.066615 containerd[1516]: time="2026-03-04T01:19:34.066585070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 4 01:19:34.076026 containerd[1516]: time="2026-03-04T01:19:34.075984695Z" level=info msg="CreateContainer within sandbox \"330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 4 01:19:34.104267 
containerd[1516]: time="2026-03-04T01:19:34.104210178Z" level=info msg="CreateContainer within sandbox \"330350f19dfc7be3273226e43dcea47a0f75d2800ae7189c699ab79d279728c1\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"04463b8ac8e0eed89655378679d008963ef0c723e9b85767a86074a7b0f8f074\"" Mar 4 01:19:34.105858 containerd[1516]: time="2026-03-04T01:19:34.105817470Z" level=info msg="StartContainer for \"04463b8ac8e0eed89655378679d008963ef0c723e9b85767a86074a7b0f8f074\"" Mar 4 01:19:34.162627 systemd[1]: Started cri-containerd-04463b8ac8e0eed89655378679d008963ef0c723e9b85767a86074a7b0f8f074.scope - libcontainer container 04463b8ac8e0eed89655378679d008963ef0c723e9b85767a86074a7b0f8f074. Mar 4 01:19:34.240972 containerd[1516]: time="2026-03-04T01:19:34.240883406Z" level=info msg="StartContainer for \"04463b8ac8e0eed89655378679d008963ef0c723e9b85767a86074a7b0f8f074\" returns successfully" Mar 4 01:19:34.767504 kubelet[2701]: I0304 01:19:34.767213 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5cdbc7c47c-brr6w" podStartSLOduration=2.579479685 podStartE2EDuration="7.764977315s" podCreationTimestamp="2026-03-04 01:19:27 +0000 UTC" firstStartedPulling="2026-03-04 01:19:28.882974409 +0000 UTC m=+57.118234615" lastFinishedPulling="2026-03-04 01:19:34.068472039 +0000 UTC m=+62.303732245" observedRunningTime="2026-03-04 01:19:34.761294312 +0000 UTC m=+62.996554533" watchObservedRunningTime="2026-03-04 01:19:34.764977315 +0000 UTC m=+63.000237527" Mar 4 01:19:37.045267 containerd[1516]: time="2026-03-04T01:19:37.043103965Z" level=info msg="StopPodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\"" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.138 [INFO][4597] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.138 [INFO][4597] 
cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" iface="eth0" netns="/var/run/netns/cni-8c1d0f17-0a88-3c43-8144-c36ad81d40d3" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.139 [INFO][4597] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" iface="eth0" netns="/var/run/netns/cni-8c1d0f17-0a88-3c43-8144-c36ad81d40d3" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.142 [INFO][4597] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" iface="eth0" netns="/var/run/netns/cni-8c1d0f17-0a88-3c43-8144-c36ad81d40d3" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.142 [INFO][4597] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.142 [INFO][4597] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.212 [INFO][4604] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.213 [INFO][4604] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.213 [INFO][4604] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.224 [WARNING][4604] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.224 [INFO][4604] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.227 [INFO][4604] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:37.231690 containerd[1516]: 2026-03-04 01:19:37.229 [INFO][4597] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:19:37.234731 containerd[1516]: time="2026-03-04T01:19:37.232189312Z" level=info msg="TearDown network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" successfully" Mar 4 01:19:37.234731 containerd[1516]: time="2026-03-04T01:19:37.232253461Z" level=info msg="StopPodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" returns successfully" Mar 4 01:19:37.239156 systemd[1]: run-netns-cni\x2d8c1d0f17\x2d0a88\x2d3c43\x2d8144\x2dc36ad81d40d3.mount: Deactivated successfully. 
Mar 4 01:19:37.242318 containerd[1516]: time="2026-03-04T01:19:37.242105372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-ghd5d,Uid:49b3e387-9ccb-4b0f-9de1-bd709b96a755,Namespace:kube-system,Attempt:1,}" Mar 4 01:19:37.512367 systemd-networkd[1434]: cali3233f0f9962: Link UP Mar 4 01:19:37.512796 systemd-networkd[1434]: cali3233f0f9962: Gained carrier Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.340 [INFO][4612] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0 coredns-66bc5c9577- kube-system 49b3e387-9ccb-4b0f-9de1-bd709b96a755 989 0 2026-03-04 01:18:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com coredns-66bc5c9577-ghd5d eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3233f0f9962 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.341 [INFO][4612] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.397 [INFO][4624] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" 
HandleID="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.416 [INFO][4624] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" HandleID="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef4b0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"coredns-66bc5c9577-ghd5d", "timestamp":"2026-03-04 01:19:37.397225778 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003e8f20)} Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.416 [INFO][4624] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.416 [INFO][4624] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.417 [INFO][4624] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.427 [INFO][4624] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.440 [INFO][4624] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.450 [INFO][4624] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.454 [INFO][4624] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.459 [INFO][4624] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.459 [INFO][4624] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.462 [INFO][4624] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0 Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.469 [INFO][4624] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.478 [INFO][4624] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.194/26] block=192.168.45.192/26 handle="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.478 [INFO][4624] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.194/26] handle="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.478 [INFO][4624] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:37.548992 containerd[1516]: 2026-03-04 01:19:37.478 [INFO][4624] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.194/26] IPv6=[] ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" HandleID="k8s-pod-network.b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.550885 containerd[1516]: 2026-03-04 01:19:37.485 [INFO][4612] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"49b3e387-9ccb-4b0f-9de1-bd709b96a755", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"coredns-66bc5c9577-ghd5d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3233f0f9962", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:37.550885 containerd[1516]: 2026-03-04 01:19:37.485 [INFO][4612] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.194/32] ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.550885 containerd[1516]: 2026-03-04 01:19:37.485 [INFO][4612] cni-plugin/dataplane_linux.go 69: Setting the 
host side veth name to cali3233f0f9962 ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.550885 containerd[1516]: 2026-03-04 01:19:37.506 [INFO][4612] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.550885 containerd[1516]: 2026-03-04 01:19:37.506 [INFO][4612] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"49b3e387-9ccb-4b0f-9de1-bd709b96a755", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", 
ContainerID:"b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0", Pod:"coredns-66bc5c9577-ghd5d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3233f0f9962", MAC:"ae:02:0f:38:64:ea", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:37.553368 containerd[1516]: 2026-03-04 01:19:37.536 [INFO][4612] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0" Namespace="kube-system" Pod="coredns-66bc5c9577-ghd5d" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:19:37.603852 containerd[1516]: time="2026-03-04T01:19:37.603671034Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:37.604442 containerd[1516]: time="2026-03-04T01:19:37.604150212Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:37.604442 containerd[1516]: time="2026-03-04T01:19:37.604227074Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:37.604941 containerd[1516]: time="2026-03-04T01:19:37.604763240Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:37.656303 systemd[1]: Started cri-containerd-b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0.scope - libcontainer container b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0. Mar 4 01:19:37.729019 containerd[1516]: time="2026-03-04T01:19:37.728824829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-ghd5d,Uid:49b3e387-9ccb-4b0f-9de1-bd709b96a755,Namespace:kube-system,Attempt:1,} returns sandbox id \"b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0\"" Mar 4 01:19:37.738424 containerd[1516]: time="2026-03-04T01:19:37.738372402Z" level=info msg="CreateContainer within sandbox \"b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 4 01:19:37.771038 containerd[1516]: time="2026-03-04T01:19:37.770036808Z" level=info msg="CreateContainer within sandbox \"b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5d567a6b114aa73064e5970b7b561e9df069df7354032c06be4b71895356009d\"" Mar 4 01:19:37.773334 containerd[1516]: time="2026-03-04T01:19:37.772770936Z" level=info msg="StartContainer for \"5d567a6b114aa73064e5970b7b561e9df069df7354032c06be4b71895356009d\"" Mar 4 01:19:37.812330 systemd[1]: Started cri-containerd-5d567a6b114aa73064e5970b7b561e9df069df7354032c06be4b71895356009d.scope - libcontainer container 
5d567a6b114aa73064e5970b7b561e9df069df7354032c06be4b71895356009d. Mar 4 01:19:37.865386 containerd[1516]: time="2026-03-04T01:19:37.865301779Z" level=info msg="StartContainer for \"5d567a6b114aa73064e5970b7b561e9df069df7354032c06be4b71895356009d\" returns successfully" Mar 4 01:19:38.044392 containerd[1516]: time="2026-03-04T01:19:38.043795970Z" level=info msg="StopPodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\"" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.156 [INFO][4733] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.156 [INFO][4733] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" iface="eth0" netns="/var/run/netns/cni-803270bf-16c0-fb27-35be-34d2a6c5ce83" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.157 [INFO][4733] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" iface="eth0" netns="/var/run/netns/cni-803270bf-16c0-fb27-35be-34d2a6c5ce83" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.158 [INFO][4733] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" iface="eth0" netns="/var/run/netns/cni-803270bf-16c0-fb27-35be-34d2a6c5ce83" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.158 [INFO][4733] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.158 [INFO][4733] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.210 [INFO][4741] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.210 [INFO][4741] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.210 [INFO][4741] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.226 [WARNING][4741] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.226 [INFO][4741] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.229 [INFO][4741] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:38.236096 containerd[1516]: 2026-03-04 01:19:38.231 [INFO][4733] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:19:38.242734 containerd[1516]: time="2026-03-04T01:19:38.236173340Z" level=info msg="TearDown network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" successfully" Mar 4 01:19:38.242734 containerd[1516]: time="2026-03-04T01:19:38.236222860Z" level=info msg="StopPodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" returns successfully" Mar 4 01:19:38.238094 systemd[1]: run-containerd-runc-k8s.io-b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0-runc.4pPaiV.mount: Deactivated successfully. Mar 4 01:19:38.244102 systemd[1]: run-netns-cni\x2d803270bf\x2d16c0\x2dfb27\x2d35be\x2d34d2a6c5ce83.mount: Deactivated successfully. 
Mar 4 01:19:38.247335 containerd[1516]: time="2026-03-04T01:19:38.246890130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-4hclm,Uid:e815ce3e-bc88-40d3-b47c-e2c0c6843ef4,Namespace:calico-system,Attempt:1,}" Mar 4 01:19:38.471400 systemd-networkd[1434]: calieca23766474: Link UP Mar 4 01:19:38.471779 systemd-networkd[1434]: calieca23766474: Gained carrier Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.338 [INFO][4749] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0 calico-apiserver-5d996b5b5- calico-system e815ce3e-bc88-40d3-b47c-e2c0c6843ef4 998 0 2026-03-04 01:18:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d996b5b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com calico-apiserver-5d996b5b5-4hclm eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calieca23766474 [] [] }} ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.338 [INFO][4749] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.389 [INFO][4762] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" HandleID="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.406 [INFO][4762] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" HandleID="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f7f60), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"calico-apiserver-5d996b5b5-4hclm", "timestamp":"2026-03-04 01:19:38.389024134 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002b7080)} Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.406 [INFO][4762] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.406 [INFO][4762] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.406 [INFO][4762] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.412 [INFO][4762] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.424 [INFO][4762] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.433 [INFO][4762] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.435 [INFO][4762] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.441 [INFO][4762] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.441 [INFO][4762] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.445 [INFO][4762] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6 Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.451 [INFO][4762] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.460 [INFO][4762] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.195/26] block=192.168.45.192/26 handle="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.460 [INFO][4762] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.195/26] handle="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.460 [INFO][4762] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:38.498682 containerd[1516]: 2026-03-04 01:19:38.460 [INFO][4762] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.195/26] IPv6=[] ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" HandleID="k8s-pod-network.dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.502330 containerd[1516]: 2026-03-04 01:19:38.464 [INFO][4749] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4", ResourceVersion:"998", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-5d996b5b5-4hclm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calieca23766474", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:38.502330 containerd[1516]: 2026-03-04 01:19:38.464 [INFO][4749] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.195/32] ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.502330 containerd[1516]: 2026-03-04 01:19:38.464 [INFO][4749] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieca23766474 ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.502330 containerd[1516]: 2026-03-04 01:19:38.471 [INFO][4749] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" 
Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.502330 containerd[1516]: 2026-03-04 01:19:38.475 [INFO][4749] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4", ResourceVersion:"998", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6", Pod:"calico-apiserver-5d996b5b5-4hclm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calieca23766474", 
MAC:"5e:51:13:ea:e4:29", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:38.502330 containerd[1516]: 2026-03-04 01:19:38.492 [INFO][4749] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-4hclm" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:19:38.549455 containerd[1516]: time="2026-03-04T01:19:38.548719552Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:38.549455 containerd[1516]: time="2026-03-04T01:19:38.548819348Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:38.549455 containerd[1516]: time="2026-03-04T01:19:38.548836930Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:38.549455 containerd[1516]: time="2026-03-04T01:19:38.549092678Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:38.602016 systemd[1]: Started cri-containerd-dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6.scope - libcontainer container dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6. 
Mar 4 01:19:38.679764 containerd[1516]: time="2026-03-04T01:19:38.679610185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-4hclm,Uid:e815ce3e-bc88-40d3-b47c-e2c0c6843ef4,Namespace:calico-system,Attempt:1,} returns sandbox id \"dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6\"" Mar 4 01:19:38.682821 containerd[1516]: time="2026-03-04T01:19:38.682634150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 4 01:19:38.784101 kubelet[2701]: I0304 01:19:38.783659 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-ghd5d" podStartSLOduration=61.783575192 podStartE2EDuration="1m1.783575192s" podCreationTimestamp="2026-03-04 01:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 01:19:38.781175414 +0000 UTC m=+67.016435646" watchObservedRunningTime="2026-03-04 01:19:38.783575192 +0000 UTC m=+67.018835423" Mar 4 01:19:39.042663 containerd[1516]: time="2026-03-04T01:19:39.042272787Z" level=info msg="StopPodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\"" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.139 [INFO][4842] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.142 [INFO][4842] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" iface="eth0" netns="/var/run/netns/cni-4ae96aa8-2102-aad8-5f6b-4ba262627000" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.144 [INFO][4842] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" iface="eth0" netns="/var/run/netns/cni-4ae96aa8-2102-aad8-5f6b-4ba262627000" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.144 [INFO][4842] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" iface="eth0" netns="/var/run/netns/cni-4ae96aa8-2102-aad8-5f6b-4ba262627000" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.144 [INFO][4842] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.144 [INFO][4842] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.196 [INFO][4850] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.196 [INFO][4850] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.196 [INFO][4850] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.207 [WARNING][4850] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.208 [INFO][4850] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.210 [INFO][4850] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:39.214686 containerd[1516]: 2026-03-04 01:19:39.212 [INFO][4842] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:19:39.215740 containerd[1516]: time="2026-03-04T01:19:39.215547391Z" level=info msg="TearDown network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" successfully" Mar 4 01:19:39.215740 containerd[1516]: time="2026-03-04T01:19:39.215599789Z" level=info msg="StopPodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" returns successfully" Mar 4 01:19:39.219952 containerd[1516]: time="2026-03-04T01:19:39.219425321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-dg6p5,Uid:58ea317b-5eaa-44c7-a296-9b69e1cee2ab,Namespace:kube-system,Attempt:1,}" Mar 4 01:19:39.242021 systemd[1]: run-netns-cni\x2d4ae96aa8\x2d2102\x2daad8\x2d5f6b\x2d4ba262627000.mount: Deactivated successfully. 
Mar 4 01:19:39.292274 systemd-networkd[1434]: cali3233f0f9962: Gained IPv6LL Mar 4 01:19:39.420342 systemd-networkd[1434]: calie91308b0b81: Link UP Mar 4 01:19:39.422056 systemd-networkd[1434]: calie91308b0b81: Gained carrier Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.296 [INFO][4857] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0 coredns-66bc5c9577- kube-system 58ea317b-5eaa-44c7-a296-9b69e1cee2ab 1014 0 2026-03-04 01:18:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com coredns-66bc5c9577-dg6p5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie91308b0b81 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.297 [INFO][4857] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.345 [INFO][4871] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" HandleID="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 
01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.355 [INFO][4871] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" HandleID="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277dd0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"coredns-66bc5c9577-dg6p5", "timestamp":"2026-03-04 01:19:39.345850774 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000355b80)} Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.355 [INFO][4871] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.355 [INFO][4871] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.355 [INFO][4871] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.359 [INFO][4871] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.369 [INFO][4871] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.377 [INFO][4871] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.380 [INFO][4871] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.384 [INFO][4871] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.384 [INFO][4871] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.387 [INFO][4871] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.397 [INFO][4871] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.409 [INFO][4871] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.196/26] block=192.168.45.192/26 handle="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.409 [INFO][4871] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.196/26] handle="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.409 [INFO][4871] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:39.453309 containerd[1516]: 2026-03-04 01:19:39.409 [INFO][4871] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.196/26] IPv6=[] ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" HandleID="k8s-pod-network.58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.457151 containerd[1516]: 2026-03-04 01:19:39.412 [INFO][4857] cni-plugin/k8s.go 418: Populated endpoint ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"58ea317b-5eaa-44c7-a296-9b69e1cee2ab", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"coredns-66bc5c9577-dg6p5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie91308b0b81", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:39.457151 containerd[1516]: 2026-03-04 01:19:39.412 [INFO][4857] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.196/32] ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.457151 containerd[1516]: 2026-03-04 01:19:39.412 [INFO][4857] cni-plugin/dataplane_linux.go 69: Setting the 
host side veth name to calie91308b0b81 ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.457151 containerd[1516]: 2026-03-04 01:19:39.423 [INFO][4857] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.457151 containerd[1516]: 2026-03-04 01:19:39.429 [INFO][4857] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"58ea317b-5eaa-44c7-a296-9b69e1cee2ab", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", 
ContainerID:"58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f", Pod:"coredns-66bc5c9577-dg6p5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie91308b0b81", MAC:"c2:99:93:30:8a:e2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:39.458574 containerd[1516]: 2026-03-04 01:19:39.449 [INFO][4857] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f" Namespace="kube-system" Pod="coredns-66bc5c9577-dg6p5" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:19:39.500115 containerd[1516]: time="2026-03-04T01:19:39.499189577Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:39.500115 containerd[1516]: time="2026-03-04T01:19:39.499282253Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:39.500115 containerd[1516]: time="2026-03-04T01:19:39.499380434Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:39.501130 containerd[1516]: time="2026-03-04T01:19:39.499510254Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:39.547340 systemd[1]: Started cri-containerd-58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f.scope - libcontainer container 58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f. Mar 4 01:19:39.624336 containerd[1516]: time="2026-03-04T01:19:39.624287463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-dg6p5,Uid:58ea317b-5eaa-44c7-a296-9b69e1cee2ab,Namespace:kube-system,Attempt:1,} returns sandbox id \"58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f\"" Mar 4 01:19:39.644445 containerd[1516]: time="2026-03-04T01:19:39.644388242Z" level=info msg="CreateContainer within sandbox \"58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 4 01:19:39.680658 containerd[1516]: time="2026-03-04T01:19:39.680401270Z" level=info msg="CreateContainer within sandbox \"58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"134ce6127f416de488019be5432ec799a551dc51082e2a7c697799c734e6cf23\"" Mar 4 01:19:39.682302 containerd[1516]: time="2026-03-04T01:19:39.682169704Z" level=info msg="StartContainer for \"134ce6127f416de488019be5432ec799a551dc51082e2a7c697799c734e6cf23\"" Mar 4 01:19:39.759347 systemd[1]: Started cri-containerd-134ce6127f416de488019be5432ec799a551dc51082e2a7c697799c734e6cf23.scope - libcontainer container 
134ce6127f416de488019be5432ec799a551dc51082e2a7c697799c734e6cf23. Mar 4 01:19:39.803804 containerd[1516]: time="2026-03-04T01:19:39.801788604Z" level=info msg="StartContainer for \"134ce6127f416de488019be5432ec799a551dc51082e2a7c697799c734e6cf23\" returns successfully" Mar 4 01:19:39.996280 systemd-networkd[1434]: calieca23766474: Gained IPv6LL Mar 4 01:19:40.042504 containerd[1516]: time="2026-03-04T01:19:40.042379355Z" level=info msg="StopPodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\"" Mar 4 01:19:40.046885 containerd[1516]: time="2026-03-04T01:19:40.046842712Z" level=info msg="StopPodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\"" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.165 [INFO][5000] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.166 [INFO][5000] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" iface="eth0" netns="/var/run/netns/cni-8f145cea-e38e-8914-fcb5-260062e1eb4d" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.182 [INFO][5000] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" iface="eth0" netns="/var/run/netns/cni-8f145cea-e38e-8914-fcb5-260062e1eb4d" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.186 [INFO][5000] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" iface="eth0" netns="/var/run/netns/cni-8f145cea-e38e-8914-fcb5-260062e1eb4d" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.186 [INFO][5000] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.186 [INFO][5000] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.280 [INFO][5019] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.280 [INFO][5019] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.281 [INFO][5019] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.296 [WARNING][5019] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.296 [INFO][5019] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.300 [INFO][5019] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:40.311143 containerd[1516]: 2026-03-04 01:19:40.305 [INFO][5000] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Mar 4 01:19:40.315452 containerd[1516]: time="2026-03-04T01:19:40.311528516Z" level=info msg="TearDown network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" successfully" Mar 4 01:19:40.315452 containerd[1516]: time="2026-03-04T01:19:40.311571090Z" level=info msg="StopPodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" returns successfully" Mar 4 01:19:40.316967 systemd[1]: run-netns-cni\x2d8f145cea\x2de38e\x2d8914\x2dfcb5\x2d260062e1eb4d.mount: Deactivated successfully. 
Mar 4 01:19:40.321373 containerd[1516]: time="2026-03-04T01:19:40.320896623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-hwgzw,Uid:0036b8ad-7c49-4b71-addf-d1386c2532e8,Namespace:calico-system,Attempt:1,}" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.166 [INFO][4999] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.166 [INFO][4999] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" iface="eth0" netns="/var/run/netns/cni-bbbb4796-8fe2-d955-8ed3-03a5f1cf1495" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.166 [INFO][4999] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" iface="eth0" netns="/var/run/netns/cni-bbbb4796-8fe2-d955-8ed3-03a5f1cf1495" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.166 [INFO][4999] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" iface="eth0" netns="/var/run/netns/cni-bbbb4796-8fe2-d955-8ed3-03a5f1cf1495" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.166 [INFO][4999] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.167 [INFO][4999] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.280 [INFO][5013] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.282 [INFO][5013] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.300 [INFO][5013] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.321 [WARNING][5013] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.321 [INFO][5013] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.324 [INFO][5013] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:40.346420 containerd[1516]: 2026-03-04 01:19:40.326 [INFO][4999] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:19:40.350733 containerd[1516]: time="2026-03-04T01:19:40.350416372Z" level=info msg="TearDown network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" successfully" Mar 4 01:19:40.350733 containerd[1516]: time="2026-03-04T01:19:40.350454279Z" level=info msg="StopPodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" returns successfully" Mar 4 01:19:40.356755 systemd[1]: run-netns-cni\x2dbbbb4796\x2d8fe2\x2dd955\x2d8ed3\x2d03a5f1cf1495.mount: Deactivated successfully. 
Mar 4 01:19:40.388396 containerd[1516]: time="2026-03-04T01:19:40.388344000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-c257w,Uid:0aba09a3-09ff-4e29-a406-0f9932fc94f6,Namespace:calico-system,Attempt:1,}" Mar 4 01:19:40.714717 systemd-networkd[1434]: calib4c44dadcf8: Link UP Mar 4 01:19:40.716332 systemd-networkd[1434]: calib4c44dadcf8: Gained carrier Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.539 [INFO][5036] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0 calico-apiserver-5d996b5b5- calico-system 0aba09a3-09ff-4e29-a406-0f9932fc94f6 1027 0 2026-03-04 01:18:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d996b5b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com calico-apiserver-5d996b5b5-c257w eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calib4c44dadcf8 [] [] }} ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.540 [INFO][5036] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.610 [INFO][5050] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" HandleID="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.638 [INFO][5050] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" HandleID="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277890), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"calico-apiserver-5d996b5b5-c257w", "timestamp":"2026-03-04 01:19:40.610536941 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001126e0)} Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.638 [INFO][5050] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.638 [INFO][5050] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.639 [INFO][5050] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.645 [INFO][5050] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.662 [INFO][5050] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.670 [INFO][5050] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.674 [INFO][5050] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.680 [INFO][5050] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.680 [INFO][5050] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.683 [INFO][5050] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.689 [INFO][5050] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.702 [INFO][5050] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.197/26] block=192.168.45.192/26 handle="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.702 [INFO][5050] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.197/26] handle="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.703 [INFO][5050] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:40.746565 containerd[1516]: 2026-03-04 01:19:40.704 [INFO][5050] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.197/26] IPv6=[] ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" HandleID="k8s-pod-network.b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.749672 containerd[1516]: 2026-03-04 01:19:40.708 [INFO][5036] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"0aba09a3-09ff-4e29-a406-0f9932fc94f6", ResourceVersion:"1027", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-5d996b5b5-c257w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib4c44dadcf8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:40.749672 containerd[1516]: 2026-03-04 01:19:40.709 [INFO][5036] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.197/32] ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.749672 containerd[1516]: 2026-03-04 01:19:40.709 [INFO][5036] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib4c44dadcf8 ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.749672 containerd[1516]: 2026-03-04 01:19:40.715 [INFO][5036] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" 
Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.749672 containerd[1516]: 2026-03-04 01:19:40.717 [INFO][5036] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"0aba09a3-09ff-4e29-a406-0f9932fc94f6", ResourceVersion:"1027", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b", Pod:"calico-apiserver-5d996b5b5-c257w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib4c44dadcf8", 
MAC:"42:1c:e7:5c:be:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:40.749672 containerd[1516]: 2026-03-04 01:19:40.736 [INFO][5036] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b" Namespace="calico-system" Pod="calico-apiserver-5d996b5b5-c257w" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:19:40.876314 containerd[1516]: time="2026-03-04T01:19:40.874972389Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:40.876314 containerd[1516]: time="2026-03-04T01:19:40.875111855Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:40.876314 containerd[1516]: time="2026-03-04T01:19:40.875135636Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:40.876314 containerd[1516]: time="2026-03-04T01:19:40.875351339Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:40.896040 kubelet[2701]: I0304 01:19:40.894173 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-dg6p5" podStartSLOduration=63.89413078 podStartE2EDuration="1m3.89413078s" podCreationTimestamp="2026-03-04 01:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-04 01:19:40.885013544 +0000 UTC m=+69.120273775" watchObservedRunningTime="2026-03-04 01:19:40.89413078 +0000 UTC m=+69.129391020" Mar 4 01:19:40.946835 systemd[1]: Started cri-containerd-b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b.scope - libcontainer container b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b. Mar 4 01:19:40.977807 systemd-networkd[1434]: cali81eae1f738e: Link UP Mar 4 01:19:40.984641 systemd-networkd[1434]: cali81eae1f738e: Gained carrier Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.549 [INFO][5031] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0 goldmane-cccfbd5cf- calico-system 0036b8ad-7c49-4b71-addf-d1386c2532e8 1026 0 2026-03-04 01:18:55 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com goldmane-cccfbd5cf-hwgzw eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali81eae1f738e [] [] }} ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.549 
[INFO][5031] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.646 [INFO][5055] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" HandleID="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.660 [INFO][5055] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" HandleID="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fc40), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"goldmane-cccfbd5cf-hwgzw", "timestamp":"2026-03-04 01:19:40.646226018 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00034e580)} Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.661 [INFO][5055] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.703 [INFO][5055] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.703 [INFO][5055] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.753 [INFO][5055] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.800 [INFO][5055] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.825 [INFO][5055] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.844 [INFO][5055] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.868 [INFO][5055] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.868 [INFO][5055] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.882 [INFO][5055] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.904 [INFO][5055] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.943 [INFO][5055] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.198/26] block=192.168.45.192/26 handle="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.944 [INFO][5055] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.198/26] handle="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.944 [INFO][5055] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:41.035025 containerd[1516]: 2026-03-04 01:19:40.944 [INFO][5055] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.198/26] IPv6=[] ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" HandleID="k8s-pod-network.1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.037386 containerd[1516]: 2026-03-04 01:19:40.954 [INFO][5031] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0036b8ad-7c49-4b71-addf-d1386c2532e8", ResourceVersion:"1026", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-cccfbd5cf-hwgzw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.45.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali81eae1f738e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:41.037386 containerd[1516]: 2026-03-04 01:19:40.956 [INFO][5031] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.198/32] ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.037386 containerd[1516]: 2026-03-04 01:19:40.956 [INFO][5031] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali81eae1f738e ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.037386 containerd[1516]: 2026-03-04 01:19:40.994 [INFO][5031] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.037386 containerd[1516]: 2026-03-04 01:19:41.002 [INFO][5031] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0036b8ad-7c49-4b71-addf-d1386c2532e8", ResourceVersion:"1026", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d", Pod:"goldmane-cccfbd5cf-hwgzw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.45.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali81eae1f738e", MAC:"36:c1:c2:02:6b:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:41.037386 containerd[1516]: 2026-03-04 01:19:41.028 [INFO][5031] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d" Namespace="calico-system" Pod="goldmane-cccfbd5cf-hwgzw" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0" Mar 4 01:19:41.042329 containerd[1516]: time="2026-03-04T01:19:41.041376787Z" level=info msg="StopPodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\"" Mar 4 01:19:41.042983 containerd[1516]: time="2026-03-04T01:19:41.042953378Z" level=info msg="StopPodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\"" Mar 4 01:19:41.163155 containerd[1516]: time="2026-03-04T01:19:41.162262700Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:41.163155 containerd[1516]: time="2026-03-04T01:19:41.162361283Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:41.163155 containerd[1516]: time="2026-03-04T01:19:41.162380060Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:41.163155 containerd[1516]: time="2026-03-04T01:19:41.162653777Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:41.212597 systemd-networkd[1434]: calie91308b0b81: Gained IPv6LL Mar 4 01:19:41.305343 systemd[1]: Started cri-containerd-1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d.scope - libcontainer container 1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d. 
Mar 4 01:19:41.376530 containerd[1516]: time="2026-03-04T01:19:41.376422537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d996b5b5-c257w,Uid:0aba09a3-09ff-4e29-a406-0f9932fc94f6,Namespace:calico-system,Attempt:1,} returns sandbox id \"b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b\"" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.285 [INFO][5156] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.286 [INFO][5156] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" iface="eth0" netns="/var/run/netns/cni-ab01dee8-2e4c-0548-3b3e-a337ebe9eff2" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.288 [INFO][5156] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" iface="eth0" netns="/var/run/netns/cni-ab01dee8-2e4c-0548-3b3e-a337ebe9eff2" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.289 [INFO][5156] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" iface="eth0" netns="/var/run/netns/cni-ab01dee8-2e4c-0548-3b3e-a337ebe9eff2" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.289 [INFO][5156] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.289 [INFO][5156] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.445 [INFO][5194] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.447 [INFO][5194] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.447 [INFO][5194] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.468 [WARNING][5194] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.468 [INFO][5194] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.473 [INFO][5194] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:41.481618 containerd[1516]: 2026-03-04 01:19:41.477 [INFO][5156] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:19:41.484756 containerd[1516]: time="2026-03-04T01:19:41.482345172Z" level=info msg="TearDown network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" successfully" Mar 4 01:19:41.484756 containerd[1516]: time="2026-03-04T01:19:41.482387250Z" level=info msg="StopPodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" returns successfully" Mar 4 01:19:41.491809 systemd[1]: run-netns-cni\x2dab01dee8\x2d2e4c\x2d0548\x2d3b3e\x2da337ebe9eff2.mount: Deactivated successfully. 
Mar 4 01:19:41.493380 containerd[1516]: time="2026-03-04T01:19:41.492232798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fnbg8,Uid:621d6fb0-0e39-428b-8e9a-8e8c65b0d05c,Namespace:calico-system,Attempt:1,}" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.397 [INFO][5143] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.401 [INFO][5143] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" iface="eth0" netns="/var/run/netns/cni-9741698b-648b-b1c5-6b8c-193b0ff15307" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.401 [INFO][5143] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" iface="eth0" netns="/var/run/netns/cni-9741698b-648b-b1c5-6b8c-193b0ff15307" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.402 [INFO][5143] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" iface="eth0" netns="/var/run/netns/cni-9741698b-648b-b1c5-6b8c-193b0ff15307" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.402 [INFO][5143] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.402 [INFO][5143] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.549 [INFO][5218] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.554 [INFO][5218] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.555 [INFO][5218] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.569 [WARNING][5218] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.569 [INFO][5218] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.574 [INFO][5218] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:41.584816 containerd[1516]: 2026-03-04 01:19:41.578 [INFO][5143] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:19:41.586518 containerd[1516]: time="2026-03-04T01:19:41.585172700Z" level=info msg="TearDown network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" successfully" Mar 4 01:19:41.586518 containerd[1516]: time="2026-03-04T01:19:41.585222494Z" level=info msg="StopPodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" returns successfully" Mar 4 01:19:41.589544 containerd[1516]: time="2026-03-04T01:19:41.589201007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58bc6545dc-jbw6r,Uid:11453fdd-1482-4ede-8950-e97c22d85781,Namespace:calico-system,Attempt:1,}" Mar 4 01:19:41.596566 containerd[1516]: time="2026-03-04T01:19:41.596514959Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-hwgzw,Uid:0036b8ad-7c49-4b71-addf-d1386c2532e8,Namespace:calico-system,Attempt:1,} returns sandbox id 
\"1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d\"" Mar 4 01:19:41.910590 systemd-networkd[1434]: cali968ca8cd64c: Link UP Mar 4 01:19:41.911712 systemd-networkd[1434]: cali968ca8cd64c: Gained carrier Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.677 [INFO][5230] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0 csi-node-driver- calico-system 621d6fb0-0e39-428b-8e9a-8e8c65b0d05c 1046 0 2026-03-04 01:18:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com csi-node-driver-fnbg8 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali968ca8cd64c [] [] }} ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.678 [INFO][5230] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.768 [INFO][5262] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" HandleID="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.948246 
containerd[1516]: 2026-03-04 01:19:41.800 [INFO][5262] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" HandleID="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fb910), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"csi-node-driver-fnbg8", "timestamp":"2026-03-04 01:19:41.768910089 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003771e0)} Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.800 [INFO][5262] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.800 [INFO][5262] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.800 [INFO][5262] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.805 [INFO][5262] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.834 [INFO][5262] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.857 [INFO][5262] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.861 [INFO][5262] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.866 [INFO][5262] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.866 [INFO][5262] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.868 [INFO][5262] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.876 [INFO][5262] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.887 [INFO][5262] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.199/26] block=192.168.45.192/26 handle="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.887 [INFO][5262] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.199/26] handle="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.890 [INFO][5262] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:41.948246 containerd[1516]: 2026-03-04 01:19:41.890 [INFO][5262] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.199/26] IPv6=[] ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" HandleID="k8s-pod-network.aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.951975 containerd[1516]: 2026-03-04 01:19:41.892 [INFO][5230] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-fnbg8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.45.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali968ca8cd64c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:41.951975 containerd[1516]: 2026-03-04 01:19:41.893 [INFO][5230] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.199/32] ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.951975 containerd[1516]: 2026-03-04 01:19:41.893 [INFO][5230] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali968ca8cd64c ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:41.951975 containerd[1516]: 2026-03-04 01:19:41.909 [INFO][5230] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 
4 01:19:41.951975 containerd[1516]: 2026-03-04 01:19:41.910 [INFO][5230] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c", Pod:"csi-node-driver-fnbg8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.45.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali968ca8cd64c", MAC:"8a:3f:0e:f0:96:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:41.951975 
containerd[1516]: 2026-03-04 01:19:41.945 [INFO][5230] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c" Namespace="calico-system" Pod="csi-node-driver-fnbg8" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:19:42.005692 containerd[1516]: time="2026-03-04T01:19:42.005319060Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:42.005692 containerd[1516]: time="2026-03-04T01:19:42.005393550Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:42.005692 containerd[1516]: time="2026-03-04T01:19:42.005410307Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:42.005692 containerd[1516]: time="2026-03-04T01:19:42.005569906Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:42.044364 systemd[1]: Started cri-containerd-aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c.scope - libcontainer container aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c. 
Mar 4 01:19:42.078167 systemd-networkd[1434]: cali70ef227e547: Link UP Mar 4 01:19:42.083650 systemd-networkd[1434]: cali70ef227e547: Gained carrier Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.706 [INFO][5248] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0 calico-kube-controllers-58bc6545dc- calico-system 11453fdd-1482-4ede-8950-e97c22d85781 1048 0 2026-03-04 01:18:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58bc6545dc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-8wmcq.gb1.brightbox.com calico-kube-controllers-58bc6545dc-jbw6r eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali70ef227e547 [] [] }} ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.706 [INFO][5248] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.784 [INFO][5267] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" HandleID="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" 
Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.811 [INFO][5267] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" HandleID="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277af0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-8wmcq.gb1.brightbox.com", "pod":"calico-kube-controllers-58bc6545dc-jbw6r", "timestamp":"2026-03-04 01:19:41.784336416 +0000 UTC"}, Hostname:"srv-8wmcq.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001122c0)} Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.811 [INFO][5267] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.888 [INFO][5267] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.888 [INFO][5267] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-8wmcq.gb1.brightbox.com' Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.914 [INFO][5267] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.938 [INFO][5267] ipam/ipam.go 409: Looking up existing affinities for host host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.959 [INFO][5267] ipam/ipam.go 526: Trying affinity for 192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.966 [INFO][5267] ipam/ipam.go 160: Attempting to load block cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.981 [INFO][5267] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.45.192/26 host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.985 [INFO][5267] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.45.192/26 handle="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:41.992 [INFO][5267] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468 Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:42.004 [INFO][5267] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.45.192/26 handle="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:42.024 [INFO][5267] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.45.200/26] block=192.168.45.192/26 handle="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:42.027 [INFO][5267] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.45.200/26] handle="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" host="srv-8wmcq.gb1.brightbox.com" Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:42.029 [INFO][5267] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:19:42.137634 containerd[1516]: 2026-03-04 01:19:42.030 [INFO][5267] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.45.200/26] IPv6=[] ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" HandleID="k8s-pod-network.a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.140834 containerd[1516]: 2026-03-04 01:19:42.063 [INFO][5248] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0", GenerateName:"calico-kube-controllers-58bc6545dc-", Namespace:"calico-system", SelfLink:"", UID:"11453fdd-1482-4ede-8950-e97c22d85781", ResourceVersion:"1048", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58bc6545dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-58bc6545dc-jbw6r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.45.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali70ef227e547", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:42.140834 containerd[1516]: 2026-03-04 01:19:42.063 [INFO][5248] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.45.200/32] ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.140834 containerd[1516]: 2026-03-04 01:19:42.063 [INFO][5248] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali70ef227e547 ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.140834 containerd[1516]: 2026-03-04 01:19:42.083 [INFO][5248] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.140834 containerd[1516]: 2026-03-04 01:19:42.086 [INFO][5248] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0", GenerateName:"calico-kube-controllers-58bc6545dc-", Namespace:"calico-system", SelfLink:"", UID:"11453fdd-1482-4ede-8950-e97c22d85781", ResourceVersion:"1048", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58bc6545dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468", Pod:"calico-kube-controllers-58bc6545dc-jbw6r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.45.200/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali70ef227e547", MAC:"ea:50:95:f1:20:d4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:19:42.140834 containerd[1516]: 2026-03-04 01:19:42.132 [INFO][5248] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468" Namespace="calico-system" Pod="calico-kube-controllers-58bc6545dc-jbw6r" WorkloadEndpoint="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:19:42.198557 containerd[1516]: time="2026-03-04T01:19:42.198026671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fnbg8,Uid:621d6fb0-0e39-428b-8e9a-8e8c65b0d05c,Namespace:calico-system,Attempt:1,} returns sandbox id \"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c\"" Mar 4 01:19:42.247902 systemd[1]: run-netns-cni\x2d9741698b\x2d648b\x2db1c5\x2d6b8c\x2d193b0ff15307.mount: Deactivated successfully. Mar 4 01:19:42.266720 containerd[1516]: time="2026-03-04T01:19:42.266471574Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 4 01:19:42.266720 containerd[1516]: time="2026-03-04T01:19:42.266648697Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 4 01:19:42.266720 containerd[1516]: time="2026-03-04T01:19:42.266667215Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:42.267278 containerd[1516]: time="2026-03-04T01:19:42.266814458Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 4 01:19:42.360420 systemd[1]: run-containerd-runc-k8s.io-a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468-runc.ECG0De.mount: Deactivated successfully. Mar 4 01:19:42.377379 systemd[1]: Started cri-containerd-a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468.scope - libcontainer container a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468. Mar 4 01:19:42.428312 systemd-networkd[1434]: cali81eae1f738e: Gained IPv6LL Mar 4 01:19:42.507904 containerd[1516]: time="2026-03-04T01:19:42.507401705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58bc6545dc-jbw6r,Uid:11453fdd-1482-4ede-8950-e97c22d85781,Namespace:calico-system,Attempt:1,} returns sandbox id \"a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468\"" Mar 4 01:19:42.750130 systemd-networkd[1434]: calib4c44dadcf8: Gained IPv6LL Mar 4 01:19:43.580846 systemd-networkd[1434]: cali70ef227e547: Gained IPv6LL Mar 4 01:19:43.836659 systemd-networkd[1434]: cali968ca8cd64c: Gained IPv6LL Mar 4 01:19:44.457136 containerd[1516]: time="2026-03-04T01:19:44.456564710Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:44.459086 containerd[1516]: time="2026-03-04T01:19:44.458800445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 4 01:19:44.461131 containerd[1516]: time="2026-03-04T01:19:44.460656224Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:44.468727 containerd[1516]: time="2026-03-04T01:19:44.468674462Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:44.473791 containerd[1516]: time="2026-03-04T01:19:44.473721082Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 5.791027386s" Mar 4 01:19:44.474005 containerd[1516]: time="2026-03-04T01:19:44.473968956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 4 01:19:44.478931 containerd[1516]: time="2026-03-04T01:19:44.477932741Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 4 01:19:44.500390 containerd[1516]: time="2026-03-04T01:19:44.500332207Z" level=info msg="CreateContainer within sandbox \"dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 4 01:19:44.521364 containerd[1516]: time="2026-03-04T01:19:44.519727118Z" level=info msg="CreateContainer within sandbox \"dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"53678747cf86876feb7301a7f984d28274e772a0ef5aaf99fbad7580a568579d\"" Mar 4 01:19:44.524832 containerd[1516]: time="2026-03-04T01:19:44.524266697Z" level=info msg="StartContainer for \"53678747cf86876feb7301a7f984d28274e772a0ef5aaf99fbad7580a568579d\"" Mar 4 01:19:44.643063 systemd[1]: run-containerd-runc-k8s.io-53678747cf86876feb7301a7f984d28274e772a0ef5aaf99fbad7580a568579d-runc.rZTosl.mount: Deactivated successfully. 
Mar 4 01:19:44.655308 systemd[1]: Started cri-containerd-53678747cf86876feb7301a7f984d28274e772a0ef5aaf99fbad7580a568579d.scope - libcontainer container 53678747cf86876feb7301a7f984d28274e772a0ef5aaf99fbad7580a568579d. Mar 4 01:19:44.731928 containerd[1516]: time="2026-03-04T01:19:44.730920588Z" level=info msg="StartContainer for \"53678747cf86876feb7301a7f984d28274e772a0ef5aaf99fbad7580a568579d\" returns successfully" Mar 4 01:19:44.832945 containerd[1516]: time="2026-03-04T01:19:44.832888386Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:44.833851 containerd[1516]: time="2026-03-04T01:19:44.833653476Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 4 01:19:44.839923 containerd[1516]: time="2026-03-04T01:19:44.839854690Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 361.863322ms" Mar 4 01:19:44.840184 containerd[1516]: time="2026-03-04T01:19:44.839939303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 4 01:19:44.841785 containerd[1516]: time="2026-03-04T01:19:44.841535504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 4 01:19:44.846259 containerd[1516]: time="2026-03-04T01:19:44.846224890Z" level=info msg="CreateContainer within sandbox \"b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 4 01:19:44.956818 kubelet[2701]: I0304 
01:19:44.953990 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5d996b5b5-4hclm" podStartSLOduration=44.102576736 podStartE2EDuration="49.898024952s" podCreationTimestamp="2026-03-04 01:18:55 +0000 UTC" firstStartedPulling="2026-03-04 01:19:38.682297353 +0000 UTC m=+66.917557559" lastFinishedPulling="2026-03-04 01:19:44.477745569 +0000 UTC m=+72.713005775" observedRunningTime="2026-03-04 01:19:44.892404243 +0000 UTC m=+73.127664468" watchObservedRunningTime="2026-03-04 01:19:44.898024952 +0000 UTC m=+73.133285166" Mar 4 01:19:45.010168 containerd[1516]: time="2026-03-04T01:19:45.009852316Z" level=info msg="CreateContainer within sandbox \"b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fc40557653bc5a313987e05630e4bb9c1cfbfefd180131a5d90269bb367f6938\"" Mar 4 01:19:45.012150 containerd[1516]: time="2026-03-04T01:19:45.011633001Z" level=info msg="StartContainer for \"fc40557653bc5a313987e05630e4bb9c1cfbfefd180131a5d90269bb367f6938\"" Mar 4 01:19:45.083285 systemd[1]: Started cri-containerd-fc40557653bc5a313987e05630e4bb9c1cfbfefd180131a5d90269bb367f6938.scope - libcontainer container fc40557653bc5a313987e05630e4bb9c1cfbfefd180131a5d90269bb367f6938. 
Mar 4 01:19:45.192172 containerd[1516]: time="2026-03-04T01:19:45.190311635Z" level=info msg="StartContainer for \"fc40557653bc5a313987e05630e4bb9c1cfbfefd180131a5d90269bb367f6938\" returns successfully" Mar 4 01:19:45.928088 kubelet[2701]: I0304 01:19:45.912356 2701 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 01:19:45.989034 kubelet[2701]: I0304 01:19:45.988697 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-5d996b5b5-c257w" podStartSLOduration=47.529281932 podStartE2EDuration="50.988661345s" podCreationTimestamp="2026-03-04 01:18:55 +0000 UTC" firstStartedPulling="2026-03-04 01:19:41.381934595 +0000 UTC m=+69.617194801" lastFinishedPulling="2026-03-04 01:19:44.841313995 +0000 UTC m=+73.076574214" observedRunningTime="2026-03-04 01:19:45.966657769 +0000 UTC m=+74.201918000" watchObservedRunningTime="2026-03-04 01:19:45.988661345 +0000 UTC m=+74.223921594" Mar 4 01:19:46.919086 kubelet[2701]: I0304 01:19:46.918961 2701 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 01:19:48.477506 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount624682763.mount: Deactivated successfully. 
Mar 4 01:19:49.378698 containerd[1516]: time="2026-03-04T01:19:49.378377841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:49.381103 containerd[1516]: time="2026-03-04T01:19:49.381005810Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 4 01:19:49.382167 containerd[1516]: time="2026-03-04T01:19:49.382098264Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:49.410808 containerd[1516]: time="2026-03-04T01:19:49.410736648Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:49.444750 containerd[1516]: time="2026-03-04T01:19:49.444682045Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 4.603066651s" Mar 4 01:19:49.445386 containerd[1516]: time="2026-03-04T01:19:49.445019229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 4 01:19:49.531080 containerd[1516]: time="2026-03-04T01:19:49.530668936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 4 01:19:49.661439 containerd[1516]: time="2026-03-04T01:19:49.661293460Z" level=info msg="CreateContainer within sandbox \"1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d\" for 
container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 4 01:19:49.696377 containerd[1516]: time="2026-03-04T01:19:49.696322677Z" level=info msg="CreateContainer within sandbox \"1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"56ac03c53b040553f19e8ab24ee84617f395094c502dbfc667a0d74e6e49fded\"" Mar 4 01:19:49.698672 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2093241174.mount: Deactivated successfully. Mar 4 01:19:49.701207 containerd[1516]: time="2026-03-04T01:19:49.700455587Z" level=info msg="StartContainer for \"56ac03c53b040553f19e8ab24ee84617f395094c502dbfc667a0d74e6e49fded\"" Mar 4 01:19:49.816974 systemd[1]: run-containerd-runc-k8s.io-56ac03c53b040553f19e8ab24ee84617f395094c502dbfc667a0d74e6e49fded-runc.j9tzo3.mount: Deactivated successfully. Mar 4 01:19:49.830276 systemd[1]: Started cri-containerd-56ac03c53b040553f19e8ab24ee84617f395094c502dbfc667a0d74e6e49fded.scope - libcontainer container 56ac03c53b040553f19e8ab24ee84617f395094c502dbfc667a0d74e6e49fded. 
Mar 4 01:19:49.944078 containerd[1516]: time="2026-03-04T01:19:49.943942023Z" level=info msg="StartContainer for \"56ac03c53b040553f19e8ab24ee84617f395094c502dbfc667a0d74e6e49fded\" returns successfully" Mar 4 01:19:51.728468 containerd[1516]: time="2026-03-04T01:19:51.728339488Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:51.731964 containerd[1516]: time="2026-03-04T01:19:51.729642551Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 4 01:19:51.731964 containerd[1516]: time="2026-03-04T01:19:51.731326414Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:51.736086 containerd[1516]: time="2026-03-04T01:19:51.735191481Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:51.737073 containerd[1516]: time="2026-03-04T01:19:51.736684694Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 2.205952137s" Mar 4 01:19:51.737073 containerd[1516]: time="2026-03-04T01:19:51.736730946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 4 01:19:51.747290 containerd[1516]: time="2026-03-04T01:19:51.747096388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 4 
01:19:51.756683 containerd[1516]: time="2026-03-04T01:19:51.756540729Z" level=info msg="CreateContainer within sandbox \"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 4 01:19:51.791230 containerd[1516]: time="2026-03-04T01:19:51.790634642Z" level=info msg="CreateContainer within sandbox \"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"545b88679a68c65dd344de2afad84c0e4f976ed4efc2168eca6beec6b169858f\"" Mar 4 01:19:51.795303 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3401735469.mount: Deactivated successfully. Mar 4 01:19:51.796352 containerd[1516]: time="2026-03-04T01:19:51.796080579Z" level=info msg="StartContainer for \"545b88679a68c65dd344de2afad84c0e4f976ed4efc2168eca6beec6b169858f\"" Mar 4 01:19:51.864539 systemd[1]: Started cri-containerd-545b88679a68c65dd344de2afad84c0e4f976ed4efc2168eca6beec6b169858f.scope - libcontainer container 545b88679a68c65dd344de2afad84c0e4f976ed4efc2168eca6beec6b169858f. 
Mar 4 01:19:51.934075 containerd[1516]: time="2026-03-04T01:19:51.933617527Z" level=info msg="StartContainer for \"545b88679a68c65dd344de2afad84c0e4f976ed4efc2168eca6beec6b169858f\" returns successfully" Mar 4 01:19:57.142682 containerd[1516]: time="2026-03-04T01:19:57.142180395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:57.162911 containerd[1516]: time="2026-03-04T01:19:57.149386541Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 4 01:19:57.200076 containerd[1516]: time="2026-03-04T01:19:57.197930527Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:57.203211 containerd[1516]: time="2026-03-04T01:19:57.203158556Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:19:57.209029 containerd[1516]: time="2026-03-04T01:19:57.208984521Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 5.457718704s" Mar 4 01:19:57.209248 containerd[1516]: time="2026-03-04T01:19:57.209219690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 4 01:19:57.255211 containerd[1516]: time="2026-03-04T01:19:57.255164649Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 4 01:19:57.416682 containerd[1516]: time="2026-03-04T01:19:57.416087505Z" level=info msg="CreateContainer within sandbox \"a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 4 01:19:57.465534 containerd[1516]: time="2026-03-04T01:19:57.463187676Z" level=info msg="CreateContainer within sandbox \"a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"bb738bfaac862e861db17431164d689aedecfdb600eb5fcc02222c6418018d00\"" Mar 4 01:19:57.471602 containerd[1516]: time="2026-03-04T01:19:57.471530024Z" level=info msg="StartContainer for \"bb738bfaac862e861db17431164d689aedecfdb600eb5fcc02222c6418018d00\"" Mar 4 01:19:57.701370 systemd[1]: Started cri-containerd-bb738bfaac862e861db17431164d689aedecfdb600eb5fcc02222c6418018d00.scope - libcontainer container bb738bfaac862e861db17431164d689aedecfdb600eb5fcc02222c6418018d00. 
Mar 4 01:19:57.919465 containerd[1516]: time="2026-03-04T01:19:57.919171894Z" level=info msg="StartContainer for \"bb738bfaac862e861db17431164d689aedecfdb600eb5fcc02222c6418018d00\" returns successfully" Mar 4 01:19:58.597875 kubelet[2701]: I0304 01:19:58.596380 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58bc6545dc-jbw6r" podStartSLOduration=46.844931088 podStartE2EDuration="1m1.587165752s" podCreationTimestamp="2026-03-04 01:18:57 +0000 UTC" firstStartedPulling="2026-03-04 01:19:42.510919416 +0000 UTC m=+70.746179622" lastFinishedPulling="2026-03-04 01:19:57.253154006 +0000 UTC m=+85.488414286" observedRunningTime="2026-03-04 01:19:58.510487604 +0000 UTC m=+86.745747827" watchObservedRunningTime="2026-03-04 01:19:58.587165752 +0000 UTC m=+86.822425971" Mar 4 01:19:58.602736 kubelet[2701]: I0304 01:19:58.598330 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-hwgzw" podStartSLOduration=55.680552643 podStartE2EDuration="1m3.59831797s" podCreationTimestamp="2026-03-04 01:18:55 +0000 UTC" firstStartedPulling="2026-03-04 01:19:41.610992516 +0000 UTC m=+69.846252722" lastFinishedPulling="2026-03-04 01:19:49.528757835 +0000 UTC m=+77.764018049" observedRunningTime="2026-03-04 01:19:50.235377944 +0000 UTC m=+78.470638170" watchObservedRunningTime="2026-03-04 01:19:58.59831797 +0000 UTC m=+86.833578190" Mar 4 01:20:01.144792 containerd[1516]: time="2026-03-04T01:20:01.144577270Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:20:01.148015 containerd[1516]: time="2026-03-04T01:20:01.147800143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 4 01:20:01.157811 containerd[1516]: time="2026-03-04T01:20:01.157721242Z" level=info msg="ImageCreate 
event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:20:01.175006 containerd[1516]: time="2026-03-04T01:20:01.172977698Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 4 01:20:01.190524 containerd[1516]: time="2026-03-04T01:20:01.190459373Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 3.934166557s" Mar 4 01:20:01.191769 containerd[1516]: time="2026-03-04T01:20:01.190568312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 4 01:20:01.210772 containerd[1516]: time="2026-03-04T01:20:01.210703617Z" level=info msg="CreateContainer within sandbox \"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 4 01:20:01.251991 containerd[1516]: time="2026-03-04T01:20:01.251798109Z" level=info msg="CreateContainer within sandbox \"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"36b784ece51f7bbc21baba87698da4f3739780bd403ef0eac6c8d3045c3c10e9\"" Mar 4 01:20:01.254535 containerd[1516]: time="2026-03-04T01:20:01.252628933Z" level=info msg="StartContainer for 
\"36b784ece51f7bbc21baba87698da4f3739780bd403ef0eac6c8d3045c3c10e9\"" Mar 4 01:20:01.358432 systemd[1]: Started cri-containerd-36b784ece51f7bbc21baba87698da4f3739780bd403ef0eac6c8d3045c3c10e9.scope - libcontainer container 36b784ece51f7bbc21baba87698da4f3739780bd403ef0eac6c8d3045c3c10e9. Mar 4 01:20:01.421288 containerd[1516]: time="2026-03-04T01:20:01.421125264Z" level=info msg="StartContainer for \"36b784ece51f7bbc21baba87698da4f3739780bd403ef0eac6c8d3045c3c10e9\" returns successfully" Mar 4 01:20:02.343914 kubelet[2701]: I0304 01:20:02.343271 2701 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fnbg8" podStartSLOduration=46.341160783 podStartE2EDuration="1m5.343196247s" podCreationTimestamp="2026-03-04 01:18:57 +0000 UTC" firstStartedPulling="2026-03-04 01:19:42.203126758 +0000 UTC m=+70.438386971" lastFinishedPulling="2026-03-04 01:20:01.205162221 +0000 UTC m=+89.440422435" observedRunningTime="2026-03-04 01:20:02.33963519 +0000 UTC m=+90.574895433" watchObservedRunningTime="2026-03-04 01:20:02.343196247 +0000 UTC m=+90.578456478" Mar 4 01:20:02.475127 kubelet[2701]: I0304 01:20:02.474881 2701 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 4 01:20:02.479715 kubelet[2701]: I0304 01:20:02.479675 2701 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 4 01:20:03.644132 systemd[1]: Started sshd@9-10.243.77.214:22-20.161.92.111:47010.service - OpenSSH per-connection server daemon (20.161.92.111:47010). 
Mar 4 01:20:04.343791 sshd[5828]: Accepted publickey for core from 20.161.92.111 port 47010 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:04.355823 sshd[5828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:04.370432 systemd-logind[1492]: New session 12 of user core. Mar 4 01:20:04.389828 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 4 01:20:05.617567 sshd[5828]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:05.634832 systemd[1]: sshd@9-10.243.77.214:22-20.161.92.111:47010.service: Deactivated successfully. Mar 4 01:20:05.640184 systemd[1]: session-12.scope: Deactivated successfully. Mar 4 01:20:05.641793 systemd-logind[1492]: Session 12 logged out. Waiting for processes to exit. Mar 4 01:20:05.645299 systemd-logind[1492]: Removed session 12. Mar 4 01:20:08.387242 kubelet[2701]: I0304 01:20:08.387120 2701 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 01:20:10.732412 systemd[1]: Started sshd@10-10.243.77.214:22-20.161.92.111:50738.service - OpenSSH per-connection server daemon (20.161.92.111:50738). Mar 4 01:20:11.402003 sshd[5856]: Accepted publickey for core from 20.161.92.111 port 50738 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:11.405346 sshd[5856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:11.416235 systemd-logind[1492]: New session 13 of user core. Mar 4 01:20:11.427457 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 4 01:20:12.202787 sshd[5856]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:12.218512 systemd[1]: sshd@10-10.243.77.214:22-20.161.92.111:50738.service: Deactivated successfully. Mar 4 01:20:12.221996 systemd[1]: session-13.scope: Deactivated successfully. Mar 4 01:20:12.224431 systemd-logind[1492]: Session 13 logged out. Waiting for processes to exit. 
Mar 4 01:20:12.226362 systemd-logind[1492]: Removed session 13. Mar 4 01:20:15.927118 kubelet[2701]: I0304 01:20:15.926889 2701 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 4 01:20:17.316813 systemd[1]: Started sshd@11-10.243.77.214:22-20.161.92.111:50746.service - OpenSSH per-connection server daemon (20.161.92.111:50746). Mar 4 01:20:17.960351 sshd[5888]: Accepted publickey for core from 20.161.92.111 port 50746 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:17.962800 sshd[5888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:17.972066 systemd-logind[1492]: New session 14 of user core. Mar 4 01:20:17.978898 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 4 01:20:18.541162 sshd[5888]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:18.549713 systemd[1]: sshd@11-10.243.77.214:22-20.161.92.111:50746.service: Deactivated successfully. Mar 4 01:20:18.553902 systemd[1]: session-14.scope: Deactivated successfully. Mar 4 01:20:18.557042 systemd-logind[1492]: Session 14 logged out. Waiting for processes to exit. Mar 4 01:20:18.558624 systemd-logind[1492]: Removed session 14. Mar 4 01:20:23.658528 systemd[1]: Started sshd@12-10.243.77.214:22-20.161.92.111:45672.service - OpenSSH per-connection server daemon (20.161.92.111:45672). Mar 4 01:20:24.329869 sshd[5929]: Accepted publickey for core from 20.161.92.111 port 45672 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:24.334408 sshd[5929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:24.351620 systemd-logind[1492]: New session 15 of user core. Mar 4 01:20:24.359928 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 4 01:20:25.029807 sshd[5929]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:25.037930 systemd-logind[1492]: Session 15 logged out. Waiting for processes to exit. 
Mar 4 01:20:25.038999 systemd[1]: sshd@12-10.243.77.214:22-20.161.92.111:45672.service: Deactivated successfully. Mar 4 01:20:25.044221 systemd[1]: session-15.scope: Deactivated successfully. Mar 4 01:20:25.046748 systemd-logind[1492]: Removed session 15. Mar 4 01:20:29.378607 systemd[1]: run-containerd-runc-k8s.io-bb738bfaac862e861db17431164d689aedecfdb600eb5fcc02222c6418018d00-runc.NSEdGh.mount: Deactivated successfully. Mar 4 01:20:30.140553 systemd[1]: Started sshd@13-10.243.77.214:22-20.161.92.111:45676.service - OpenSSH per-connection server daemon (20.161.92.111:45676). Mar 4 01:20:30.814232 sshd[6021]: Accepted publickey for core from 20.161.92.111 port 45676 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:30.817488 sshd[6021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:30.827251 systemd-logind[1492]: New session 16 of user core. Mar 4 01:20:30.833266 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 4 01:20:31.637761 sshd[6021]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:31.647343 systemd[1]: sshd@13-10.243.77.214:22-20.161.92.111:45676.service: Deactivated successfully. Mar 4 01:20:31.650416 systemd[1]: session-16.scope: Deactivated successfully. Mar 4 01:20:31.652541 systemd-logind[1492]: Session 16 logged out. Waiting for processes to exit. Mar 4 01:20:31.653964 systemd-logind[1492]: Removed session 16. Mar 4 01:20:31.739414 systemd[1]: Started sshd@14-10.243.77.214:22-20.161.92.111:48820.service - OpenSSH per-connection server daemon (20.161.92.111:48820). Mar 4 01:20:32.327747 sshd[6035]: Accepted publickey for core from 20.161.92.111 port 48820 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:32.333583 sshd[6035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:32.350924 systemd-logind[1492]: New session 17 of user core. 
Mar 4 01:20:32.357267 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 4 01:20:32.521366 containerd[1516]: time="2026-03-04T01:20:32.521199784Z" level=info msg="StopPodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\"" Mar 4 01:20:33.146964 sshd[6035]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:33.167859 systemd[1]: sshd@14-10.243.77.214:22-20.161.92.111:48820.service: Deactivated successfully. Mar 4 01:20:33.172752 systemd[1]: session-17.scope: Deactivated successfully. Mar 4 01:20:33.185003 systemd-logind[1492]: Session 17 logged out. Waiting for processes to exit. Mar 4 01:20:33.189363 systemd-logind[1492]: Removed session 17. Mar 4 01:20:33.253732 systemd[1]: Started sshd@15-10.243.77.214:22-20.161.92.111:48834.service - OpenSSH per-connection server daemon (20.161.92.111:48834). Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:32.930 [WARNING][6048] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0", GenerateName:"calico-kube-controllers-58bc6545dc-", Namespace:"calico-system", SelfLink:"", UID:"11453fdd-1482-4ede-8950-e97c22d85781", ResourceVersion:"1123", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58bc6545dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468", Pod:"calico-kube-controllers-58bc6545dc-jbw6r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.45.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali70ef227e547", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:32.932 [INFO][6048] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:32.932 [INFO][6048] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" iface="eth0" netns="" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:32.932 [INFO][6048] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:32.932 [INFO][6048] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.290 [INFO][6059] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.298 [INFO][6059] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.298 [INFO][6059] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.344 [WARNING][6059] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.347 [INFO][6059] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.350 [INFO][6059] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:33.373966 containerd[1516]: 2026-03-04 01:20:33.364 [INFO][6048] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:33.429305 containerd[1516]: time="2026-03-04T01:20:33.427263837Z" level=info msg="TearDown network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" successfully" Mar 4 01:20:33.429534 containerd[1516]: time="2026-03-04T01:20:33.429503986Z" level=info msg="StopPodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" returns successfully" Mar 4 01:20:33.561358 containerd[1516]: time="2026-03-04T01:20:33.561282525Z" level=info msg="RemovePodSandbox for \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\"" Mar 4 01:20:33.578954 containerd[1516]: time="2026-03-04T01:20:33.578834098Z" level=info msg="Forcibly stopping sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\"" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.819 [WARNING][6081] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0", GenerateName:"calico-kube-controllers-58bc6545dc-", Namespace:"calico-system", SelfLink:"", UID:"11453fdd-1482-4ede-8950-e97c22d85781", ResourceVersion:"1123", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58bc6545dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"a52a1a38b6bde524ca68d0fcb56852d79b9ffd8b35b67d11a6e76c172ad93468", Pod:"calico-kube-controllers-58bc6545dc-jbw6r", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.45.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali70ef227e547", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.823 [INFO][6081] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.823 [INFO][6081] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" iface="eth0" netns="" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.823 [INFO][6081] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.823 [INFO][6081] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.980 [INFO][6088] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.981 [INFO][6088] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:33.982 [INFO][6088] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:34.013 [WARNING][6088] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:34.014 [INFO][6088] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" HandleID="k8s-pod-network.5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--kube--controllers--58bc6545dc--jbw6r-eth0" Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:34.019 [INFO][6088] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:34.037355 containerd[1516]: 2026-03-04 01:20:34.029 [INFO][6081] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b" Mar 4 01:20:34.041739 containerd[1516]: time="2026-03-04T01:20:34.037584268Z" level=info msg="TearDown network for sandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" successfully" Mar 4 01:20:34.075099 sshd[6070]: Accepted publickey for core from 20.161.92.111 port 48834 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI Mar 4 01:20:34.095977 sshd[6070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 4 01:20:34.128151 systemd-logind[1492]: New session 18 of user core. Mar 4 01:20:34.132060 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 4 01:20:34.197890 containerd[1516]: time="2026-03-04T01:20:34.197587496Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 4 01:20:34.197890 containerd[1516]: time="2026-03-04T01:20:34.197775993Z" level=info msg="RemovePodSandbox \"5c9edbdff8cef29dedc1579be48aed11b226cedb06bf4d6e5310150257952c8b\" returns successfully" Mar 4 01:20:34.201416 containerd[1516]: time="2026-03-04T01:20:34.200738854Z" level=info msg="StopPodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\"" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.286 [WARNING][6103] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"58ea317b-5eaa-44c7-a296-9b69e1cee2ab", ResourceVersion:"1038", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f", Pod:"coredns-66bc5c9577-dg6p5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie91308b0b81", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.287 [INFO][6103] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.287 [INFO][6103] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" iface="eth0" netns="" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.287 [INFO][6103] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.287 [INFO][6103] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.360 [INFO][6110] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.360 [INFO][6110] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.361 [INFO][6110] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.376 [WARNING][6110] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.377 [INFO][6110] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.381 [INFO][6110] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:34.388432 containerd[1516]: 2026-03-04 01:20:34.384 [INFO][6103] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.392205 containerd[1516]: time="2026-03-04T01:20:34.388510480Z" level=info msg="TearDown network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" successfully" Mar 4 01:20:34.392205 containerd[1516]: time="2026-03-04T01:20:34.388550583Z" level=info msg="StopPodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" returns successfully" Mar 4 01:20:34.392205 containerd[1516]: time="2026-03-04T01:20:34.391283462Z" level=info msg="RemovePodSandbox for \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\"" Mar 4 01:20:34.392205 containerd[1516]: time="2026-03-04T01:20:34.391369193Z" level=info msg="Forcibly stopping sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\"" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.470 [WARNING][6125] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"58ea317b-5eaa-44c7-a296-9b69e1cee2ab", ResourceVersion:"1038", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"58b4c9428519aae83f630ac373ab73e43d2750b32a53c369ff11aac4d69db82f", Pod:"coredns-66bc5c9577-dg6p5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie91308b0b81", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.471 [INFO][6125] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.471 [INFO][6125] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" iface="eth0" netns="" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.471 [INFO][6125] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.471 [INFO][6125] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.544 [INFO][6135] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.545 [INFO][6135] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.545 [INFO][6135] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.559 [WARNING][6135] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.559 [INFO][6135] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" HandleID="k8s-pod-network.999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--dg6p5-eth0" Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.563 [INFO][6135] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:34.577941 containerd[1516]: 2026-03-04 01:20:34.571 [INFO][6125] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1" Mar 4 01:20:34.583072 containerd[1516]: time="2026-03-04T01:20:34.578376653Z" level=info msg="TearDown network for sandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" successfully" Mar 4 01:20:34.709505 containerd[1516]: time="2026-03-04T01:20:34.708369368Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 4 01:20:34.709505 containerd[1516]: time="2026-03-04T01:20:34.708691211Z" level=info msg="RemovePodSandbox \"999160567e8a6ef64f1f1710dd7ce656c36815f0c624015ca7ae0679a4bdb1c1\" returns successfully" Mar 4 01:20:34.711096 containerd[1516]: time="2026-03-04T01:20:34.711034961Z" level=info msg="StopPodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\"" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.864 [WARNING][6152] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"49b3e387-9ccb-4b0f-9de1-bd709b96a755", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0", Pod:"coredns-66bc5c9577-ghd5d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3233f0f9962", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.865 [INFO][6152] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.865 [INFO][6152] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" iface="eth0" netns="" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.865 [INFO][6152] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.866 [INFO][6152] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.942 [INFO][6159] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.942 [INFO][6159] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.942 [INFO][6159] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.958 [WARNING][6159] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.958 [INFO][6159] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.961 [INFO][6159] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:34.971466 containerd[1516]: 2026-03-04 01:20:34.966 [INFO][6152] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:34.972901 containerd[1516]: time="2026-03-04T01:20:34.971770699Z" level=info msg="TearDown network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" successfully" Mar 4 01:20:34.972901 containerd[1516]: time="2026-03-04T01:20:34.971842427Z" level=info msg="StopPodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" returns successfully" Mar 4 01:20:34.976657 containerd[1516]: time="2026-03-04T01:20:34.974895284Z" level=info msg="RemovePodSandbox for \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\"" Mar 4 01:20:34.976657 containerd[1516]: time="2026-03-04T01:20:34.974930455Z" level=info msg="Forcibly stopping sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\"" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.167 [WARNING][6174] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"49b3e387-9ccb-4b0f-9de1-bd709b96a755", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"b425da7f868d314747b54ee532ce7e570ab3720dd536dffa8058d38d98f67af0", Pod:"coredns-66bc5c9577-ghd5d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.45.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3233f0f9962", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.173 [INFO][6174] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.173 [INFO][6174] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" iface="eth0" netns="" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.173 [INFO][6174] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.173 [INFO][6174] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.311 [INFO][6182] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.311 [INFO][6182] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.311 [INFO][6182] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.337 [WARNING][6182] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.337 [INFO][6182] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" HandleID="k8s-pod-network.7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-coredns--66bc5c9577--ghd5d-eth0" Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.341 [INFO][6182] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:35.353792 containerd[1516]: 2026-03-04 01:20:35.349 [INFO][6174] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a" Mar 4 01:20:35.359089 containerd[1516]: time="2026-03-04T01:20:35.354459054Z" level=info msg="TearDown network for sandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" successfully" Mar 4 01:20:35.372899 containerd[1516]: time="2026-03-04T01:20:35.372102306Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 4 01:20:35.372899 containerd[1516]: time="2026-03-04T01:20:35.372273919Z" level=info msg="RemovePodSandbox \"7ea02f6760ddac87b513057887f4872f0214cf6b2c0e17202e19a4987496c14a\" returns successfully" Mar 4 01:20:35.374613 containerd[1516]: time="2026-03-04T01:20:35.373513775Z" level=info msg="StopPodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\"" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.497 [WARNING][6195] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"0aba09a3-09ff-4e29-a406-0f9932fc94f6", ResourceVersion:"1209", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b", Pod:"calico-apiserver-5d996b5b5-c257w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib4c44dadcf8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.500 [INFO][6195] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.500 [INFO][6195] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" iface="eth0" netns="" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.500 [INFO][6195] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.501 [INFO][6195] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.620 [INFO][6202] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.620 [INFO][6202] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.620 [INFO][6202] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.638 [WARNING][6202] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.638 [INFO][6202] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.644 [INFO][6202] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:35.663516 containerd[1516]: 2026-03-04 01:20:35.651 [INFO][6195] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.670933 containerd[1516]: time="2026-03-04T01:20:35.664525929Z" level=info msg="TearDown network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" successfully" Mar 4 01:20:35.670933 containerd[1516]: time="2026-03-04T01:20:35.664567207Z" level=info msg="StopPodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" returns successfully" Mar 4 01:20:35.718714 containerd[1516]: time="2026-03-04T01:20:35.718478766Z" level=info msg="RemovePodSandbox for \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\"" Mar 4 01:20:35.718955 containerd[1516]: time="2026-03-04T01:20:35.718713060Z" level=info msg="Forcibly stopping sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\"" Mar 4 01:20:35.769133 sshd[6070]: pam_unix(sshd:session): session closed for user core Mar 4 01:20:35.794655 systemd[1]: sshd@15-10.243.77.214:22-20.161.92.111:48834.service: Deactivated successfully. 
Mar 4 01:20:35.800369 systemd[1]: session-18.scope: Deactivated successfully. Mar 4 01:20:35.804293 systemd-logind[1492]: Session 18 logged out. Waiting for processes to exit. Mar 4 01:20:35.807488 systemd-logind[1492]: Removed session 18. Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.874 [WARNING][6217] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"0aba09a3-09ff-4e29-a406-0f9932fc94f6", ResourceVersion:"1209", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"b2947e456c6688c98f1cd2e77204b6e0c9d39bf3de182191d45c3d20fb4f6a4b", Pod:"calico-apiserver-5d996b5b5-c257w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib4c44dadcf8", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.875 [INFO][6217] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.875 [INFO][6217] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" iface="eth0" netns="" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.875 [INFO][6217] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.875 [INFO][6217] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.943 [INFO][6226] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.944 [INFO][6226] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.944 [INFO][6226] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.954 [WARNING][6226] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.955 [INFO][6226] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" HandleID="k8s-pod-network.b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--c257w-eth0" Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.957 [INFO][6226] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:35.966105 containerd[1516]: 2026-03-04 01:20:35.962 [INFO][6217] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4" Mar 4 01:20:35.968391 containerd[1516]: time="2026-03-04T01:20:35.966267380Z" level=info msg="TearDown network for sandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" successfully" Mar 4 01:20:35.973016 containerd[1516]: time="2026-03-04T01:20:35.972686990Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 4 01:20:35.982470 containerd[1516]: time="2026-03-04T01:20:35.982406078Z" level=info msg="RemovePodSandbox \"b38951e41f3fcd0b4a199b227e99ce37e3f3f79528b4980c707e49fe1329f2b4\" returns successfully" Mar 4 01:20:35.983770 containerd[1516]: time="2026-03-04T01:20:35.983715807Z" level=info msg="StopPodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\"" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.088 [WARNING][6240] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c", ResourceVersion:"1136", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c", Pod:"csi-node-driver-fnbg8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.45.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali968ca8cd64c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.089 [INFO][6240] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.089 [INFO][6240] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" iface="eth0" netns="" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.089 [INFO][6240] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.089 [INFO][6240] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.150 [INFO][6248] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.150 [INFO][6248] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.150 [INFO][6248] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.178 [WARNING][6248] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.178 [INFO][6248] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.188 [INFO][6248] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:36.197271 containerd[1516]: 2026-03-04 01:20:36.191 [INFO][6240] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.197271 containerd[1516]: time="2026-03-04T01:20:36.197134593Z" level=info msg="TearDown network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" successfully" Mar 4 01:20:36.197271 containerd[1516]: time="2026-03-04T01:20:36.197208764Z" level=info msg="StopPodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" returns successfully" Mar 4 01:20:36.207890 containerd[1516]: time="2026-03-04T01:20:36.199718617Z" level=info msg="RemovePodSandbox for \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\"" Mar 4 01:20:36.207890 containerd[1516]: time="2026-03-04T01:20:36.199786775Z" level=info msg="Forcibly stopping sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\"" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.289 [WARNING][6262] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"621d6fb0-0e39-428b-8e9a-8e8c65b0d05c", ResourceVersion:"1136", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"aa3cc95acf6b8a00eaae1bd0908e87f0f56944dc053554487d165dab9afead2c", Pod:"csi-node-driver-fnbg8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.45.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali968ca8cd64c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.291 [INFO][6262] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.291 [INFO][6262] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" iface="eth0" netns="" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.291 [INFO][6262] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.292 [INFO][6262] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.347 [INFO][6270] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.349 [INFO][6270] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.349 [INFO][6270] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.360 [WARNING][6270] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.360 [INFO][6270] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" HandleID="k8s-pod-network.b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Workload="srv--8wmcq.gb1.brightbox.com-k8s-csi--node--driver--fnbg8-eth0" Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.363 [INFO][6270] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:36.371858 containerd[1516]: 2026-03-04 01:20:36.367 [INFO][6262] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7" Mar 4 01:20:36.372827 containerd[1516]: time="2026-03-04T01:20:36.372138348Z" level=info msg="TearDown network for sandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" successfully" Mar 4 01:20:36.380780 containerd[1516]: time="2026-03-04T01:20:36.379712584Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 4 01:20:36.380780 containerd[1516]: time="2026-03-04T01:20:36.379929337Z" level=info msg="RemovePodSandbox \"b6b7c90683323404e7f2e03ed334633cdb79627172e01853786858c102f084d7\" returns successfully" Mar 4 01:20:36.382399 containerd[1516]: time="2026-03-04T01:20:36.381887697Z" level=info msg="StopPodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\"" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.475 [WARNING][6285] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4", ResourceVersion:"1246", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6", Pod:"calico-apiserver-5d996b5b5-4hclm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calieca23766474", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.475 [INFO][6285] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.476 [INFO][6285] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" iface="eth0" netns="" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.477 [INFO][6285] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.477 [INFO][6285] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.541 [INFO][6292] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.542 [INFO][6292] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.542 [INFO][6292] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.554 [WARNING][6292] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.554 [INFO][6292] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.556 [INFO][6292] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:36.565420 containerd[1516]: 2026-03-04 01:20:36.561 [INFO][6285] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.569281 containerd[1516]: time="2026-03-04T01:20:36.565521773Z" level=info msg="TearDown network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" successfully" Mar 4 01:20:36.569281 containerd[1516]: time="2026-03-04T01:20:36.565565805Z" level=info msg="StopPodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" returns successfully" Mar 4 01:20:36.569281 containerd[1516]: time="2026-03-04T01:20:36.567524289Z" level=info msg="RemovePodSandbox for \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\"" Mar 4 01:20:36.569281 containerd[1516]: time="2026-03-04T01:20:36.567571950Z" level=info msg="Forcibly stopping sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\"" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.660 [WARNING][6307] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0", GenerateName:"calico-apiserver-5d996b5b5-", Namespace:"calico-system", SelfLink:"", UID:"e815ce3e-bc88-40d3-b47c-e2c0c6843ef4", ResourceVersion:"1246", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d996b5b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"dad98c2a8e3bed39728073156299fa3c1987e4b5af724da578aca5ffaf0703c6", Pod:"calico-apiserver-5d996b5b5-4hclm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.45.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calieca23766474", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.661 [INFO][6307] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.661 [INFO][6307] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" iface="eth0" netns="" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.661 [INFO][6307] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.661 [INFO][6307] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.727 [INFO][6314] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.728 [INFO][6314] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.728 [INFO][6314] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.742 [WARNING][6314] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.742 [INFO][6314] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" HandleID="k8s-pod-network.312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Workload="srv--8wmcq.gb1.brightbox.com-k8s-calico--apiserver--5d996b5b5--4hclm-eth0" Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.746 [INFO][6314] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 4 01:20:36.757302 containerd[1516]: 2026-03-04 01:20:36.751 [INFO][6307] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a" Mar 4 01:20:36.757302 containerd[1516]: time="2026-03-04T01:20:36.755840572Z" level=info msg="TearDown network for sandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" successfully" Mar 4 01:20:36.765196 containerd[1516]: time="2026-03-04T01:20:36.765099550Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 4 01:20:36.765653 containerd[1516]: time="2026-03-04T01:20:36.765286792Z" level=info msg="RemovePodSandbox \"312b9669175790460cdbb5d2706d9712b21d8a3c34eafc6fb2491da08a2b667a\" returns successfully"
Mar 4 01:20:36.766679 containerd[1516]: time="2026-03-04T01:20:36.766644314Z" level=info msg="StopPodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\""
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:36.948 [WARNING][6330] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0036b8ad-7c49-4b71-addf-d1386c2532e8", ResourceVersion:"1279", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d", Pod:"goldmane-cccfbd5cf-hwgzw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.45.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali81eae1f738e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:36.949 [INFO][6330] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:36.949 [INFO][6330] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" iface="eth0" netns=""
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:36.949 [INFO][6330] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:36.949 [INFO][6330] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.030 [INFO][6337] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0"
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.032 [INFO][6337] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.032 [INFO][6337] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.053 [WARNING][6337] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0"
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.053 [INFO][6337] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0"
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.056 [INFO][6337] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 01:20:37.070247 containerd[1516]: 2026-03-04 01:20:37.062 [INFO][6330] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.070247 containerd[1516]: time="2026-03-04T01:20:37.069854982Z" level=info msg="TearDown network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" successfully"
Mar 4 01:20:37.070247 containerd[1516]: time="2026-03-04T01:20:37.069910203Z" level=info msg="StopPodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" returns successfully"
Mar 4 01:20:37.074120 containerd[1516]: time="2026-03-04T01:20:37.074040981Z" level=info msg="RemovePodSandbox for \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\""
Mar 4 01:20:37.074223 containerd[1516]: time="2026-03-04T01:20:37.074123769Z" level=info msg="Forcibly stopping sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\""
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.168 [WARNING][6352] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"0036b8ad-7c49-4b71-addf-d1386c2532e8", ResourceVersion:"1279", Generation:0, CreationTimestamp:time.Date(2026, time.March, 4, 1, 18, 55, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-8wmcq.gb1.brightbox.com", ContainerID:"1afac4bdbab8fd117ec855fdf73a39cdf587918be62ff40772d9eeb34602248d", Pod:"goldmane-cccfbd5cf-hwgzw", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.45.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali81eae1f738e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.169 [INFO][6352] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.169 [INFO][6352] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" iface="eth0" netns=""
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.169 [INFO][6352] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.169 [INFO][6352] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.252 [INFO][6359] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0"
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.253 [INFO][6359] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.253 [INFO][6359] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.270 [WARNING][6359] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0"
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.271 [INFO][6359] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" HandleID="k8s-pod-network.495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e" Workload="srv--8wmcq.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--hwgzw-eth0"
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.273 [INFO][6359] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 4 01:20:37.284415 containerd[1516]: 2026-03-04 01:20:37.278 [INFO][6352] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e"
Mar 4 01:20:37.284415 containerd[1516]: time="2026-03-04T01:20:37.283277792Z" level=info msg="TearDown network for sandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" successfully"
Mar 4 01:20:37.291156 containerd[1516]: time="2026-03-04T01:20:37.290859466Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 4 01:20:37.291156 containerd[1516]: time="2026-03-04T01:20:37.290968653Z" level=info msg="RemovePodSandbox \"495cbddfc993d42f76565c197d24a4fd02f7ad6771ae700bf2a0860fa00df15e\" returns successfully"
Mar 4 01:20:40.873884 systemd[1]: Started sshd@16-10.243.77.214:22-20.161.92.111:39482.service - OpenSSH per-connection server daemon (20.161.92.111:39482).
Mar 4 01:20:41.567655 sshd[6388]: Accepted publickey for core from 20.161.92.111 port 39482 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:20:41.570980 sshd[6388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:20:41.583287 systemd-logind[1492]: New session 19 of user core.
Mar 4 01:20:41.587325 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 4 01:20:42.586432 sshd[6388]: pam_unix(sshd:session): session closed for user core
Mar 4 01:20:42.593733 systemd[1]: sshd@16-10.243.77.214:22-20.161.92.111:39482.service: Deactivated successfully.
Mar 4 01:20:42.599959 systemd[1]: session-19.scope: Deactivated successfully.
Mar 4 01:20:42.602350 systemd-logind[1492]: Session 19 logged out. Waiting for processes to exit.
Mar 4 01:20:42.604164 systemd-logind[1492]: Removed session 19.
Mar 4 01:20:42.702552 systemd[1]: Started sshd@17-10.243.77.214:22-20.161.92.111:39496.service - OpenSSH per-connection server daemon (20.161.92.111:39496).
Mar 4 01:20:43.306857 sshd[6401]: Accepted publickey for core from 20.161.92.111 port 39496 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:20:43.309281 sshd[6401]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:20:43.317870 systemd-logind[1492]: New session 20 of user core.
Mar 4 01:20:43.323317 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 4 01:20:44.282621 sshd[6401]: pam_unix(sshd:session): session closed for user core
Mar 4 01:20:44.290041 systemd[1]: sshd@17-10.243.77.214:22-20.161.92.111:39496.service: Deactivated successfully.
Mar 4 01:20:44.295021 systemd[1]: session-20.scope: Deactivated successfully.
Mar 4 01:20:44.296733 systemd-logind[1492]: Session 20 logged out. Waiting for processes to exit.
Mar 4 01:20:44.298993 systemd-logind[1492]: Removed session 20.
Mar 4 01:20:44.393569 systemd[1]: Started sshd@18-10.243.77.214:22-20.161.92.111:39502.service - OpenSSH per-connection server daemon (20.161.92.111:39502).
Mar 4 01:20:45.046950 sshd[6413]: Accepted publickey for core from 20.161.92.111 port 39502 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:20:45.049832 sshd[6413]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:20:45.056904 systemd-logind[1492]: New session 21 of user core.
Mar 4 01:20:45.064320 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 4 01:20:46.617416 sshd[6413]: pam_unix(sshd:session): session closed for user core
Mar 4 01:20:46.638118 systemd-logind[1492]: Session 21 logged out. Waiting for processes to exit.
Mar 4 01:20:46.638696 systemd[1]: sshd@18-10.243.77.214:22-20.161.92.111:39502.service: Deactivated successfully.
Mar 4 01:20:46.643683 systemd[1]: session-21.scope: Deactivated successfully.
Mar 4 01:20:46.648213 systemd-logind[1492]: Removed session 21.
Mar 4 01:20:46.715753 systemd[1]: Started sshd@19-10.243.77.214:22-20.161.92.111:39504.service - OpenSSH per-connection server daemon (20.161.92.111:39504).
Mar 4 01:20:47.339192 sshd[6440]: Accepted publickey for core from 20.161.92.111 port 39504 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:20:47.343244 sshd[6440]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:20:47.351795 systemd-logind[1492]: New session 22 of user core.
Mar 4 01:20:47.357929 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 4 01:20:48.770065 sshd[6440]: pam_unix(sshd:session): session closed for user core
Mar 4 01:20:48.779749 systemd[1]: sshd@19-10.243.77.214:22-20.161.92.111:39504.service: Deactivated successfully.
Mar 4 01:20:48.786182 systemd[1]: session-22.scope: Deactivated successfully.
Mar 4 01:20:48.789664 systemd-logind[1492]: Session 22 logged out. Waiting for processes to exit.
Mar 4 01:20:48.792038 systemd-logind[1492]: Removed session 22.
Mar 4 01:20:48.873454 systemd[1]: Started sshd@20-10.243.77.214:22-20.161.92.111:39514.service - OpenSSH per-connection server daemon (20.161.92.111:39514).
Mar 4 01:20:49.508802 sshd[6453]: Accepted publickey for core from 20.161.92.111 port 39514 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:20:49.511206 sshd[6453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:20:49.520165 systemd-logind[1492]: New session 23 of user core.
Mar 4 01:20:49.524298 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 4 01:20:50.086832 sshd[6453]: pam_unix(sshd:session): session closed for user core
Mar 4 01:20:50.093544 systemd[1]: sshd@20-10.243.77.214:22-20.161.92.111:39514.service: Deactivated successfully.
Mar 4 01:20:50.096584 systemd[1]: session-23.scope: Deactivated successfully.
Mar 4 01:20:50.097907 systemd-logind[1492]: Session 23 logged out. Waiting for processes to exit.
Mar 4 01:20:50.100969 systemd-logind[1492]: Removed session 23.
Mar 4 01:20:55.191556 systemd[1]: Started sshd@21-10.243.77.214:22-20.161.92.111:34644.service - OpenSSH per-connection server daemon (20.161.92.111:34644).
Mar 4 01:20:55.829988 sshd[6506]: Accepted publickey for core from 20.161.92.111 port 34644 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:20:55.833569 sshd[6506]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:20:55.843120 systemd-logind[1492]: New session 24 of user core.
Mar 4 01:20:55.851310 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 4 01:20:56.848371 sshd[6506]: pam_unix(sshd:session): session closed for user core
Mar 4 01:20:56.855411 systemd[1]: sshd@21-10.243.77.214:22-20.161.92.111:34644.service: Deactivated successfully.
Mar 4 01:20:56.858560 systemd[1]: session-24.scope: Deactivated successfully.
Mar 4 01:20:56.860795 systemd-logind[1492]: Session 24 logged out. Waiting for processes to exit.
Mar 4 01:20:56.865761 systemd-logind[1492]: Removed session 24.
Mar 4 01:21:01.981210 systemd[1]: Started sshd@22-10.243.77.214:22-20.161.92.111:38418.service - OpenSSH per-connection server daemon (20.161.92.111:38418).
Mar 4 01:21:02.747494 sshd[6577]: Accepted publickey for core from 20.161.92.111 port 38418 ssh2: RSA SHA256:phL7137i5y6DHtmwXYw8sU0DtZKGvJBo2Tpr6jEeFOI
Mar 4 01:21:02.754996 sshd[6577]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 4 01:21:02.764110 systemd-logind[1492]: New session 25 of user core.
Mar 4 01:21:02.770362 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 4 01:21:04.016650 sshd[6577]: pam_unix(sshd:session): session closed for user core
Mar 4 01:21:04.032478 systemd[1]: sshd@22-10.243.77.214:22-20.161.92.111:38418.service: Deactivated successfully.
Mar 4 01:21:04.038232 systemd[1]: session-25.scope: Deactivated successfully.
Mar 4 01:21:04.040739 systemd-logind[1492]: Session 25 logged out. Waiting for processes to exit.
Mar 4 01:21:04.043616 systemd-logind[1492]: Removed session 25.