Mar 12 04:47:15.060404 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Mar 11 23:23:33 -00 2026
Mar 12 04:47:15.060445 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=0e4243d51ac00bffbb09a606c7378a821ca08f30dbebc6b82c4452fcc120d7bc
Mar 12 04:47:15.060460 kernel: BIOS-provided physical RAM map:
Mar 12 04:47:15.060475 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 12 04:47:15.060485 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 12 04:47:15.060495 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 12 04:47:15.060507 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Mar 12 04:47:15.060517 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Mar 12 04:47:15.060527 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 12 04:47:15.060537 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 12 04:47:15.060547 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 12 04:47:15.060557 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 12 04:47:15.060573 kernel: NX (Execute Disable) protection: active
Mar 12 04:47:15.060584 kernel: APIC: Static calls initialized
Mar 12 04:47:15.060596 kernel: SMBIOS 2.8 present.
Mar 12 04:47:15.060608 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Mar 12 04:47:15.060619 kernel: Hypervisor detected: KVM
Mar 12 04:47:15.060635 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 12 04:47:15.060646 kernel: kvm-clock: using sched offset of 4852800482 cycles
Mar 12 04:47:15.060658 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 12 04:47:15.060670 kernel: tsc: Detected 2499.998 MHz processor
Mar 12 04:47:15.060682 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 12 04:47:15.060693 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 12 04:47:15.060705 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 12 04:47:15.060716 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 12 04:47:15.060728 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 12 04:47:15.060744 kernel: Using GB pages for direct mapping
Mar 12 04:47:15.060756 kernel: ACPI: Early table checksum verification disabled
Mar 12 04:47:15.060767 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Mar 12 04:47:15.060779 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060790 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060801 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060812 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Mar 12 04:47:15.060824 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060835 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060851 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060863 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:47:15.060874 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Mar 12 04:47:15.060885 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Mar 12 04:47:15.060897 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Mar 12 04:47:15.060915 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Mar 12 04:47:15.060927 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Mar 12 04:47:15.060943 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Mar 12 04:47:15.060956 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Mar 12 04:47:15.060967 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 12 04:47:15.060979 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 12 04:47:15.060991 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 12 04:47:15.061003 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Mar 12 04:47:15.061014 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 12 04:47:15.061031 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Mar 12 04:47:15.062091 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 12 04:47:15.062103 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Mar 12 04:47:15.062115 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 12 04:47:15.062127 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Mar 12 04:47:15.062138 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 12 04:47:15.062150 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Mar 12 04:47:15.062161 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 12 04:47:15.062173 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Mar 12 04:47:15.062184 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 12 04:47:15.062204 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Mar 12 04:47:15.062216 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 12 04:47:15.062228 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 12 04:47:15.062240 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Mar 12 04:47:15.062252 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Mar 12 04:47:15.062264 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Mar 12 04:47:15.062277 kernel: Zone ranges:
Mar 12 04:47:15.062288 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 12 04:47:15.062300 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Mar 12 04:47:15.062317 kernel: Normal empty
Mar 12 04:47:15.062329 kernel: Movable zone start for each node
Mar 12 04:47:15.062341 kernel: Early memory node ranges
Mar 12 04:47:15.062353 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 12 04:47:15.062364 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Mar 12 04:47:15.062387 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Mar 12 04:47:15.062400 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 12 04:47:15.062411 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 12 04:47:15.062423 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Mar 12 04:47:15.062435 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 12 04:47:15.062453 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 12 04:47:15.062465 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 12 04:47:15.062477 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 12 04:47:15.062489 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 12 04:47:15.062500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 12 04:47:15.062512 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 12 04:47:15.062524 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 12 04:47:15.062536 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 12 04:47:15.062547 kernel: TSC deadline timer available
Mar 12 04:47:15.062565 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Mar 12 04:47:15.062576 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 12 04:47:15.062588 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 12 04:47:15.062600 kernel: Booting paravirtualized kernel on KVM
Mar 12 04:47:15.062612 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 12 04:47:15.062624 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 12 04:47:15.062636 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u262144
Mar 12 04:47:15.062647 kernel: pcpu-alloc: s196328 r8192 d28952 u262144 alloc=1*2097152
Mar 12 04:47:15.062659 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 12 04:47:15.062676 kernel: kvm-guest: PV spinlocks enabled
Mar 12 04:47:15.062688 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 12 04:47:15.062702 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=0e4243d51ac00bffbb09a606c7378a821ca08f30dbebc6b82c4452fcc120d7bc
Mar 12 04:47:15.062714 kernel: random: crng init done
Mar 12 04:47:15.062726 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 12 04:47:15.062738 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 12 04:47:15.062749 kernel: Fallback order for Node 0: 0
Mar 12 04:47:15.062761 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Mar 12 04:47:15.062778 kernel: Policy zone: DMA32
Mar 12 04:47:15.062790 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 12 04:47:15.062802 kernel: software IO TLB: area num 16.
Mar 12 04:47:15.062814 kernel: Memory: 1901596K/2096616K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 194760K reserved, 0K cma-reserved)
Mar 12 04:47:15.062826 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 12 04:47:15.062838 kernel: Kernel/User page tables isolation: enabled
Mar 12 04:47:15.062850 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 12 04:47:15.062861 kernel: ftrace: allocated 149 pages with 4 groups
Mar 12 04:47:15.062873 kernel: Dynamic Preempt: voluntary
Mar 12 04:47:15.062890 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 12 04:47:15.062903 kernel: rcu: RCU event tracing is enabled.
Mar 12 04:47:15.062915 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 12 04:47:15.062927 kernel: Trampoline variant of Tasks RCU enabled.
Mar 12 04:47:15.062939 kernel: Rude variant of Tasks RCU enabled.
Mar 12 04:47:15.062964 kernel: Tracing variant of Tasks RCU enabled.
Mar 12 04:47:15.062981 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 12 04:47:15.062994 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 12 04:47:15.063006 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Mar 12 04:47:15.063019 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 12 04:47:15.063031 kernel: Console: colour VGA+ 80x25
Mar 12 04:47:15.065180 kernel: printk: console [tty0] enabled
Mar 12 04:47:15.065204 kernel: printk: console [ttyS0] enabled
Mar 12 04:47:15.065217 kernel: ACPI: Core revision 20230628
Mar 12 04:47:15.065230 kernel: APIC: Switch to symmetric I/O mode setup
Mar 12 04:47:15.065242 kernel: x2apic enabled
Mar 12 04:47:15.065255 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 12 04:47:15.065273 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Mar 12 04:47:15.065287 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Mar 12 04:47:15.065299 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 12 04:47:15.065312 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 12 04:47:15.065325 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 12 04:47:15.065337 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 12 04:47:15.065350 kernel: Spectre V2 : Mitigation: Retpolines
Mar 12 04:47:15.065362 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 12 04:47:15.065396 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Mar 12 04:47:15.065409 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 12 04:47:15.065428 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 12 04:47:15.065441 kernel: MDS: Mitigation: Clear CPU buffers
Mar 12 04:47:15.065453 kernel: MMIO Stale Data: Unknown: No mitigations
Mar 12 04:47:15.065465 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 12 04:47:15.065478 kernel: active return thunk: its_return_thunk
Mar 12 04:47:15.065490 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 12 04:47:15.065502 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 12 04:47:15.065515 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 12 04:47:15.065527 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 12 04:47:15.065539 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 12 04:47:15.065552 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Mar 12 04:47:15.065570 kernel: Freeing SMP alternatives memory: 32K
Mar 12 04:47:15.065582 kernel: pid_max: default: 32768 minimum: 301
Mar 12 04:47:15.065595 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 12 04:47:15.065607 kernel: landlock: Up and running.
Mar 12 04:47:15.065619 kernel: SELinux: Initializing.
Mar 12 04:47:15.065632 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 12 04:47:15.065644 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 12 04:47:15.065657 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Mar 12 04:47:15.065670 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 12 04:47:15.065682 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 12 04:47:15.065695 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 12 04:47:15.065713 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Mar 12 04:47:15.065726 kernel: signal: max sigframe size: 1776
Mar 12 04:47:15.065739 kernel: rcu: Hierarchical SRCU implementation.
Mar 12 04:47:15.065752 kernel: rcu: Max phase no-delay instances is 400.
Mar 12 04:47:15.065765 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 12 04:47:15.065778 kernel: smp: Bringing up secondary CPUs ...
Mar 12 04:47:15.065790 kernel: smpboot: x86: Booting SMP configuration:
Mar 12 04:47:15.065802 kernel: .... node #0, CPUs: #1
Mar 12 04:47:15.065815 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Mar 12 04:47:15.065833 kernel: smp: Brought up 1 node, 2 CPUs
Mar 12 04:47:15.065845 kernel: smpboot: Max logical packages: 16
Mar 12 04:47:15.065858 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Mar 12 04:47:15.065871 kernel: devtmpfs: initialized
Mar 12 04:47:15.065883 kernel: x86/mm: Memory block size: 128MB
Mar 12 04:47:15.065896 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 12 04:47:15.065909 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 12 04:47:15.065921 kernel: pinctrl core: initialized pinctrl subsystem
Mar 12 04:47:15.065933 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 12 04:47:15.065951 kernel: audit: initializing netlink subsys (disabled)
Mar 12 04:47:15.065964 kernel: audit: type=2000 audit(1773290833.149:1): state=initialized audit_enabled=0 res=1
Mar 12 04:47:15.065976 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 12 04:47:15.065989 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 12 04:47:15.066001 kernel: cpuidle: using governor menu
Mar 12 04:47:15.066013 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 12 04:47:15.066026 kernel: dca service started, version 1.12.1
Mar 12 04:47:15.066065 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 12 04:47:15.066079 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 12 04:47:15.066098 kernel: PCI: Using configuration type 1 for base access
Mar 12 04:47:15.066111 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 12 04:47:15.066124 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 12 04:47:15.066136 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 12 04:47:15.066149 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 12 04:47:15.066162 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 12 04:47:15.066174 kernel: ACPI: Added _OSI(Module Device)
Mar 12 04:47:15.066187 kernel: ACPI: Added _OSI(Processor Device)
Mar 12 04:47:15.066205 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 12 04:47:15.066218 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 12 04:47:15.066230 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 12 04:47:15.066243 kernel: ACPI: Interpreter enabled
Mar 12 04:47:15.066255 kernel: ACPI: PM: (supports S0 S5)
Mar 12 04:47:15.066268 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 12 04:47:15.066281 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 12 04:47:15.066293 kernel: PCI: Using E820 reservations for host bridge windows
Mar 12 04:47:15.066306 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 12 04:47:15.066318 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 12 04:47:15.066705 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 12 04:47:15.066907 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 12 04:47:15.068137 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 12 04:47:15.068161 kernel: PCI host bridge to bus 0000:00
Mar 12 04:47:15.068383 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 12 04:47:15.068551 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 12 04:47:15.068722 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 12 04:47:15.068886 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Mar 12 04:47:15.070079 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 12 04:47:15.070272 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Mar 12 04:47:15.070458 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 12 04:47:15.070690 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 12 04:47:15.070928 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Mar 12 04:47:15.072207 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Mar 12 04:47:15.072411 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Mar 12 04:47:15.072589 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Mar 12 04:47:15.072765 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 12 04:47:15.072987 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.073952 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Mar 12 04:47:15.074214 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.074410 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Mar 12 04:47:15.074615 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.074797 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Mar 12 04:47:15.074997 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.078228 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Mar 12 04:47:15.078513 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.078701 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Mar 12 04:47:15.078917 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.079122 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Mar 12 04:47:15.079322 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.079515 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Mar 12 04:47:15.079739 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 12 04:47:15.079922 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Mar 12 04:47:15.080164 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 12 04:47:15.080345 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 12 04:47:15.080542 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Mar 12 04:47:15.080722 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 12 04:47:15.080901 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Mar 12 04:47:15.081146 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 12 04:47:15.081339 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 12 04:47:15.081537 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Mar 12 04:47:15.081718 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Mar 12 04:47:15.081936 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 12 04:47:15.082147 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 12 04:47:15.082363 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 12 04:47:15.082592 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Mar 12 04:47:15.082766 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Mar 12 04:47:15.082977 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 12 04:47:15.083174 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 12 04:47:15.083406 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Mar 12 04:47:15.083594 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Mar 12 04:47:15.083786 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 12 04:47:15.083968 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 12 04:47:15.088284 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 12 04:47:15.088601 kernel: pci_bus 0000:02: extended config space not accessible
Mar 12 04:47:15.088826 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Mar 12 04:47:15.089024 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Mar 12 04:47:15.089242 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 12 04:47:15.089445 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 12 04:47:15.089665 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 12 04:47:15.089850 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Mar 12 04:47:15.090096 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 12 04:47:15.090283 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 12 04:47:15.090476 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 12 04:47:15.090697 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 12 04:47:15.090894 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 12 04:47:15.091094 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 12 04:47:15.091274 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 12 04:47:15.091474 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 12 04:47:15.091658 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 12 04:47:15.091836 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 12 04:47:15.092014 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 12 04:47:15.092257 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 12 04:47:15.092454 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 12 04:47:15.092628 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 12 04:47:15.092806 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 12 04:47:15.092980 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 12 04:47:15.093170 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 12 04:47:15.093348 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 12 04:47:15.093607 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 12 04:47:15.093794 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 12 04:47:15.093978 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 12 04:47:15.094180 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 12 04:47:15.094360 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 12 04:47:15.094392 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 12 04:47:15.094406 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 12 04:47:15.094419 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 12 04:47:15.094432 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 12 04:47:15.094445 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 12 04:47:15.094466 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 12 04:47:15.094478 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 12 04:47:15.094491 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 12 04:47:15.094504 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 12 04:47:15.094517 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 12 04:47:15.094529 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 12 04:47:15.094542 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 12 04:47:15.094555 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 12 04:47:15.094567 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 12 04:47:15.094586 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 12 04:47:15.094598 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 12 04:47:15.094611 kernel: iommu: Default domain type: Translated
Mar 12 04:47:15.094624 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 12 04:47:15.094637 kernel: PCI: Using ACPI for IRQ routing
Mar 12 04:47:15.094649 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 12 04:47:15.096794 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 12 04:47:15.096814 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Mar 12 04:47:15.097011 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 12 04:47:15.097223 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 12 04:47:15.097414 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 12 04:47:15.097435 kernel: vgaarb: loaded
Mar 12 04:47:15.097448 kernel: clocksource: Switched to clocksource kvm-clock
Mar 12 04:47:15.097461 kernel: VFS: Disk quotas dquot_6.6.0
Mar 12 04:47:15.097474 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 12 04:47:15.097487 kernel: pnp: PnP ACPI init
Mar 12 04:47:15.097706 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 12 04:47:15.097736 kernel: pnp: PnP ACPI: found 5 devices
Mar 12 04:47:15.097749 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 12 04:47:15.097762 kernel: NET: Registered PF_INET protocol family
Mar 12 04:47:15.097775 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 04:47:15.097788 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 12 04:47:15.097801 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 12 04:47:15.097813 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 12 04:47:15.097826 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 12 04:47:15.097844 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 12 04:47:15.097858 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 12 04:47:15.097871 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 12 04:47:15.097883 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 12 04:47:15.097896 kernel: NET: Registered PF_XDP protocol family
Mar 12 04:47:15.100117 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Mar 12 04:47:15.100315 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 12 04:47:15.100513 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 12 04:47:15.100705 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 12 04:47:15.100888 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 12 04:47:15.101083 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 12 04:47:15.101266 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 12 04:47:15.101462 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 12 04:47:15.101642 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Mar 12 04:47:15.101829 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Mar 12 04:47:15.102007 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Mar 12 04:47:15.104227 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Mar 12 04:47:15.104423 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Mar 12 04:47:15.104606 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Mar 12 04:47:15.104782 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Mar 12 04:47:15.104966 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Mar 12 04:47:15.105233 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 12 04:47:15.105463 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 12 04:47:15.105639 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 12 04:47:15.105813 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 12 04:47:15.105992 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 12 04:47:15.107248 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 12 04:47:15.107441 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 12 04:47:15.107615 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 12 04:47:15.107789 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 12 04:47:15.107963 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 12 04:47:15.108164 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 12 04:47:15.108339 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 12 04:47:15.108527 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 12 04:47:15.108711 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 12 04:47:15.108885 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 12 04:47:15.111129 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 12 04:47:15.111312 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 12 04:47:15.111504 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 12 04:47:15.111682 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 12 04:47:15.111855 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 12 04:47:15.112047 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 12 04:47:15.112227 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 12 04:47:15.112419 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 12 04:47:15.112594 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 12 04:47:15.112778 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 12 04:47:15.112954 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 12 04:47:15.116180 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 12 04:47:15.116382 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 12 04:47:15.116562 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 12 04:47:15.116745 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 12 04:47:15.116925 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 12 04:47:15.117130 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 12 04:47:15.117387 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 12 04:47:15.117582 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 12 04:47:15.117752 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 12 04:47:15.117911 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 12 04:47:15.118092 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 12 04:47:15.118258 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Mar 12 04:47:15.118446 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 12 04:47:15.118616 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Mar 12 04:47:15.118824 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 12 04:47:15.118998 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Mar 12 04:47:15.119202 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 12 04:47:15.119402 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 12 04:47:15.119600 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Mar 12 04:47:15.119784 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 12 04:47:15.119954 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 12 04:47:15.120173 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Mar 12 04:47:15.120347 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 12 04:47:15.120531 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 12 04:47:15.120724 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Mar 12 04:47:15.120904 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 12 04:47:15.121156 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 12 04:47:15.121345 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Mar 12 04:47:15.121528 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 12 04:47:15.121693 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 12 04:47:15.121887 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Mar 12 04:47:15.122078 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Mar 12 04:47:15.122260 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 12 04:47:15.122462 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Mar 12 04:47:15.122629 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Mar 12 04:47:15.122794 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 12 04:47:15.122989 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Mar 12 04:47:15.123184 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Mar 12 04:47:15.123348 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 12 04:47:15.123388 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 12 04:47:15.123404 kernel: PCI: CLS 0 bytes, default 64
Mar 12 04:47:15.123418 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 12 04:47:15.123432 kernel: software IO TLB: mapped [mem 
0x0000000079800000-0x000000007d800000] (64MB) Mar 12 04:47:15.123445 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 12 04:47:15.123459 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns Mar 12 04:47:15.123473 kernel: Initialise system trusted keyrings Mar 12 04:47:15.123486 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 12 04:47:15.123500 kernel: Key type asymmetric registered Mar 12 04:47:15.123520 kernel: Asymmetric key parser 'x509' registered Mar 12 04:47:15.123533 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 12 04:47:15.123547 kernel: io scheduler mq-deadline registered Mar 12 04:47:15.123560 kernel: io scheduler kyber registered Mar 12 04:47:15.123573 kernel: io scheduler bfq registered Mar 12 04:47:15.123755 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 12 04:47:15.123931 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 12 04:47:15.124130 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.124318 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 12 04:47:15.124511 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 12 04:47:15.124689 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.124870 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 12 04:47:15.125061 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 12 04:47:15.125239 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.125441 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 12 04:47:15.125619 kernel: pcieport 0000:00:02.3: AER: enabled 
with IRQ 27 Mar 12 04:47:15.125799 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.125980 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 12 04:47:15.126180 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 12 04:47:15.126358 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.126563 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 12 04:47:15.126747 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 12 04:47:15.126927 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.127187 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 12 04:47:15.127362 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 12 04:47:15.127549 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.127733 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 12 04:47:15.127908 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 12 04:47:15.128097 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:47:15.128119 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 12 04:47:15.128134 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 12 04:47:15.128148 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 12 04:47:15.128161 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 12 04:47:15.128186 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 12 04:47:15.128201 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 
0x60,0x64 irq 1,12 Mar 12 04:47:15.128214 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 12 04:47:15.128227 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 12 04:47:15.128432 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 12 04:47:15.128599 kernel: rtc_cmos 00:03: registered as rtc0 Mar 12 04:47:15.128763 kernel: rtc_cmos 00:03: setting system clock to 2026-03-12T04:47:14 UTC (1773290834) Mar 12 04:47:15.128926 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 12 04:47:15.128953 kernel: intel_pstate: CPU model not supported Mar 12 04:47:15.128967 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 12 04:47:15.128981 kernel: NET: Registered PF_INET6 protocol family Mar 12 04:47:15.128994 kernel: Segment Routing with IPv6 Mar 12 04:47:15.129007 kernel: In-situ OAM (IOAM) with IPv6 Mar 12 04:47:15.129021 kernel: NET: Registered PF_PACKET protocol family Mar 12 04:47:15.129062 kernel: Key type dns_resolver registered Mar 12 04:47:15.129077 kernel: IPI shorthand broadcast: enabled Mar 12 04:47:15.129091 kernel: sched_clock: Marking stable (1421004587, 231302521)->(1787387887, -135080779) Mar 12 04:47:15.129112 kernel: registered taskstats version 1 Mar 12 04:47:15.129125 kernel: Loading compiled-in X.509 certificates Mar 12 04:47:15.129139 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 67287262975845098ef9f337a0e8baa9afd38510' Mar 12 04:47:15.129152 kernel: Key type .fscrypt registered Mar 12 04:47:15.129165 kernel: Key type fscrypt-provisioning registered Mar 12 04:47:15.129178 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 12 04:47:15.129191 kernel: ima: Allocated hash algorithm: sha1 Mar 12 04:47:15.129204 kernel: ima: No architecture policies found Mar 12 04:47:15.129217 kernel: clk: Disabling unused clocks Mar 12 04:47:15.129236 kernel: Freeing unused kernel image (initmem) memory: 42892K Mar 12 04:47:15.129249 kernel: Write protecting the kernel read-only data: 36864k Mar 12 04:47:15.129263 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K Mar 12 04:47:15.129276 kernel: Run /init as init process Mar 12 04:47:15.129290 kernel: with arguments: Mar 12 04:47:15.129303 kernel: /init Mar 12 04:47:15.129316 kernel: with environment: Mar 12 04:47:15.129328 kernel: HOME=/ Mar 12 04:47:15.129341 kernel: TERM=linux Mar 12 04:47:15.129363 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 12 04:47:15.129393 systemd[1]: Detected virtualization kvm. Mar 12 04:47:15.129408 systemd[1]: Detected architecture x86-64. Mar 12 04:47:15.129422 systemd[1]: Running in initrd. Mar 12 04:47:15.129436 systemd[1]: No hostname configured, using default hostname. Mar 12 04:47:15.129449 systemd[1]: Hostname set to . Mar 12 04:47:15.129464 systemd[1]: Initializing machine ID from VM UUID. Mar 12 04:47:15.129484 systemd[1]: Queued start job for default target initrd.target. Mar 12 04:47:15.129498 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 04:47:15.129513 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 04:47:15.129528 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Mar 12 04:47:15.129542 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 04:47:15.129557 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 12 04:47:15.129571 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 12 04:47:15.129593 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 12 04:47:15.129608 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 12 04:47:15.129622 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 04:47:15.129636 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 12 04:47:15.129650 systemd[1]: Reached target paths.target - Path Units. Mar 12 04:47:15.129664 systemd[1]: Reached target slices.target - Slice Units. Mar 12 04:47:15.129678 systemd[1]: Reached target swap.target - Swaps. Mar 12 04:47:15.129692 systemd[1]: Reached target timers.target - Timer Units. Mar 12 04:47:15.129712 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 04:47:15.129727 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 04:47:15.129746 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 12 04:47:15.129760 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 12 04:47:15.129775 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 12 04:47:15.129789 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 12 04:47:15.129803 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 04:47:15.129817 systemd[1]: Reached target sockets.target - Socket Units. 
Mar 12 04:47:15.129832 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 12 04:47:15.129852 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 12 04:47:15.129866 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 12 04:47:15.129880 systemd[1]: Starting systemd-fsck-usr.service... Mar 12 04:47:15.129895 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 12 04:47:15.129909 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 12 04:47:15.129923 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 04:47:15.129937 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 12 04:47:15.130000 systemd-journald[203]: Collecting audit messages is disabled. Mar 12 04:47:15.130053 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 04:47:15.130068 systemd[1]: Finished systemd-fsck-usr.service. Mar 12 04:47:15.130090 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 12 04:47:15.130105 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 12 04:47:15.130119 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 12 04:47:15.130134 kernel: Bridge firewalling registered Mar 12 04:47:15.130149 systemd-journald[203]: Journal started Mar 12 04:47:15.130182 systemd-journald[203]: Runtime Journal (/run/log/journal/30172283a55f480287bdda0362e3ab64) is 4.7M, max 38.0M, 33.2M free. Mar 12 04:47:15.071116 systemd-modules-load[204]: Inserted module 'overlay' Mar 12 04:47:15.117466 systemd-modules-load[204]: Inserted module 'br_netfilter' Mar 12 04:47:15.183527 systemd[1]: Started systemd-journald.service - Journal Service. 
Mar 12 04:47:15.184755 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 12 04:47:15.185774 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 04:47:15.194258 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 12 04:47:15.208496 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 12 04:47:15.215425 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 12 04:47:15.224380 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 12 04:47:15.230140 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 12 04:47:15.242299 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 04:47:15.245675 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 04:47:15.252362 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 12 04:47:15.259548 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 04:47:15.270391 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 12 04:47:15.280888 dracut-cmdline[237]: dracut-dracut-053 Mar 12 04:47:15.285100 dracut-cmdline[237]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=0e4243d51ac00bffbb09a606c7378a821ca08f30dbebc6b82c4452fcc120d7bc Mar 12 04:47:15.318171 systemd-resolved[239]: Positive Trust Anchors: Mar 12 04:47:15.319307 systemd-resolved[239]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 12 04:47:15.319355 systemd-resolved[239]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 12 04:47:15.328074 systemd-resolved[239]: Defaulting to hostname 'linux'. Mar 12 04:47:15.331116 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 12 04:47:15.332296 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 12 04:47:15.389151 kernel: SCSI subsystem initialized Mar 12 04:47:15.400094 kernel: Loading iSCSI transport class v2.0-870. Mar 12 04:47:15.415086 kernel: iscsi: registered transport (tcp) Mar 12 04:47:15.442147 kernel: iscsi: registered transport (qla4xxx) Mar 12 04:47:15.442289 kernel: QLogic iSCSI HBA Driver Mar 12 04:47:15.499689 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 12 04:47:15.507472 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 12 04:47:15.541840 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 12 04:47:15.541947 kernel: device-mapper: uevent: version 1.0.3 Mar 12 04:47:15.544228 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 12 04:47:15.594162 kernel: raid6: sse2x4 gen() 13981 MB/s Mar 12 04:47:15.613124 kernel: raid6: sse2x2 gen() 9552 MB/s Mar 12 04:47:15.631779 kernel: raid6: sse2x1 gen() 10127 MB/s Mar 12 04:47:15.631889 kernel: raid6: using algorithm sse2x4 gen() 13981 MB/s Mar 12 04:47:15.650803 kernel: raid6: .... xor() 7817 MB/s, rmw enabled Mar 12 04:47:15.650925 kernel: raid6: using ssse3x2 recovery algorithm Mar 12 04:47:15.678105 kernel: xor: automatically using best checksumming function avx Mar 12 04:47:15.875094 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 12 04:47:15.893560 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 12 04:47:15.901455 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 04:47:15.924300 systemd-udevd[422]: Using default interface naming scheme 'v255'. Mar 12 04:47:15.932058 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 04:47:15.943353 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 12 04:47:15.964782 dracut-pre-trigger[428]: rd.md=0: removing MD RAID activation Mar 12 04:47:16.009205 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 04:47:16.016264 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 12 04:47:16.152072 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 04:47:16.161561 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 12 04:47:16.199078 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 12 04:47:16.203528 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Mar 12 04:47:16.205209 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 04:47:16.205936 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 04:47:16.215486 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 12 04:47:16.247417 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 12 04:47:16.316320 kernel: cryptd: max_cpu_qlen set to 1000 Mar 12 04:47:16.327085 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Mar 12 04:47:16.339793 kernel: ACPI: bus type USB registered Mar 12 04:47:16.339892 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Mar 12 04:47:16.340234 kernel: usbcore: registered new interface driver usbfs Mar 12 04:47:16.349085 kernel: usbcore: registered new interface driver hub Mar 12 04:47:16.349526 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 12 04:47:16.357815 kernel: usbcore: registered new device driver usb Mar 12 04:47:16.357872 kernel: AVX version of gcm_enc/dec engaged. Mar 12 04:47:16.357892 kernel: AES CTR mode by8 optimization enabled Mar 12 04:47:16.349739 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 04:47:16.378644 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 12 04:47:16.378680 kernel: GPT:17805311 != 125829119 Mar 12 04:47:16.378699 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 12 04:47:16.378717 kernel: GPT:17805311 != 125829119 Mar 12 04:47:16.378747 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 12 04:47:16.378765 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 04:47:16.378263 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 12 04:47:16.379460 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Mar 12 04:47:16.379710 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 04:47:16.385554 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 04:47:16.391161 kernel: libata version 3.00 loaded. Mar 12 04:47:16.396015 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 12 04:47:16.403184 kernel: ahci 0000:00:1f.2: version 3.0 Mar 12 04:47:16.403523 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 12 04:47:16.409536 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 12 04:47:16.409893 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 12 04:47:16.416057 kernel: scsi host0: ahci Mar 12 04:47:16.420550 kernel: scsi host1: ahci Mar 12 04:47:16.420856 kernel: scsi host2: ahci Mar 12 04:47:16.423734 kernel: scsi host3: ahci Mar 12 04:47:16.423988 kernel: scsi host4: ahci Mar 12 04:47:16.427071 kernel: scsi host5: ahci Mar 12 04:47:16.437315 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Mar 12 04:47:16.437429 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Mar 12 04:47:16.437451 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Mar 12 04:47:16.437468 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Mar 12 04:47:16.437486 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Mar 12 04:47:16.437516 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Mar 12 04:47:16.470063 kernel: BTRFS: device fsid 94537345-7f6b-4b2a-965f-248bd6f0b7eb devid 1 transid 33 /dev/vda3 scanned by (udev-worker) (470) Mar 12 04:47:16.471055 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (480) Mar 12 04:47:16.498947 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. 
Mar 12 04:47:16.548625 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 04:47:16.556810 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 12 04:47:16.564260 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 12 04:47:16.570219 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 12 04:47:16.571086 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 12 04:47:16.580256 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 12 04:47:16.583882 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 12 04:47:16.589635 disk-uuid[561]: Primary Header is updated. Mar 12 04:47:16.589635 disk-uuid[561]: Secondary Entries is updated. Mar 12 04:47:16.589635 disk-uuid[561]: Secondary Header is updated. Mar 12 04:47:16.597960 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 04:47:16.607067 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 04:47:16.616594 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 12 04:47:16.748166 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 12 04:47:16.748249 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 12 04:47:16.759075 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 12 04:47:16.766060 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 12 04:47:16.766123 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 12 04:47:16.768610 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 12 04:47:16.780066 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 12 04:47:16.783056 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Mar 12 04:47:16.789070 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 12 04:47:16.789373 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 12 04:47:16.792249 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Mar 12 04:47:16.792523 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Mar 12 04:47:16.796691 kernel: hub 1-0:1.0: USB hub found Mar 12 04:47:16.797001 kernel: hub 1-0:1.0: 4 ports detected Mar 12 04:47:16.797424 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Mar 12 04:47:16.800706 kernel: hub 2-0:1.0: USB hub found Mar 12 04:47:16.800977 kernel: hub 2-0:1.0: 4 ports detected Mar 12 04:47:17.038138 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 12 04:47:17.180079 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 12 04:47:17.187199 kernel: usbcore: registered new interface driver usbhid Mar 12 04:47:17.187300 kernel: usbhid: USB HID core driver Mar 12 04:47:17.194122 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Mar 12 04:47:17.198285 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Mar 12 04:47:17.608946 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 12 04:47:17.610590 disk-uuid[562]: The operation has completed successfully. Mar 12 04:47:17.668094 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 12 04:47:17.668289 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 12 04:47:17.694280 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 12 04:47:17.702489 sh[587]: Success Mar 12 04:47:17.721147 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Mar 12 04:47:17.798431 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 12 04:47:17.802182 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 12 04:47:17.803157 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 12 04:47:17.841079 kernel: BTRFS info (device dm-0): first mount of filesystem 94537345-7f6b-4b2a-965f-248bd6f0b7eb Mar 12 04:47:17.841159 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 12 04:47:17.841180 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 12 04:47:17.844121 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 12 04:47:17.844161 kernel: BTRFS info (device dm-0): using free space tree Mar 12 04:47:17.857538 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 12 04:47:17.859095 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 12 04:47:17.865235 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 12 04:47:17.870215 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 12 04:47:17.887975 kernel: BTRFS info (device vda6): first mount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:47:17.888072 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 12 04:47:17.889116 kernel: BTRFS info (device vda6): using free space tree Mar 12 04:47:17.900086 kernel: BTRFS info (device vda6): auto enabling async discard Mar 12 04:47:17.916450 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 12 04:47:17.918809 kernel: BTRFS info (device vda6): last unmount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:47:17.928095 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 12 04:47:17.939313 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 12 04:47:18.071238 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 04:47:18.085967 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Mar 12 04:47:18.090991 ignition[686]: Ignition 2.19.0 Mar 12 04:47:18.091022 ignition[686]: Stage: fetch-offline Mar 12 04:47:18.093457 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 04:47:18.091156 ignition[686]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:18.091184 ignition[686]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:18.091408 ignition[686]: parsed url from cmdline: "" Mar 12 04:47:18.091416 ignition[686]: no config URL provided Mar 12 04:47:18.091426 ignition[686]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 04:47:18.091443 ignition[686]: no config at "/usr/lib/ignition/user.ign" Mar 12 04:47:18.091453 ignition[686]: failed to fetch config: resource requires networking Mar 12 04:47:18.092079 ignition[686]: Ignition finished successfully Mar 12 04:47:18.130355 systemd-networkd[774]: lo: Link UP Mar 12 04:47:18.130376 systemd-networkd[774]: lo: Gained carrier Mar 12 04:47:18.132834 systemd-networkd[774]: Enumeration completed Mar 12 04:47:18.132982 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 12 04:47:18.133987 systemd[1]: Reached target network.target - Network. Mar 12 04:47:18.134596 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 04:47:18.134602 systemd-networkd[774]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 12 04:47:18.136272 systemd-networkd[774]: eth0: Link UP Mar 12 04:47:18.136278 systemd-networkd[774]: eth0: Gained carrier Mar 12 04:47:18.136289 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 04:47:18.144427 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 12 04:47:18.175640 ignition[777]: Ignition 2.19.0 Mar 12 04:47:18.176101 ignition[777]: Stage: fetch Mar 12 04:47:18.176805 ignition[777]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:18.176834 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:18.176993 ignition[777]: parsed url from cmdline: "" Mar 12 04:47:18.177000 ignition[777]: no config URL provided Mar 12 04:47:18.177010 ignition[777]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 04:47:18.177027 ignition[777]: no config at "/usr/lib/ignition/user.ign" Mar 12 04:47:18.177274 ignition[777]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 12 04:47:18.177639 ignition[777]: GET error: Get "http://169.254.169.254/openstack/latest/user_data": dial tcp 169.254.169.254:80: connect: network is unreachable Mar 12 04:47:18.177698 ignition[777]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 12 04:47:18.177720 ignition[777]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Mar 12 04:47:18.200338 systemd-networkd[774]: eth0: DHCPv4 address 10.230.23.190/30, gateway 10.230.23.189 acquired from 10.230.23.189 Mar 12 04:47:18.377979 ignition[777]: GET http://169.254.169.254/openstack/latest/user_data: attempt #2 Mar 12 04:47:18.392814 ignition[777]: GET result: OK Mar 12 04:47:18.393151 ignition[777]: parsing config with SHA512: 7739fccbb15616053958cde0ef922a7da38271ce35d69a87660b38d9e0474a4927c66815e0cb0949d1b16f69908b0469c0a2ef82cad94331065af4a957c98e49 Mar 12 04:47:18.400012 unknown[777]: fetched base config from "system" Mar 12 04:47:18.400048 unknown[777]: fetched base config from "system" Mar 12 04:47:18.400671 ignition[777]: fetch: fetch complete Mar 12 04:47:18.400061 unknown[777]: fetched user config from "openstack" Mar 12 04:47:18.400681 ignition[777]: fetch: fetch passed Mar 12 04:47:18.402896 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
Mar 12 04:47:18.400755 ignition[777]: Ignition finished successfully Mar 12 04:47:18.413420 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 12 04:47:18.437873 ignition[785]: Ignition 2.19.0 Mar 12 04:47:18.437900 ignition[785]: Stage: kargs Mar 12 04:47:18.438326 ignition[785]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:18.438351 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:18.441082 ignition[785]: kargs: kargs passed Mar 12 04:47:18.441171 ignition[785]: Ignition finished successfully Mar 12 04:47:18.444446 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 12 04:47:18.450298 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 12 04:47:18.483626 ignition[791]: Ignition 2.19.0 Mar 12 04:47:18.483655 ignition[791]: Stage: disks Mar 12 04:47:18.484051 ignition[791]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:18.485617 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:18.488842 ignition[791]: disks: disks passed Mar 12 04:47:18.488935 ignition[791]: Ignition finished successfully Mar 12 04:47:18.491489 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 12 04:47:18.493383 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 12 04:47:18.495142 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 12 04:47:18.496097 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 12 04:47:18.497718 systemd[1]: Reached target sysinit.target - System Initialization. Mar 12 04:47:18.499258 systemd[1]: Reached target basic.target - Basic System. Mar 12 04:47:18.506359 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 12 04:47:18.530400 systemd-fsck[799]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 12 04:47:18.534629 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 12 04:47:18.544208 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 12 04:47:18.663067 kernel: EXT4-fs (vda9): mounted filesystem f90926b1-4cc2-4a2d-8c45-4ec584c98779 r/w with ordered data mode. Quota mode: none. Mar 12 04:47:18.664881 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 12 04:47:18.666454 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 12 04:47:18.674330 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 04:47:18.679196 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 12 04:47:18.680419 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 12 04:47:18.683232 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 12 04:47:18.684251 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 12 04:47:18.684367 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 04:47:18.697636 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 12 04:47:18.699820 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (807) Mar 12 04:47:18.705881 kernel: BTRFS info (device vda6): first mount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:47:18.705993 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 12 04:47:18.706025 kernel: BTRFS info (device vda6): using free space tree Mar 12 04:47:18.708954 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Mar 12 04:47:18.716064 kernel: BTRFS info (device vda6): auto enabling async discard Mar 12 04:47:18.718969 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 12 04:47:18.804454 initrd-setup-root[836]: cut: /sysroot/etc/passwd: No such file or directory Mar 12 04:47:18.820027 initrd-setup-root[843]: cut: /sysroot/etc/group: No such file or directory Mar 12 04:47:18.830233 initrd-setup-root[850]: cut: /sysroot/etc/shadow: No such file or directory Mar 12 04:47:18.840288 initrd-setup-root[857]: cut: /sysroot/etc/gshadow: No such file or directory Mar 12 04:47:18.955494 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 12 04:47:18.963243 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 12 04:47:18.967260 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 12 04:47:18.979093 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 12 04:47:18.983084 kernel: BTRFS info (device vda6): last unmount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:47:19.011167 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 12 04:47:19.022411 ignition[924]: INFO : Ignition 2.19.0 Mar 12 04:47:19.024159 ignition[924]: INFO : Stage: mount Mar 12 04:47:19.024159 ignition[924]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:19.024159 ignition[924]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:19.029157 ignition[924]: INFO : mount: mount passed Mar 12 04:47:19.029157 ignition[924]: INFO : Ignition finished successfully Mar 12 04:47:19.029639 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 12 04:47:19.346412 systemd-networkd[774]: eth0: Gained IPv6LL Mar 12 04:47:20.855515 systemd-networkd[774]: eth0: Ignoring DHCPv6 address 2a02:1348:179:85ef:24:19ff:fee6:17be/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:85ef:24:19ff:fee6:17be/64 assigned by NDisc. 
Mar 12 04:47:20.855533 systemd-networkd[774]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 12 04:47:25.883091 coreos-metadata[809]: Mar 12 04:47:25.882 WARN failed to locate config-drive, using the metadata service API instead Mar 12 04:47:25.907756 coreos-metadata[809]: Mar 12 04:47:25.907 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 12 04:47:25.924186 coreos-metadata[809]: Mar 12 04:47:25.924 INFO Fetch successful Mar 12 04:47:25.926153 coreos-metadata[809]: Mar 12 04:47:25.926 INFO wrote hostname srv-1ee83.gb1.brightbox.com to /sysroot/etc/hostname Mar 12 04:47:25.928687 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 12 04:47:25.928870 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Mar 12 04:47:25.940283 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 12 04:47:25.966386 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 04:47:25.980680 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (942) Mar 12 04:47:25.980784 kernel: BTRFS info (device vda6): first mount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:47:25.983267 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 12 04:47:25.985290 kernel: BTRFS info (device vda6): using free space tree Mar 12 04:47:25.991054 kernel: BTRFS info (device vda6): auto enabling async discard Mar 12 04:47:25.994638 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 12 04:47:26.032246 ignition[960]: INFO : Ignition 2.19.0 Mar 12 04:47:26.034147 ignition[960]: INFO : Stage: files Mar 12 04:47:26.034147 ignition[960]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:26.034147 ignition[960]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:26.038624 ignition[960]: DEBUG : files: compiled without relabeling support, skipping Mar 12 04:47:26.039608 ignition[960]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 12 04:47:26.039608 ignition[960]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 12 04:47:26.045323 ignition[960]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 12 04:47:26.046529 ignition[960]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 12 04:47:26.048209 unknown[960]: wrote ssh authorized keys file for user: core Mar 12 04:47:26.049312 ignition[960]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 12 04:47:26.050889 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 12 04:47:26.052316 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Mar 12 04:47:26.300768 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 12 04:47:26.603911 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 12 04:47:26.603911 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 12 04:47:26.603911 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Mar 12 04:47:26.615430 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1
Mar 12 04:47:27.037665 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 12 04:47:29.423005 ignition[960]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Mar 12 04:47:29.423005 ignition[960]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 12 04:47:29.430992 ignition[960]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 04:47:29.430992 ignition[960]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 04:47:29.430992 ignition[960]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 12 04:47:29.430992 ignition[960]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 12 04:47:29.430992 ignition[960]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 12 04:47:29.430992 ignition[960]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 12 04:47:29.430992 ignition[960]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 12 04:47:29.430992 ignition[960]: INFO : files: files passed Mar 12 04:47:29.430992 ignition[960]: INFO : Ignition finished successfully Mar 12 04:47:29.429709 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 12 04:47:29.442453 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 12 04:47:29.453788 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 12 04:47:29.459828 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 12 04:47:29.460022 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 12 04:47:29.473150 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 04:47:29.473150 initrd-setup-root-after-ignition[989]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 12 04:47:29.476933 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 04:47:29.478756 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 04:47:29.480388 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 12 04:47:29.487331 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 12 04:47:29.536831 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 12 04:47:29.537050 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 12 04:47:29.539001 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 12 04:47:29.540442 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 12 04:47:29.542108 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 12 04:47:29.548289 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 12 04:47:29.571093 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 04:47:29.576263 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 12 04:47:29.601299 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 12 04:47:29.602304 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 12 04:47:29.603986 systemd[1]: Stopped target timers.target - Timer Units. Mar 12 04:47:29.605618 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 12 04:47:29.605808 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 04:47:29.607713 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 12 04:47:29.608729 systemd[1]: Stopped target basic.target - Basic System. Mar 12 04:47:29.610325 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 12 04:47:29.611841 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 04:47:29.613306 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 12 04:47:29.614874 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 12 04:47:29.616487 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 04:47:29.618116 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 12 04:47:29.619596 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 12 04:47:29.621233 systemd[1]: Stopped target swap.target - Swaps. Mar 12 04:47:29.622698 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 12 04:47:29.622884 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 12 04:47:29.624770 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 12 04:47:29.625833 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 04:47:29.627212 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 12 04:47:29.627618 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 04:47:29.628867 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 12 04:47:29.629109 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Mar 12 04:47:29.631184 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 12 04:47:29.631376 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 04:47:29.633140 systemd[1]: ignition-files.service: Deactivated successfully. Mar 12 04:47:29.633301 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 12 04:47:29.641333 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 12 04:47:29.642174 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 12 04:47:29.642446 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 04:47:29.654188 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 12 04:47:29.655001 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 12 04:47:29.655322 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 04:47:29.657608 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 12 04:47:29.657864 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 04:47:29.668392 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 12 04:47:29.668594 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 12 04:47:29.684128 ignition[1013]: INFO : Ignition 2.19.0 Mar 12 04:47:29.684128 ignition[1013]: INFO : Stage: umount Mar 12 04:47:29.684128 ignition[1013]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 04:47:29.684128 ignition[1013]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:47:29.690999 ignition[1013]: INFO : umount: umount passed Mar 12 04:47:29.690999 ignition[1013]: INFO : Ignition finished successfully Mar 12 04:47:29.687872 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 12 04:47:29.688511 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
Mar 12 04:47:29.692849 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 12 04:47:29.694547 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 12 04:47:29.694621 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 12 04:47:29.696004 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 12 04:47:29.696334 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 12 04:47:29.697552 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 12 04:47:29.697626 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 12 04:47:29.704925 systemd[1]: Stopped target network.target - Network. Mar 12 04:47:29.709533 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 12 04:47:29.709667 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 04:47:29.711586 systemd[1]: Stopped target paths.target - Path Units. Mar 12 04:47:29.713091 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 12 04:47:29.718300 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 04:47:29.720152 systemd[1]: Stopped target slices.target - Slice Units. Mar 12 04:47:29.721972 systemd[1]: Stopped target sockets.target - Socket Units. Mar 12 04:47:29.723671 systemd[1]: iscsid.socket: Deactivated successfully. Mar 12 04:47:29.723781 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 04:47:29.725013 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 12 04:47:29.725128 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 04:47:29.726417 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 12 04:47:29.726511 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 12 04:47:29.727832 systemd[1]: ignition-setup-pre.service: Deactivated successfully. 
Mar 12 04:47:29.727906 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 12 04:47:29.729913 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 12 04:47:29.732850 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 12 04:47:29.734756 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 12 04:47:29.734917 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 12 04:47:29.735448 systemd-networkd[774]: eth0: DHCPv6 lease lost Mar 12 04:47:29.737820 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 12 04:47:29.738012 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 12 04:47:29.743339 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 12 04:47:29.743414 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 12 04:47:29.744345 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 12 04:47:29.744424 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 12 04:47:29.752261 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 12 04:47:29.753025 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 12 04:47:29.753241 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 04:47:29.756845 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 04:47:29.762980 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 12 04:47:29.763324 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 12 04:47:29.776760 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 12 04:47:29.777121 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 04:47:29.781642 systemd[1]: network-cleanup.service: Deactivated successfully. 
Mar 12 04:47:29.781814 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 12 04:47:29.785199 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 12 04:47:29.785304 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 12 04:47:29.788650 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 12 04:47:29.788739 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 04:47:29.789488 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 12 04:47:29.789570 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 12 04:47:29.791444 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 12 04:47:29.791529 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 12 04:47:29.792966 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 12 04:47:29.793070 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 04:47:29.802314 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 12 04:47:29.803250 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 12 04:47:29.803356 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 12 04:47:29.810639 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 12 04:47:29.810750 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 12 04:47:29.812456 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 12 04:47:29.812529 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 04:47:29.815464 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 12 04:47:29.815536 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 04:47:29.816346 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Mar 12 04:47:29.816515 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 04:47:29.819881 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 12 04:47:29.820075 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 12 04:47:29.821670 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 12 04:47:29.829362 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 12 04:47:29.842816 systemd[1]: Switching root. Mar 12 04:47:29.881930 systemd-journald[203]: Journal stopped Mar 12 04:47:31.425900 systemd-journald[203]: Received SIGTERM from PID 1 (systemd). Mar 12 04:47:31.426028 kernel: SELinux: policy capability network_peer_controls=1 Mar 12 04:47:31.426089 kernel: SELinux: policy capability open_perms=1 Mar 12 04:47:31.426112 kernel: SELinux: policy capability extended_socket_class=1 Mar 12 04:47:31.426151 kernel: SELinux: policy capability always_check_network=0 Mar 12 04:47:31.426172 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 12 04:47:31.426191 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 12 04:47:31.426210 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 12 04:47:31.426235 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 12 04:47:31.426257 systemd[1]: Successfully loaded SELinux policy in 51.356ms. Mar 12 04:47:31.426288 kernel: audit: type=1403 audit(1773290850.139:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 12 04:47:31.426309 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.488ms. 
Mar 12 04:47:31.426344 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 12 04:47:31.426369 systemd[1]: Detected virtualization kvm. Mar 12 04:47:31.426390 systemd[1]: Detected architecture x86-64. Mar 12 04:47:31.426411 systemd[1]: Detected first boot. Mar 12 04:47:31.426437 systemd[1]: Hostname set to . Mar 12 04:47:31.426459 systemd[1]: Initializing machine ID from VM UUID. Mar 12 04:47:31.426480 zram_generator::config[1056]: No configuration found. Mar 12 04:47:31.426509 systemd[1]: Populated /etc with preset unit settings. Mar 12 04:47:31.426542 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 12 04:47:31.426573 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 12 04:47:31.426596 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 12 04:47:31.426618 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 12 04:47:31.426646 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 12 04:47:31.426668 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 12 04:47:31.426695 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 12 04:47:31.426716 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 12 04:47:31.426737 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 12 04:47:31.426771 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 12 04:47:31.426793 systemd[1]: Created slice user.slice - User and Session Slice. 
Mar 12 04:47:31.426814 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 04:47:31.426835 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 04:47:31.426856 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 12 04:47:31.426876 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 12 04:47:31.426897 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 12 04:47:31.426918 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 04:47:31.426951 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 12 04:47:31.426986 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 04:47:31.427009 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 12 04:47:31.431942 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 12 04:47:31.431998 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 12 04:47:31.432023 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 12 04:47:31.432074 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 04:47:31.432120 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 04:47:31.432151 systemd[1]: Reached target slices.target - Slice Units. Mar 12 04:47:31.432174 systemd[1]: Reached target swap.target - Swaps. Mar 12 04:47:31.432195 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 12 04:47:31.432215 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 12 04:47:31.432236 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Mar 12 04:47:31.432256 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 12 04:47:31.432277 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 04:47:31.432311 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 12 04:47:31.432347 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 12 04:47:31.432382 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 12 04:47:31.432407 systemd[1]: Mounting media.mount - External Media Directory...
Mar 12 04:47:31.432435 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:47:31.432464 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 12 04:47:31.432486 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 12 04:47:31.432519 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 12 04:47:31.432543 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 12 04:47:31.432566 systemd[1]: Reached target machines.target - Containers.
Mar 12 04:47:31.432586 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 12 04:47:31.432607 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 04:47:31.432628 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 04:47:31.432648 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 12 04:47:31.432668 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 04:47:31.432702 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 04:47:31.432725 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 04:47:31.432746 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 12 04:47:31.432767 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 04:47:31.432787 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 12 04:47:31.432808 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 12 04:47:31.432829 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 12 04:47:31.432850 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 12 04:47:31.432871 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 12 04:47:31.432904 kernel: fuse: init (API version 7.39)
Mar 12 04:47:31.432927 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 04:47:31.432948 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 04:47:31.432969 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 12 04:47:31.432998 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 12 04:47:31.433020 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 04:47:31.433087 kernel: loop: module loaded
Mar 12 04:47:31.433120 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 12 04:47:31.433143 systemd[1]: Stopped verity-setup.service.
Mar 12 04:47:31.433179 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:47:31.433247 systemd-journald[1156]: Collecting audit messages is disabled.
Mar 12 04:47:31.433286 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 12 04:47:31.433308 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 12 04:47:31.433329 systemd[1]: Mounted media.mount - External Media Directory.
Mar 12 04:47:31.433366 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 12 04:47:31.433389 systemd-journald[1156]: Journal started
Mar 12 04:47:31.433422 systemd-journald[1156]: Runtime Journal (/run/log/journal/30172283a55f480287bdda0362e3ab64) is 4.7M, max 38.0M, 33.2M free.
Mar 12 04:47:30.994441 systemd[1]: Queued start job for default target multi-user.target.
Mar 12 04:47:31.015622 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 12 04:47:31.440477 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 04:47:31.016438 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 12 04:47:31.436686 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 12 04:47:31.437659 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 12 04:47:31.438733 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 12 04:47:31.439881 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 04:47:31.441987 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 12 04:47:31.442242 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 12 04:47:31.443792 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 04:47:31.445102 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 04:47:31.446528 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 04:47:31.446748 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 04:47:31.448024 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 12 04:47:31.449507 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 12 04:47:31.450983 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 04:47:31.451249 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 04:47:31.452348 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 04:47:31.454902 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 04:47:31.456494 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 12 04:47:31.472421 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 12 04:47:31.484151 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 12 04:47:31.497896 kernel: ACPI: bus type drm_connector registered
Mar 12 04:47:31.494313 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 12 04:47:31.497163 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 12 04:47:31.497224 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 12 04:47:31.500019 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 12 04:47:31.510520 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 12 04:47:31.515181 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 12 04:47:31.516118 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 04:47:31.518153 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 12 04:47:31.526452 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 12 04:47:31.528707 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 04:47:31.530891 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 12 04:47:31.532171 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 04:47:31.543359 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 04:47:31.549378 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 12 04:47:31.554266 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 12 04:47:31.559814 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 04:47:31.560124 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 04:47:31.562711 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 12 04:47:31.564749 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 12 04:47:31.567140 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 12 04:47:31.586492 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 12 04:47:31.587702 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 12 04:47:31.598833 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 12 04:47:31.601099 systemd-journald[1156]: Time spent on flushing to /var/log/journal/30172283a55f480287bdda0362e3ab64 is 158.198ms for 1141 entries.
Mar 12 04:47:31.601099 systemd-journald[1156]: System Journal (/var/log/journal/30172283a55f480287bdda0362e3ab64) is 8.0M, max 584.8M, 576.8M free.
Mar 12 04:47:31.800201 systemd-journald[1156]: Received client request to flush runtime journal.
Mar 12 04:47:31.800278 kernel: loop0: detected capacity change from 0 to 140768
Mar 12 04:47:31.800311 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 12 04:47:31.800336 kernel: loop1: detected capacity change from 0 to 142488
Mar 12 04:47:31.658803 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 04:47:31.666892 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 12 04:47:31.670143 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 12 04:47:31.691563 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 12 04:47:31.706278 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 04:47:31.788342 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 04:47:31.804369 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 12 04:47:31.806198 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 12 04:47:31.840080 kernel: loop2: detected capacity change from 0 to 8
Mar 12 04:47:31.844579 systemd-tmpfiles[1200]: ACLs are not supported, ignoring.
Mar 12 04:47:31.845004 systemd-tmpfiles[1200]: ACLs are not supported, ignoring.
Mar 12 04:47:31.864985 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 04:47:31.878348 udevadm[1205]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 12 04:47:31.895086 kernel: loop3: detected capacity change from 0 to 219192
Mar 12 04:47:31.959072 kernel: loop4: detected capacity change from 0 to 140768
Mar 12 04:47:32.007991 kernel: loop5: detected capacity change from 0 to 142488
Mar 12 04:47:32.029774 kernel: loop6: detected capacity change from 0 to 8
Mar 12 04:47:32.029876 kernel: loop7: detected capacity change from 0 to 219192
Mar 12 04:47:32.052387 (sd-merge)[1214]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 12 04:47:32.053401 (sd-merge)[1214]: Merged extensions into '/usr'.
Mar 12 04:47:32.069199 systemd[1]: Reloading requested from client PID 1188 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 12 04:47:32.069235 systemd[1]: Reloading...
Mar 12 04:47:32.242081 zram_generator::config[1238]: No configuration found.
Mar 12 04:47:32.447191 ldconfig[1183]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 12 04:47:32.542962 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:47:32.613468 systemd[1]: Reloading finished in 543 ms.
Mar 12 04:47:32.658603 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 12 04:47:32.662998 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 12 04:47:32.664742 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 12 04:47:32.678563 systemd[1]: Starting ensure-sysext.service...
Mar 12 04:47:32.686450 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 04:47:32.697446 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 04:47:32.705270 systemd[1]: Reloading requested from client PID 1297 ('systemctl') (unit ensure-sysext.service)...
Mar 12 04:47:32.705317 systemd[1]: Reloading...
Mar 12 04:47:32.754551 systemd-udevd[1299]: Using default interface naming scheme 'v255'.
Mar 12 04:47:32.757894 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 12 04:47:32.760425 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 12 04:47:32.761994 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 12 04:47:32.766438 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
Mar 12 04:47:32.766568 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
Mar 12 04:47:32.777416 systemd-tmpfiles[1298]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 04:47:32.777436 systemd-tmpfiles[1298]: Skipping /boot
Mar 12 04:47:32.824910 systemd-tmpfiles[1298]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 04:47:32.824932 systemd-tmpfiles[1298]: Skipping /boot
Mar 12 04:47:32.867067 zram_generator::config[1341]: No configuration found.
Mar 12 04:47:33.011082 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (1330)
Mar 12 04:47:33.061127 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 12 04:47:33.070094 kernel: ACPI: button: Power Button [PWRF]
Mar 12 04:47:33.170186 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:47:33.201124 kernel: mousedev: PS/2 mouse device common for all mice
Mar 12 04:47:33.212154 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 12 04:47:33.230112 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Mar 12 04:47:33.237541 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 12 04:47:33.237902 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 12 04:47:33.302293 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 12 04:47:33.303417 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 12 04:47:33.304764 systemd[1]: Reloading finished in 598 ms.
Mar 12 04:47:33.328614 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 04:47:33.335852 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 04:47:33.389752 systemd[1]: Finished ensure-sysext.service.
Mar 12 04:47:33.394594 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:47:33.421572 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 12 04:47:33.428378 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 12 04:47:33.430505 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 04:47:33.441445 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 04:47:33.447563 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 04:47:33.453357 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 04:47:33.467339 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 04:47:33.468345 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 04:47:33.473717 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 12 04:47:33.488309 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 12 04:47:33.497354 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 04:47:33.507358 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 04:47:33.520337 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 12 04:47:33.530278 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 12 04:47:33.535747 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 04:47:33.536533 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:47:33.540577 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 04:47:33.540821 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 04:47:33.542117 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 04:47:33.543273 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 04:47:33.544619 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 04:47:33.546110 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 04:47:33.547380 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 04:47:33.549225 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 04:47:33.564581 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 04:47:33.564831 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 04:47:33.575561 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 12 04:47:33.591225 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 12 04:47:33.595511 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 12 04:47:33.600659 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 12 04:47:33.620554 augenrules[1444]: No rules
Mar 12 04:47:33.623175 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 12 04:47:33.631858 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 12 04:47:33.668932 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 12 04:47:33.686382 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 12 04:47:33.717579 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 12 04:47:33.731970 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 12 04:47:33.748382 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 12 04:47:33.749452 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 12 04:47:33.777426 lvm[1458]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 12 04:47:33.824103 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 12 04:47:33.903387 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 04:47:33.912301 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 12 04:47:33.914629 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 04:47:33.918257 systemd-networkd[1425]: lo: Link UP
Mar 12 04:47:33.918270 systemd-networkd[1425]: lo: Gained carrier
Mar 12 04:47:33.923564 systemd-networkd[1425]: Enumeration completed
Mar 12 04:47:33.924111 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 04:47:33.936410 lvm[1469]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 12 04:47:33.924230 systemd-networkd[1425]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 04:47:33.924237 systemd-networkd[1425]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 04:47:33.935397 systemd-networkd[1425]: eth0: Link UP
Mar 12 04:47:33.935424 systemd-networkd[1425]: eth0: Gained carrier
Mar 12 04:47:33.935465 systemd-networkd[1425]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 04:47:33.952411 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 12 04:47:33.962507 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 12 04:47:33.963723 systemd[1]: Reached target time-set.target - System Time Set.
Mar 12 04:47:33.972114 systemd-resolved[1427]: Positive Trust Anchors:
Mar 12 04:47:33.972144 systemd-resolved[1427]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 04:47:33.972194 systemd-resolved[1427]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 04:47:33.980263 systemd-networkd[1425]: eth0: DHCPv4 address 10.230.23.190/30, gateway 10.230.23.189 acquired from 10.230.23.189
Mar 12 04:47:33.981551 systemd-timesyncd[1429]: Network configuration changed, trying to establish connection.
Mar 12 04:47:33.985460 systemd-resolved[1427]: Using system hostname 'srv-1ee83.gb1.brightbox.com'.
Mar 12 04:47:33.989658 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 12 04:47:33.990779 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 04:47:33.991854 systemd[1]: Reached target network.target - Network.
Mar 12 04:47:33.992583 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 04:47:33.993405 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 12 04:47:33.994319 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 12 04:47:33.995637 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 12 04:47:33.996820 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 12 04:47:33.997736 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 12 04:47:33.998556 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 12 04:47:33.999360 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 12 04:47:33.999418 systemd[1]: Reached target paths.target - Path Units.
Mar 12 04:47:34.000090 systemd[1]: Reached target timers.target - Timer Units.
Mar 12 04:47:34.001924 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 12 04:47:34.005954 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 12 04:47:34.015145 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 12 04:47:34.016907 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 12 04:47:34.017856 systemd[1]: Reached target sockets.target - Socket Units.
Mar 12 04:47:34.018682 systemd[1]: Reached target basic.target - Basic System.
Mar 12 04:47:34.019447 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 12 04:47:34.019521 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 12 04:47:34.021674 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 12 04:47:34.026363 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 12 04:47:34.034356 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 12 04:47:34.039090 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 12 04:47:34.043586 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 12 04:47:34.045474 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 12 04:47:34.052385 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 12 04:47:34.065311 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 12 04:47:34.075388 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 12 04:47:34.083335 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 12 04:47:34.095260 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 12 04:47:34.097029 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 12 04:47:34.097863 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 12 04:47:34.100679 systemd[1]: Starting update-engine.service - Update Engine...
Mar 12 04:47:34.114797 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 12 04:47:34.115515 jq[1479]: false
Mar 12 04:47:34.126693 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 12 04:47:34.127153 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 12 04:47:34.143196 dbus-daemon[1478]: [system] SELinux support is enabled
Mar 12 04:47:34.144168 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 12 04:47:34.147707 dbus-daemon[1478]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1425 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Mar 12 04:47:34.150695 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 12 04:47:34.150750 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 12 04:47:34.152275 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 12 04:47:34.152317 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 12 04:47:34.155683 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 12 04:47:34.156003 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 12 04:47:34.161908 dbus-daemon[1478]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found loop4
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found loop5
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found loop6
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found loop7
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda1
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda2
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda3
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found usr
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda4
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda6
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda7
Mar 12 04:47:34.175672 extend-filesystems[1480]: Found vda9
Mar 12 04:47:34.216129 extend-filesystems[1480]: Checking size of /dev/vda9
Mar 12 04:47:34.219153 tar[1493]: linux-amd64/LICENSE
Mar 12 04:47:34.219153 tar[1493]: linux-amd64/helm
Mar 12 04:47:34.182356 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Mar 12 04:47:34.219834 jq[1490]: true
Mar 12 04:47:34.249128 extend-filesystems[1480]: Resized partition /dev/vda9
Mar 12 04:47:34.241583 (ntainerd)[1510]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 12 04:47:34.254159 systemd[1]: motdgen.service: Deactivated successfully.
Mar 12 04:47:34.259750 extend-filesystems[1516]: resize2fs 1.47.1 (20-May-2024)
Mar 12 04:47:34.313291 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks
Mar 12 04:47:34.255919 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 12 04:47:34.313446 update_engine[1488]: I20260312 04:47:34.270843 1488 main.cc:92] Flatcar Update Engine starting
Mar 12 04:47:34.313446 update_engine[1488]: I20260312 04:47:34.290372 1488 update_check_scheduler.cc:74] Next update check in 11m30s
Mar 12 04:47:34.289182 systemd[1]: Started update-engine.service - Update Engine.
Mar 12 04:47:34.313936 jq[1511]: true
Mar 12 04:47:34.340314 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 12 04:47:34.370541 systemd-timesyncd[1429]: Contacted time server 162.159.200.123:123 (0.flatcar.pool.ntp.org).
Mar 12 04:47:34.370643 systemd-timesyncd[1429]: Initial clock synchronization to Thu 2026-03-12 04:47:34.688211 UTC.
Mar 12 04:47:34.410211 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (1334)
Mar 12 04:47:34.463326 systemd-logind[1487]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 12 04:47:34.463374 systemd-logind[1487]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 12 04:47:34.467733 systemd-logind[1487]: New seat seat0.
Mar 12 04:47:34.475112 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 12 04:47:34.531557 locksmithd[1520]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 12 04:47:34.578639 bash[1539]: Updated "/home/core/.ssh/authorized_keys"
Mar 12 04:47:34.578835 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 12 04:47:34.581203 dbus-daemon[1478]: [system] Successfully activated service 'org.freedesktop.hostname1'
Mar 12 04:47:34.582108 dbus-daemon[1478]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1507 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0")
Mar 12 04:47:34.591487 systemd[1]: Starting sshkeys.service...
Mar 12 04:47:34.592559 systemd[1]: Started systemd-hostnamed.service - Hostname Service.
Mar 12 04:47:34.606901 systemd[1]: Starting polkit.service - Authorization Manager...
Mar 12 04:47:34.634181 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 12 04:47:34.646315 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 12 04:47:34.673914 polkitd[1547]: Started polkitd version 121
Mar 12 04:47:34.705071 kernel: EXT4-fs (vda9): resized filesystem to 15121403
Mar 12 04:47:34.706828 polkitd[1547]: Loading rules from directory /etc/polkit-1/rules.d
Mar 12 04:47:34.706975 polkitd[1547]: Loading rules from directory /usr/share/polkit-1/rules.d
Mar 12 04:47:34.721176 polkitd[1547]: Finished loading, compiling and executing 2 rules
Mar 12 04:47:34.727159 systemd[1]: Started polkit.service - Authorization Manager.
Mar 12 04:47:34.726237 dbus-daemon[1478]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 12 04:47:34.730809 polkitd[1547]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 12 04:47:34.751403 extend-filesystems[1516]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 12 04:47:34.751403 extend-filesystems[1516]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 12 04:47:34.751403 extend-filesystems[1516]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Mar 12 04:47:34.765440 extend-filesystems[1480]: Resized filesystem in /dev/vda9 Mar 12 04:47:34.755526 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 12 04:47:34.756195 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 12 04:47:34.781291 systemd-hostnamed[1507]: Hostname set to (static) Mar 12 04:47:34.801069 containerd[1510]: time="2026-03-12T04:47:34.800660306Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 12 04:47:34.842078 containerd[1510]: time="2026-03-12T04:47:34.840542891Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:47:34.844299 containerd[1510]: time="2026-03-12T04:47:34.844241691Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:47:34.844417 containerd[1510]: time="2026-03-12T04:47:34.844391547Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 12 04:47:34.844546 containerd[1510]: time="2026-03-12T04:47:34.844519387Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." 
type=io.containerd.internal.v1 Mar 12 04:47:34.844907 containerd[1510]: time="2026-03-12T04:47:34.844877736Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 12 04:47:34.845066 containerd[1510]: time="2026-03-12T04:47:34.845006172Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 12 04:47:34.845258 containerd[1510]: time="2026-03-12T04:47:34.845228772Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:47:34.845421 containerd[1510]: time="2026-03-12T04:47:34.845356373Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:47:34.845789 containerd[1510]: time="2026-03-12T04:47:34.845757706Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:47:34.845883 containerd[1510]: time="2026-03-12T04:47:34.845859773Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 12 04:47:34.845989 containerd[1510]: time="2026-03-12T04:47:34.845962474Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:47:34.846661 containerd[1510]: time="2026-03-12T04:47:34.846069575Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 12 04:47:34.846661 containerd[1510]: time="2026-03-12T04:47:34.846205259Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." 
type=io.containerd.snapshotter.v1 Mar 12 04:47:34.846661 containerd[1510]: time="2026-03-12T04:47:34.846579526Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:47:34.846947 containerd[1510]: time="2026-03-12T04:47:34.846917140Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:47:34.847105 containerd[1510]: time="2026-03-12T04:47:34.847070416Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 12 04:47:34.847473 containerd[1510]: time="2026-03-12T04:47:34.847438183Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 12 04:47:34.847714 containerd[1510]: time="2026-03-12T04:47:34.847676797Z" level=info msg="metadata content store policy set" policy=shared Mar 12 04:47:34.852301 containerd[1510]: time="2026-03-12T04:47:34.852183539Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 12 04:47:34.852917 containerd[1510]: time="2026-03-12T04:47:34.852398805Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 12 04:47:34.852917 containerd[1510]: time="2026-03-12T04:47:34.852434509Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 12 04:47:34.852917 containerd[1510]: time="2026-03-12T04:47:34.852462915Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 12 04:47:34.852917 containerd[1510]: time="2026-03-12T04:47:34.852500586Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Mar 12 04:47:34.852917 containerd[1510]: time="2026-03-12T04:47:34.852745067Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 12 04:47:34.854123 containerd[1510]: time="2026-03-12T04:47:34.854094262Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 12 04:47:34.854408 containerd[1510]: time="2026-03-12T04:47:34.854379691Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854487034Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854517767Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854539881Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854560149Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854580152Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854602027Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854623877Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." 
type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854643829Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854662562Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854696312Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854737668Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854761832Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854781560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855228 containerd[1510]: time="2026-03-12T04:47:34.854802332Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854821284Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854848940Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854869320Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854889368Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." 
type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854909514Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854933188Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854953316Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.854972590Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.855007942Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.855057545Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.855092685Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.855113915Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.855696 containerd[1510]: time="2026-03-12T04:47:34.855132505Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856172066Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856349903Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856377889Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856399640Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856416501Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856435809Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856460919Z" level=info msg="NRI interface is disabled by configuration." Mar 12 04:47:34.857057 containerd[1510]: time="2026-03-12T04:47:34.856487570Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 12 04:47:34.857649 containerd[1510]: time="2026-03-12T04:47:34.856977586Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 12 04:47:34.858673 containerd[1510]: time="2026-03-12T04:47:34.858061450Z" level=info msg="Connect containerd service" Mar 12 04:47:34.858673 containerd[1510]: time="2026-03-12T04:47:34.858150040Z" level=info msg="using legacy CRI server" Mar 12 04:47:34.858673 containerd[1510]: time="2026-03-12T04:47:34.858175252Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 12 04:47:34.858673 containerd[1510]: time="2026-03-12T04:47:34.858361663Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 12 04:47:34.862586 containerd[1510]: time="2026-03-12T04:47:34.862390574Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863556446Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863647468Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863717721Z" level=info msg="Start subscribing containerd event" Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863787608Z" level=info msg="Start recovering state" Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863915008Z" level=info msg="Start event monitor" Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863956786Z" level=info msg="Start snapshots syncer" Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.863975633Z" level=info msg="Start cni network conf syncer for default" Mar 12 04:47:34.864060 containerd[1510]: time="2026-03-12T04:47:34.864006400Z" level=info msg="Start streaming server" Mar 12 04:47:34.867018 systemd[1]: Started containerd.service - containerd container runtime. Mar 12 04:47:34.870060 containerd[1510]: time="2026-03-12T04:47:34.868964475Z" level=info msg="containerd successfully booted in 0.070415s" Mar 12 04:47:34.922332 sshd_keygen[1513]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 12 04:47:34.954116 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 12 04:47:34.966639 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 12 04:47:34.976811 systemd[1]: issuegen.service: Deactivated successfully. Mar 12 04:47:34.977233 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 12 04:47:34.988223 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 12 04:47:35.002447 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 12 04:47:35.013737 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 12 04:47:35.019043 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 12 04:47:35.020524 systemd[1]: Reached target getty.target - Login Prompts. 
Mar 12 04:47:35.090480 systemd-networkd[1425]: eth0: Gained IPv6LL Mar 12 04:47:35.097357 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 12 04:47:35.101412 systemd[1]: Reached target network-online.target - Network is Online. Mar 12 04:47:35.114326 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 04:47:35.124539 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 12 04:47:35.184509 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 12 04:47:35.397298 tar[1493]: linux-amd64/README.md Mar 12 04:47:35.414615 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 12 04:47:36.193025 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 04:47:36.208169 (kubelet)[1601]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 04:47:36.599941 systemd-networkd[1425]: eth0: Ignoring DHCPv6 address 2a02:1348:179:85ef:24:19ff:fee6:17be/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:85ef:24:19ff:fee6:17be/64 assigned by NDisc. Mar 12 04:47:36.599954 systemd-networkd[1425]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 12 04:47:36.795785 kubelet[1601]: E0312 04:47:36.795637 1601 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 04:47:36.798973 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 04:47:36.799413 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 12 04:47:36.800203 systemd[1]: kubelet.service: Consumed 1.041s CPU time. Mar 12 04:47:39.514040 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 12 04:47:39.529044 systemd[1]: Started sshd@0-10.230.23.190:22-20.161.92.111:51844.service - OpenSSH per-connection server daemon (20.161.92.111:51844). Mar 12 04:47:40.078980 login[1580]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 12 04:47:40.079449 login[1579]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 12 04:47:40.097425 systemd-logind[1487]: New session 2 of user core. Mar 12 04:47:40.102401 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 12 04:47:40.119859 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 12 04:47:40.125582 sshd[1613]: Accepted publickey for core from 20.161.92.111 port 51844 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:40.129237 sshd[1613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:40.131762 systemd-logind[1487]: New session 1 of user core. Mar 12 04:47:40.143343 systemd-logind[1487]: New session 3 of user core. Mar 12 04:47:40.150896 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 12 04:47:40.160819 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 12 04:47:40.180434 (systemd)[1621]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 12 04:47:40.385074 systemd[1621]: Queued start job for default target default.target. Mar 12 04:47:40.397374 systemd[1621]: Created slice app.slice - User Application Slice. Mar 12 04:47:40.397427 systemd[1621]: Reached target paths.target - Paths. Mar 12 04:47:40.397452 systemd[1621]: Reached target timers.target - Timers. Mar 12 04:47:40.399797 systemd[1621]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Mar 12 04:47:40.417351 systemd[1621]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 12 04:47:40.417578 systemd[1621]: Reached target sockets.target - Sockets. Mar 12 04:47:40.417619 systemd[1621]: Reached target basic.target - Basic System. Mar 12 04:47:40.417702 systemd[1621]: Reached target default.target - Main User Target. Mar 12 04:47:40.417769 systemd[1621]: Startup finished in 227ms. Mar 12 04:47:40.417966 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 12 04:47:40.432830 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 12 04:47:40.435360 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 12 04:47:40.437456 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 12 04:47:40.896655 systemd[1]: Started sshd@1-10.230.23.190:22-20.161.92.111:60876.service - OpenSSH per-connection server daemon (20.161.92.111:60876). Mar 12 04:47:41.180609 coreos-metadata[1477]: Mar 12 04:47:41.179 WARN failed to locate config-drive, using the metadata service API instead Mar 12 04:47:41.210380 coreos-metadata[1477]: Mar 12 04:47:41.210 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 12 04:47:41.217344 coreos-metadata[1477]: Mar 12 04:47:41.217 INFO Fetch failed with 404: resource not found Mar 12 04:47:41.217344 coreos-metadata[1477]: Mar 12 04:47:41.217 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 12 04:47:41.218678 coreos-metadata[1477]: Mar 12 04:47:41.218 INFO Fetch successful Mar 12 04:47:41.218844 coreos-metadata[1477]: Mar 12 04:47:41.218 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 12 04:47:41.233129 coreos-metadata[1477]: Mar 12 04:47:41.232 INFO Fetch successful Mar 12 04:47:41.233501 coreos-metadata[1477]: Mar 12 04:47:41.233 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 12 04:47:41.251973 coreos-metadata[1477]: Mar 12 04:47:41.251 INFO Fetch 
successful Mar 12 04:47:41.252442 coreos-metadata[1477]: Mar 12 04:47:41.252 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 12 04:47:41.268172 coreos-metadata[1477]: Mar 12 04:47:41.268 INFO Fetch successful Mar 12 04:47:41.268585 coreos-metadata[1477]: Mar 12 04:47:41.268 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 12 04:47:41.286179 coreos-metadata[1477]: Mar 12 04:47:41.286 INFO Fetch successful Mar 12 04:47:41.312068 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 12 04:47:41.313683 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 12 04:47:41.487390 sshd[1654]: Accepted publickey for core from 20.161.92.111 port 60876 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:41.489784 sshd[1654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:41.496284 systemd-logind[1487]: New session 4 of user core. Mar 12 04:47:41.510575 systemd[1]: Started session-4.scope - Session 4 of User core. 
Mar 12 04:47:41.785235 coreos-metadata[1548]: Mar 12 04:47:41.785 WARN failed to locate config-drive, using the metadata service API instead Mar 12 04:47:41.812142 coreos-metadata[1548]: Mar 12 04:47:41.812 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 12 04:47:41.841483 coreos-metadata[1548]: Mar 12 04:47:41.841 INFO Fetch successful Mar 12 04:47:41.841765 coreos-metadata[1548]: Mar 12 04:47:41.841 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 12 04:47:41.869954 coreos-metadata[1548]: Mar 12 04:47:41.869 INFO Fetch successful Mar 12 04:47:41.872679 unknown[1548]: wrote ssh authorized keys file for user: core Mar 12 04:47:41.902347 update-ssh-keys[1667]: Updated "/home/core/.ssh/authorized_keys" Mar 12 04:47:41.903747 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 12 04:47:41.905861 sshd[1654]: pam_unix(sshd:session): session closed for user core Mar 12 04:47:41.909358 systemd[1]: Finished sshkeys.service. Mar 12 04:47:41.913943 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 12 04:47:41.914186 systemd[1]: Startup finished in 1.601s (kernel) + 15.379s (initrd) + 11.825s (userspace) = 28.806s. Mar 12 04:47:41.914936 systemd[1]: sshd@1-10.230.23.190:22-20.161.92.111:60876.service: Deactivated successfully. Mar 12 04:47:41.917501 systemd[1]: session-4.scope: Deactivated successfully. Mar 12 04:47:41.919557 systemd-logind[1487]: Session 4 logged out. Waiting for processes to exit. Mar 12 04:47:41.921671 systemd-logind[1487]: Removed session 4. Mar 12 04:47:42.006466 systemd[1]: Started sshd@2-10.230.23.190:22-20.161.92.111:60878.service - OpenSSH per-connection server daemon (20.161.92.111:60878). 
Mar 12 04:47:42.577661 sshd[1674]: Accepted publickey for core from 20.161.92.111 port 60878 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:42.579906 sshd[1674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:42.587447 systemd-logind[1487]: New session 5 of user core. Mar 12 04:47:42.598426 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 12 04:47:42.978918 sshd[1674]: pam_unix(sshd:session): session closed for user core Mar 12 04:47:42.984627 systemd[1]: sshd@2-10.230.23.190:22-20.161.92.111:60878.service: Deactivated successfully. Mar 12 04:47:42.987392 systemd[1]: session-5.scope: Deactivated successfully. Mar 12 04:47:42.988655 systemd-logind[1487]: Session 5 logged out. Waiting for processes to exit. Mar 12 04:47:42.990177 systemd-logind[1487]: Removed session 5. Mar 12 04:47:46.916906 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 12 04:47:46.925455 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 04:47:47.111806 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 04:47:47.118956 (kubelet)[1688]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 04:47:47.219317 kubelet[1688]: E0312 04:47:47.219090 1688 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 04:47:47.224272 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 04:47:47.224722 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 12 04:47:53.171599 systemd[1]: Started sshd@3-10.230.23.190:22-20.161.92.111:49450.service - OpenSSH per-connection server daemon (20.161.92.111:49450). Mar 12 04:47:53.737080 sshd[1696]: Accepted publickey for core from 20.161.92.111 port 49450 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:53.739744 sshd[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:53.751962 systemd-logind[1487]: New session 6 of user core. Mar 12 04:47:53.761809 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 12 04:47:54.136950 sshd[1696]: pam_unix(sshd:session): session closed for user core Mar 12 04:47:54.143856 systemd[1]: sshd@3-10.230.23.190:22-20.161.92.111:49450.service: Deactivated successfully. Mar 12 04:47:54.147597 systemd[1]: session-6.scope: Deactivated successfully. Mar 12 04:47:54.148920 systemd-logind[1487]: Session 6 logged out. Waiting for processes to exit. Mar 12 04:47:54.151003 systemd-logind[1487]: Removed session 6. Mar 12 04:47:54.237395 systemd[1]: Started sshd@4-10.230.23.190:22-20.161.92.111:49466.service - OpenSSH per-connection server daemon (20.161.92.111:49466). Mar 12 04:47:54.815675 sshd[1703]: Accepted publickey for core from 20.161.92.111 port 49466 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:54.819117 sshd[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:54.826828 systemd-logind[1487]: New session 7 of user core. Mar 12 04:47:54.838462 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 12 04:47:55.207186 sshd[1703]: pam_unix(sshd:session): session closed for user core Mar 12 04:47:55.212472 systemd-logind[1487]: Session 7 logged out. Waiting for processes to exit. Mar 12 04:47:55.213762 systemd[1]: sshd@4-10.230.23.190:22-20.161.92.111:49466.service: Deactivated successfully. Mar 12 04:47:55.215825 systemd[1]: session-7.scope: Deactivated successfully. 
Mar 12 04:47:55.216999 systemd-logind[1487]: Removed session 7. Mar 12 04:47:55.314494 systemd[1]: Started sshd@5-10.230.23.190:22-20.161.92.111:49482.service - OpenSSH per-connection server daemon (20.161.92.111:49482). Mar 12 04:47:55.865095 sshd[1710]: Accepted publickey for core from 20.161.92.111 port 49482 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:55.867028 sshd[1710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:55.873130 systemd-logind[1487]: New session 8 of user core. Mar 12 04:47:55.884377 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 12 04:47:56.263799 sshd[1710]: pam_unix(sshd:session): session closed for user core Mar 12 04:47:56.268088 systemd[1]: sshd@5-10.230.23.190:22-20.161.92.111:49482.service: Deactivated successfully. Mar 12 04:47:56.270659 systemd[1]: session-8.scope: Deactivated successfully. Mar 12 04:47:56.273000 systemd-logind[1487]: Session 8 logged out. Waiting for processes to exit. Mar 12 04:47:56.274803 systemd-logind[1487]: Removed session 8. Mar 12 04:47:56.381500 systemd[1]: Started sshd@6-10.230.23.190:22-20.161.92.111:49484.service - OpenSSH per-connection server daemon (20.161.92.111:49484). Mar 12 04:47:56.974845 sshd[1717]: Accepted publickey for core from 20.161.92.111 port 49484 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:47:56.978176 sshd[1717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:47:56.985603 systemd-logind[1487]: New session 9 of user core. Mar 12 04:47:56.996322 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 12 04:47:57.320760 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 12 04:47:57.321329 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 12 04:47:57.322873 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 12 04:47:57.330424 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:47:57.349628 sudo[1720]: pam_unix(sudo:session): session closed for user root
Mar 12 04:47:57.449725 sshd[1717]: pam_unix(sshd:session): session closed for user core
Mar 12 04:47:57.455566 systemd[1]: sshd@6-10.230.23.190:22-20.161.92.111:49484.service: Deactivated successfully.
Mar 12 04:47:57.456274 systemd-logind[1487]: Session 9 logged out. Waiting for processes to exit.
Mar 12 04:47:57.458365 systemd[1]: session-9.scope: Deactivated successfully.
Mar 12 04:47:57.460409 systemd-logind[1487]: Removed session 9.
Mar 12 04:47:57.519389 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:47:57.521537 (kubelet)[1732]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 04:47:57.556159 systemd[1]: Started sshd@7-10.230.23.190:22-20.161.92.111:49488.service - OpenSSH per-connection server daemon (20.161.92.111:49488).
Mar 12 04:47:57.596113 kubelet[1732]: E0312 04:47:57.595868 1732 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 04:47:57.598715 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 04:47:57.598988 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 04:47:58.134090 sshd[1738]: Accepted publickey for core from 20.161.92.111 port 49488 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:47:58.135842 sshd[1738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:47:58.142297 systemd-logind[1487]: New session 10 of user core.
Mar 12 04:47:58.158418 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 12 04:47:58.456637 sudo[1744]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 12 04:47:58.457156 sudo[1744]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:47:58.462735 sudo[1744]: pam_unix(sudo:session): session closed for user root
Mar 12 04:47:58.471171 sudo[1743]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 12 04:47:58.471661 sudo[1743]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:47:58.501251 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 12 04:47:58.503187 auditctl[1747]: No rules
Mar 12 04:47:58.504429 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 12 04:47:58.504828 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 12 04:47:58.507666 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 12 04:47:58.550395 augenrules[1765]: No rules
Mar 12 04:47:58.552475 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 12 04:47:58.554661 sudo[1743]: pam_unix(sudo:session): session closed for user root
Mar 12 04:47:58.648158 sshd[1738]: pam_unix(sshd:session): session closed for user core
Mar 12 04:47:58.653467 systemd[1]: sshd@7-10.230.23.190:22-20.161.92.111:49488.service: Deactivated successfully.
Mar 12 04:47:58.656648 systemd[1]: session-10.scope: Deactivated successfully.
Mar 12 04:47:58.668616 systemd-logind[1487]: Session 10 logged out. Waiting for processes to exit.
Mar 12 04:47:58.671174 systemd-logind[1487]: Removed session 10.
Mar 12 04:47:58.752862 systemd[1]: Started sshd@8-10.230.23.190:22-20.161.92.111:49490.service - OpenSSH per-connection server daemon (20.161.92.111:49490).
Mar 12 04:47:59.306530 sshd[1773]: Accepted publickey for core from 20.161.92.111 port 49490 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:47:59.308869 sshd[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:47:59.315864 systemd-logind[1487]: New session 11 of user core.
Mar 12 04:47:59.328373 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 12 04:47:59.615978 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 12 04:47:59.617126 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:48:00.130409 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 12 04:48:00.143978 (dockerd)[1792]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 12 04:48:00.584083 dockerd[1792]: time="2026-03-12T04:48:00.583325185Z" level=info msg="Starting up"
Mar 12 04:48:00.773672 dockerd[1792]: time="2026-03-12T04:48:00.772762542Z" level=info msg="Loading containers: start."
Mar 12 04:48:00.945585 kernel: Initializing XFRM netlink socket
Mar 12 04:48:01.059170 systemd-networkd[1425]: docker0: Link UP
Mar 12 04:48:01.086534 dockerd[1792]: time="2026-03-12T04:48:01.086444648Z" level=info msg="Loading containers: done."
Mar 12 04:48:01.106153 dockerd[1792]: time="2026-03-12T04:48:01.106071020Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 12 04:48:01.106441 dockerd[1792]: time="2026-03-12T04:48:01.106292970Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 12 04:48:01.106560 dockerd[1792]: time="2026-03-12T04:48:01.106513980Z" level=info msg="Daemon has completed initialization"
Mar 12 04:48:01.169365 dockerd[1792]: time="2026-03-12T04:48:01.169213370Z" level=info msg="API listen on /run/docker.sock"
Mar 12 04:48:01.170185 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 12 04:48:01.876476 containerd[1510]: time="2026-03-12T04:48:01.875183491Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\""
Mar 12 04:48:02.593902 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3238098315.mount: Deactivated successfully.
Mar 12 04:48:05.417379 containerd[1510]: time="2026-03-12T04:48:05.417286796Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:05.419185 containerd[1510]: time="2026-03-12T04:48:05.419067117Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=27074505"
Mar 12 04:48:05.420103 containerd[1510]: time="2026-03-12T04:48:05.419846794Z" level=info msg="ImageCreate event name:\"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:05.425068 containerd[1510]: time="2026-03-12T04:48:05.424001459Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:05.426063 containerd[1510]: time="2026-03-12T04:48:05.425734515Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"27071096\" in 3.55044384s"
Mar 12 04:48:05.426063 containerd[1510]: time="2026-03-12T04:48:05.425811412Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\""
Mar 12 04:48:05.427088 containerd[1510]: time="2026-03-12T04:48:05.427055454Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 12 04:48:06.615819 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Mar 12 04:48:07.324252 containerd[1510]: time="2026-03-12T04:48:07.324161611Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:07.330603 containerd[1510]: time="2026-03-12T04:48:07.330474379Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=21165831"
Mar 12 04:48:07.331904 containerd[1510]: time="2026-03-12T04:48:07.331819014Z" level=info msg="ImageCreate event name:\"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:07.337078 containerd[1510]: time="2026-03-12T04:48:07.336853919Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:07.339149 containerd[1510]: time="2026-03-12T04:48:07.338860745Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"22822771\" in 1.911760955s"
Mar 12 04:48:07.339149 containerd[1510]: time="2026-03-12T04:48:07.338921263Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\""
Mar 12 04:48:07.340310 containerd[1510]: time="2026-03-12T04:48:07.340266905Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 12 04:48:07.666820 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 12 04:48:07.681414 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:48:07.940068 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:48:07.952666 (kubelet)[2008]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 04:48:08.026999 kubelet[2008]: E0312 04:48:08.026890 2008 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 04:48:08.029438 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 04:48:08.029686 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 04:48:09.117849 containerd[1510]: time="2026-03-12T04:48:09.117378031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:09.119120 containerd[1510]: time="2026-03-12T04:48:09.119008495Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=15729832"
Mar 12 04:48:09.121077 containerd[1510]: time="2026-03-12T04:48:09.119852483Z" level=info msg="ImageCreate event name:\"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:09.125216 containerd[1510]: time="2026-03-12T04:48:09.125180188Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:09.128590 containerd[1510]: time="2026-03-12T04:48:09.128393170Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"17386790\" in 1.787982252s"
Mar 12 04:48:09.128590 containerd[1510]: time="2026-03-12T04:48:09.128435427Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\""
Mar 12 04:48:09.129628 containerd[1510]: time="2026-03-12T04:48:09.129349716Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 12 04:48:10.923270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3695078708.mount: Deactivated successfully.
Mar 12 04:48:12.464752 containerd[1510]: time="2026-03-12T04:48:12.464619890Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:12.467705 containerd[1510]: time="2026-03-12T04:48:12.467622934Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=25861778"
Mar 12 04:48:12.470063 containerd[1510]: time="2026-03-12T04:48:12.468663734Z" level=info msg="ImageCreate event name:\"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:12.472859 containerd[1510]: time="2026-03-12T04:48:12.472793345Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:12.474074 containerd[1510]: time="2026-03-12T04:48:12.473997333Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"25860789\" in 3.344355631s"
Mar 12 04:48:12.474239 containerd[1510]: time="2026-03-12T04:48:12.474207197Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\""
Mar 12 04:48:12.475482 containerd[1510]: time="2026-03-12T04:48:12.475451783Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 12 04:48:13.124260 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4207021038.mount: Deactivated successfully.
Mar 12 04:48:15.386381 containerd[1510]: time="2026-03-12T04:48:15.384332269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:15.391219 containerd[1510]: time="2026-03-12T04:48:15.391083658Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388015"
Mar 12 04:48:15.395098 containerd[1510]: time="2026-03-12T04:48:15.394442475Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:15.401960 containerd[1510]: time="2026-03-12T04:48:15.401850628Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:15.405024 containerd[1510]: time="2026-03-12T04:48:15.404937168Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 2.929335177s"
Mar 12 04:48:15.405024 containerd[1510]: time="2026-03-12T04:48:15.405013232Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\""
Mar 12 04:48:15.406173 containerd[1510]: time="2026-03-12T04:48:15.405876091Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 12 04:48:15.971375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount896029879.mount: Deactivated successfully.
Mar 12 04:48:15.987943 containerd[1510]: time="2026-03-12T04:48:15.986742232Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:15.989290 containerd[1510]: time="2026-03-12T04:48:15.989229555Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321226"
Mar 12 04:48:15.990362 containerd[1510]: time="2026-03-12T04:48:15.990316263Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:15.993867 containerd[1510]: time="2026-03-12T04:48:15.993815430Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:15.995112 containerd[1510]: time="2026-03-12T04:48:15.995028890Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 589.113723ms"
Mar 12 04:48:15.995300 containerd[1510]: time="2026-03-12T04:48:15.995270345Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\""
Mar 12 04:48:15.996483 containerd[1510]: time="2026-03-12T04:48:15.996422512Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 12 04:48:16.625541 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4235140989.mount: Deactivated successfully.
Mar 12 04:48:18.080642 containerd[1510]: time="2026-03-12T04:48:18.080504529Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:18.083619 containerd[1510]: time="2026-03-12T04:48:18.083364539Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22860682"
Mar 12 04:48:18.084554 containerd[1510]: time="2026-03-12T04:48:18.084481540Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:18.088654 containerd[1510]: time="2026-03-12T04:48:18.088612707Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:48:18.092082 containerd[1510]: time="2026-03-12T04:48:18.090524717Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 2.093789543s"
Mar 12 04:48:18.092082 containerd[1510]: time="2026-03-12T04:48:18.090577520Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\""
Mar 12 04:48:18.169241 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 12 04:48:18.178523 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:48:18.478417 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:48:18.490345 (kubelet)[2171]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 04:48:18.582806 kubelet[2171]: E0312 04:48:18.580914 2171 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 04:48:18.583735 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 04:48:18.584055 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 04:48:19.122084 update_engine[1488]: I20260312 04:48:19.121193 1488 update_attempter.cc:509] Updating boot flags...
Mar 12 04:48:19.214331 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (2188)
Mar 12 04:48:19.295070 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (2192)
Mar 12 04:48:23.307667 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:48:23.319363 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:48:23.349840 systemd[1]: Reloading requested from client PID 2202 ('systemctl') (unit session-11.scope)...
Mar 12 04:48:23.349884 systemd[1]: Reloading...
Mar 12 04:48:23.463143 zram_generator::config[2241]: No configuration found.
Mar 12 04:48:23.659816 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:48:23.778273 systemd[1]: Reloading finished in 427 ms.
Mar 12 04:48:23.868856 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 12 04:48:23.869005 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 12 04:48:23.869472 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:48:23.888616 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:48:24.072693 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:48:24.072715 (kubelet)[2308]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 04:48:24.136757 kubelet[2308]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 04:48:24.136757 kubelet[2308]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 04:48:24.136757 kubelet[2308]: I0312 04:48:24.135596 2308 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 04:48:25.149083 kubelet[2308]: I0312 04:48:25.148723 2308 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 12 04:48:25.149083 kubelet[2308]: I0312 04:48:25.148772 2308 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 04:48:25.149083 kubelet[2308]: I0312 04:48:25.148822 2308 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 12 04:48:25.149083 kubelet[2308]: I0312 04:48:25.148844 2308 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 04:48:25.149777 kubelet[2308]: I0312 04:48:25.149200 2308 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 04:48:25.157163 kubelet[2308]: E0312 04:48:25.157112 2308 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.230.23.190:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 04:48:25.157723 kubelet[2308]: I0312 04:48:25.157517 2308 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 04:48:25.167781 kubelet[2308]: E0312 04:48:25.167499 2308 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 12 04:48:25.167781 kubelet[2308]: I0312 04:48:25.167588 2308 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 12 04:48:25.173645 kubelet[2308]: I0312 04:48:25.173571 2308 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 12 04:48:25.175378 kubelet[2308]: I0312 04:48:25.175306 2308 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 04:48:25.175738 kubelet[2308]: I0312 04:48:25.175361 2308 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-1ee83.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 04:48:25.175738 kubelet[2308]: I0312 04:48:25.175646 2308 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 04:48:25.175738 kubelet[2308]: I0312 04:48:25.175664 2308 container_manager_linux.go:306] "Creating device plugin manager"
Mar 12 04:48:25.176132 kubelet[2308]: I0312 04:48:25.175835 2308 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 12 04:48:25.178460 kubelet[2308]: I0312 04:48:25.178410 2308 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 04:48:25.178820 kubelet[2308]: I0312 04:48:25.178796 2308 kubelet.go:475] "Attempting to sync node with API server"
Mar 12 04:48:25.178899 kubelet[2308]: I0312 04:48:25.178830 2308 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 04:48:25.178899 kubelet[2308]: I0312 04:48:25.178890 2308 kubelet.go:387] "Adding apiserver pod source"
Mar 12 04:48:25.178991 kubelet[2308]: I0312 04:48:25.178930 2308 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 04:48:25.182318 kubelet[2308]: I0312 04:48:25.182283 2308 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 12 04:48:25.183099 kubelet[2308]: I0312 04:48:25.183070 2308 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 04:48:25.183222 kubelet[2308]: I0312 04:48:25.183119 2308 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 12 04:48:25.183281 kubelet[2308]: W0312 04:48:25.183241 2308 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 12 04:48:25.189475 kubelet[2308]: I0312 04:48:25.189435 2308 server.go:1262] "Started kubelet"
Mar 12 04:48:25.190098 kubelet[2308]: E0312 04:48:25.190058 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.230.23.190:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 04:48:25.191423 kubelet[2308]: E0312 04:48:25.191380 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.230.23.190:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-1ee83.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 04:48:25.191503 kubelet[2308]: I0312 04:48:25.191476 2308 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 04:48:25.193215 kubelet[2308]: I0312 04:48:25.192849 2308 server.go:310] "Adding debug handlers to kubelet server"
Mar 12 04:48:25.194978 kubelet[2308]: I0312 04:48:25.194905 2308 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 04:48:25.195204 kubelet[2308]: I0312 04:48:25.195177 2308 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 12 04:48:25.195910 kubelet[2308]: I0312 04:48:25.195885 2308 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 04:48:25.197574 kubelet[2308]: E0312 04:48:25.196227 2308 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.23.190:6443/api/v1/namespaces/default/events\": dial tcp 10.230.23.190:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-1ee83.gb1.brightbox.com.189bfeaed73e3b0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-1ee83.gb1.brightbox.com,UID:srv-1ee83.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-1ee83.gb1.brightbox.com,},FirstTimestamp:2026-03-12 04:48:25.189366543 +0000 UTC m=+1.109526684,LastTimestamp:2026-03-12 04:48:25.189366543 +0000 UTC m=+1.109526684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-1ee83.gb1.brightbox.com,}"
Mar 12 04:48:25.201330 kubelet[2308]: I0312 04:48:25.201297 2308 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 04:48:25.202300 kubelet[2308]: I0312 04:48:25.202261 2308 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 04:48:25.207597 kubelet[2308]: E0312 04:48:25.207552 2308 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 12 04:48:25.207769 kubelet[2308]: E0312 04:48:25.207727 2308 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"srv-1ee83.gb1.brightbox.com\" not found"
Mar 12 04:48:25.207835 kubelet[2308]: I0312 04:48:25.207780 2308 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 12 04:48:25.208279 kubelet[2308]: I0312 04:48:25.208251 2308 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 12 04:48:25.208382 kubelet[2308]: I0312 04:48:25.208361 2308 reconciler.go:29] "Reconciler: start to sync state"
Mar 12 04:48:25.209350 kubelet[2308]: E0312 04:48:25.209296 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.230.23.190:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 04:48:25.209623 kubelet[2308]: I0312 04:48:25.209584 2308 factory.go:223] Registration of the systemd container factory successfully
Mar 12 04:48:25.209720 kubelet[2308]: I0312 04:48:25.209694 2308 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 04:48:25.216010 kubelet[2308]: I0312 04:48:25.214846 2308 factory.go:223] Registration of the containerd container factory successfully
Mar 12 04:48:25.226912 kubelet[2308]: E0312 04:48:25.226842 2308 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.23.190:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1ee83.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.23.190:6443: connect: connection refused" interval="200ms"
Mar 12 04:48:25.249414 kubelet[2308]: I0312 04:48:25.249143 2308 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 12 04:48:25.253345 kubelet[2308]: I0312 04:48:25.252330 2308 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 12 04:48:25.253345 kubelet[2308]: I0312 04:48:25.252506 2308 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 12 04:48:25.253345 kubelet[2308]: I0312 04:48:25.252683 2308 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 12 04:48:25.260830 kubelet[2308]: E0312 04:48:25.260262 2308 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 04:48:25.288802 kubelet[2308]: E0312 04:48:25.261791 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.230.23.190:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 04:48:25.288802 kubelet[2308]: I0312 04:48:25.265899 2308 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 04:48:25.288802 kubelet[2308]: I0312 04:48:25.265926 2308 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 04:48:25.288802 kubelet[2308]: I0312 04:48:25.265960 2308 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 04:48:25.291157 kubelet[2308]: I0312 04:48:25.291103 2308 policy_none.go:49] "None policy: Start"
Mar 12 04:48:25.291261 kubelet[2308]: I0312 04:48:25.291175 2308 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 12 04:48:25.291261 kubelet[2308]: I0312 04:48:25.291211 2308 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 12 04:48:25.292613 kubelet[2308]: I0312 04:48:25.292565 2308 policy_none.go:47] "Start"
Mar 12 04:48:25.302750
systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 12 04:48:25.309889 kubelet[2308]: E0312 04:48:25.308404 2308 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"srv-1ee83.gb1.brightbox.com\" not found" Mar 12 04:48:25.316796 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 12 04:48:25.336989 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 12 04:48:25.340673 kubelet[2308]: E0312 04:48:25.340285 2308 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 12 04:48:25.340673 kubelet[2308]: I0312 04:48:25.340603 2308 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 04:48:25.340673 kubelet[2308]: I0312 04:48:25.340628 2308 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 04:48:25.341943 kubelet[2308]: I0312 04:48:25.341671 2308 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 04:48:25.344166 kubelet[2308]: E0312 04:48:25.344131 2308 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 12 04:48:25.344275 kubelet[2308]: E0312 04:48:25.344209 2308 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-1ee83.gb1.brightbox.com\" not found" Mar 12 04:48:25.377453 systemd[1]: Created slice kubepods-burstable-pod79c1d99ca5812254fdabfafee3ffa905.slice - libcontainer container kubepods-burstable-pod79c1d99ca5812254fdabfafee3ffa905.slice. 
Mar 12 04:48:25.384739 kubelet[2308]: E0312 04:48:25.384268 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.392574 systemd[1]: Created slice kubepods-burstable-pod0ee049e37276852334ff5f9ecaeb1d3a.slice - libcontainer container kubepods-burstable-pod0ee049e37276852334ff5f9ecaeb1d3a.slice. Mar 12 04:48:25.396674 kubelet[2308]: E0312 04:48:25.396327 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.409813 systemd[1]: Created slice kubepods-burstable-pod534d8e7711a0e5723c3e326081eebb8c.slice - libcontainer container kubepods-burstable-pod534d8e7711a0e5723c3e326081eebb8c.slice. Mar 12 04:48:25.415609 kubelet[2308]: E0312 04:48:25.415494 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.428536 kubelet[2308]: E0312 04:48:25.428455 2308 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.23.190:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1ee83.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.23.190:6443: connect: connection refused" interval="400ms" Mar 12 04:48:25.444987 kubelet[2308]: I0312 04:48:25.444916 2308 kubelet_node_status.go:75] "Attempting to register node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.445567 kubelet[2308]: E0312 04:48:25.445492 2308 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.23.190:6443/api/v1/nodes\": dial tcp 10.230.23.190:6443: connect: connection refused" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510349 kubelet[2308]: I0312 04:48:25.510268 2308 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-flexvolume-dir\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510349 kubelet[2308]: I0312 04:48:25.510361 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-kubeconfig\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510613 kubelet[2308]: I0312 04:48:25.510395 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510613 kubelet[2308]: I0312 04:48:25.510425 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/534d8e7711a0e5723c3e326081eebb8c-kubeconfig\") pod \"kube-scheduler-srv-1ee83.gb1.brightbox.com\" (UID: \"534d8e7711a0e5723c3e326081eebb8c\") " pod="kube-system/kube-scheduler-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510613 kubelet[2308]: I0312 04:48:25.510452 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/79c1d99ca5812254fdabfafee3ffa905-ca-certs\") pod 
\"kube-apiserver-srv-1ee83.gb1.brightbox.com\" (UID: \"79c1d99ca5812254fdabfafee3ffa905\") " pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510613 kubelet[2308]: I0312 04:48:25.510478 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/79c1d99ca5812254fdabfafee3ffa905-k8s-certs\") pod \"kube-apiserver-srv-1ee83.gb1.brightbox.com\" (UID: \"79c1d99ca5812254fdabfafee3ffa905\") " pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510613 kubelet[2308]: I0312 04:48:25.510507 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/79c1d99ca5812254fdabfafee3ffa905-usr-share-ca-certificates\") pod \"kube-apiserver-srv-1ee83.gb1.brightbox.com\" (UID: \"79c1d99ca5812254fdabfafee3ffa905\") " pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510838 kubelet[2308]: I0312 04:48:25.510534 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-ca-certs\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.510838 kubelet[2308]: I0312 04:48:25.510559 2308 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-k8s-certs\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.648888 kubelet[2308]: I0312 04:48:25.648845 2308 kubelet_node_status.go:75] "Attempting to 
register node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.649361 kubelet[2308]: E0312 04:48:25.649328 2308 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.23.190:6443/api/v1/nodes\": dial tcp 10.230.23.190:6443: connect: connection refused" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:25.689052 containerd[1510]: time="2026-03-12T04:48:25.688977928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-1ee83.gb1.brightbox.com,Uid:79c1d99ca5812254fdabfafee3ffa905,Namespace:kube-system,Attempt:0,}" Mar 12 04:48:25.700135 containerd[1510]: time="2026-03-12T04:48:25.700063301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-1ee83.gb1.brightbox.com,Uid:0ee049e37276852334ff5f9ecaeb1d3a,Namespace:kube-system,Attempt:0,}" Mar 12 04:48:25.726467 containerd[1510]: time="2026-03-12T04:48:25.725925682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-1ee83.gb1.brightbox.com,Uid:534d8e7711a0e5723c3e326081eebb8c,Namespace:kube-system,Attempt:0,}" Mar 12 04:48:25.829900 kubelet[2308]: E0312 04:48:25.829788 2308 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.23.190:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1ee83.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.23.190:6443: connect: connection refused" interval="800ms" Mar 12 04:48:26.053543 kubelet[2308]: I0312 04:48:26.052798 2308 kubelet_node_status.go:75] "Attempting to register node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:26.053543 kubelet[2308]: E0312 04:48:26.053284 2308 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.23.190:6443/api/v1/nodes\": dial tcp 10.230.23.190:6443: connect: connection refused" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:26.089029 kubelet[2308]: E0312 04:48:26.088909 2308 reflector.go:205] "Failed to watch" err="failed 
to list *v1.RuntimeClass: Get \"https://10.230.23.190:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 12 04:48:26.249537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1093048780.mount: Deactivated successfully. Mar 12 04:48:26.261897 containerd[1510]: time="2026-03-12T04:48:26.261806429Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 04:48:26.265496 containerd[1510]: time="2026-03-12T04:48:26.265367939Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Mar 12 04:48:26.266336 containerd[1510]: time="2026-03-12T04:48:26.266279721Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 04:48:26.268520 containerd[1510]: time="2026-03-12T04:48:26.268469689Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 04:48:26.270536 containerd[1510]: time="2026-03-12T04:48:26.270487681Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 04:48:26.272016 containerd[1510]: time="2026-03-12T04:48:26.271969257Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 12 04:48:26.273047 containerd[1510]: time="2026-03-12T04:48:26.272951659Z" level=info msg="stop pulling image 
registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 12 04:48:26.276021 containerd[1510]: time="2026-03-12T04:48:26.275900100Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 12 04:48:26.280610 containerd[1510]: time="2026-03-12T04:48:26.279607477Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 553.562797ms" Mar 12 04:48:26.281450 containerd[1510]: time="2026-03-12T04:48:26.281412043Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 581.211183ms" Mar 12 04:48:26.284623 containerd[1510]: time="2026-03-12T04:48:26.284583070Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 594.871034ms" Mar 12 04:48:26.326238 kubelet[2308]: E0312 04:48:26.326107 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.230.23.190:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 12 04:48:26.393579 kubelet[2308]: E0312 04:48:26.393489 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.230.23.190:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 12 04:48:26.450567 containerd[1510]: time="2026-03-12T04:48:26.449874734Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:48:26.451090 containerd[1510]: time="2026-03-12T04:48:26.450993841Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:48:26.451280 containerd[1510]: time="2026-03-12T04:48:26.451215437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:26.452050 containerd[1510]: time="2026-03-12T04:48:26.451779760Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:26.452561 containerd[1510]: time="2026-03-12T04:48:26.452432851Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:48:26.452561 containerd[1510]: time="2026-03-12T04:48:26.452531575Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:48:26.453385 containerd[1510]: time="2026-03-12T04:48:26.453305415Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:26.454301 containerd[1510]: time="2026-03-12T04:48:26.453828345Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:26.455389 containerd[1510]: time="2026-03-12T04:48:26.454935114Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:48:26.455389 containerd[1510]: time="2026-03-12T04:48:26.455239489Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:48:26.455487 containerd[1510]: time="2026-03-12T04:48:26.455347325Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:26.458759 containerd[1510]: time="2026-03-12T04:48:26.455614380Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:26.501746 systemd[1]: Started cri-containerd-e1e4a1d9e35ae7f5f62e6d09be752c7698db4adc99a379fccbee72ef145ad3ad.scope - libcontainer container e1e4a1d9e35ae7f5f62e6d09be752c7698db4adc99a379fccbee72ef145ad3ad. Mar 12 04:48:26.511401 systemd[1]: Started cri-containerd-eade2bf43bf4330a7dc83f34377d27e9ab3495595acf16961631be39e4f13301.scope - libcontainer container eade2bf43bf4330a7dc83f34377d27e9ab3495595acf16961631be39e4f13301. Mar 12 04:48:26.517159 systemd[1]: Started cri-containerd-68b93300b5f323f689cabec7b94baebb5baf4ef2de5630cdba1c54074a39743b.scope - libcontainer container 68b93300b5f323f689cabec7b94baebb5baf4ef2de5630cdba1c54074a39743b. 
Mar 12 04:48:26.619098 containerd[1510]: time="2026-03-12T04:48:26.618375807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-1ee83.gb1.brightbox.com,Uid:79c1d99ca5812254fdabfafee3ffa905,Namespace:kube-system,Attempt:0,} returns sandbox id \"68b93300b5f323f689cabec7b94baebb5baf4ef2de5630cdba1c54074a39743b\"" Mar 12 04:48:26.630723 kubelet[2308]: E0312 04:48:26.630566 2308 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.23.190:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-1ee83.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.23.190:6443: connect: connection refused" interval="1.6s" Mar 12 04:48:26.637319 containerd[1510]: time="2026-03-12T04:48:26.637190605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-1ee83.gb1.brightbox.com,Uid:0ee049e37276852334ff5f9ecaeb1d3a,Namespace:kube-system,Attempt:0,} returns sandbox id \"e1e4a1d9e35ae7f5f62e6d09be752c7698db4adc99a379fccbee72ef145ad3ad\"" Mar 12 04:48:26.651598 containerd[1510]: time="2026-03-12T04:48:26.651403872Z" level=info msg="CreateContainer within sandbox \"68b93300b5f323f689cabec7b94baebb5baf4ef2de5630cdba1c54074a39743b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 12 04:48:26.653058 containerd[1510]: time="2026-03-12T04:48:26.652195305Z" level=info msg="CreateContainer within sandbox \"e1e4a1d9e35ae7f5f62e6d09be752c7698db4adc99a379fccbee72ef145ad3ad\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 12 04:48:26.659631 containerd[1510]: time="2026-03-12T04:48:26.659586573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-1ee83.gb1.brightbox.com,Uid:534d8e7711a0e5723c3e326081eebb8c,Namespace:kube-system,Attempt:0,} returns sandbox id \"eade2bf43bf4330a7dc83f34377d27e9ab3495595acf16961631be39e4f13301\"" Mar 12 04:48:26.670388 containerd[1510]: time="2026-03-12T04:48:26.670335268Z" level=info 
msg="CreateContainer within sandbox \"eade2bf43bf4330a7dc83f34377d27e9ab3495595acf16961631be39e4f13301\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 12 04:48:26.670973 containerd[1510]: time="2026-03-12T04:48:26.670933021Z" level=info msg="CreateContainer within sandbox \"e1e4a1d9e35ae7f5f62e6d09be752c7698db4adc99a379fccbee72ef145ad3ad\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"64e13d366f974726bba34ce1b1ed81b5da9054b4dce1b1fd89a3382b7a2c6e92\"" Mar 12 04:48:26.671773 containerd[1510]: time="2026-03-12T04:48:26.671735477Z" level=info msg="StartContainer for \"64e13d366f974726bba34ce1b1ed81b5da9054b4dce1b1fd89a3382b7a2c6e92\"" Mar 12 04:48:26.677625 containerd[1510]: time="2026-03-12T04:48:26.677566929Z" level=info msg="CreateContainer within sandbox \"68b93300b5f323f689cabec7b94baebb5baf4ef2de5630cdba1c54074a39743b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7e07e1ef12c82cdc9c6f86c52a70a37302ae58a48528a7c3af48246b0f952d77\"" Mar 12 04:48:26.678381 containerd[1510]: time="2026-03-12T04:48:26.678344958Z" level=info msg="StartContainer for \"7e07e1ef12c82cdc9c6f86c52a70a37302ae58a48528a7c3af48246b0f952d77\"" Mar 12 04:48:26.694604 containerd[1510]: time="2026-03-12T04:48:26.694425738Z" level=info msg="CreateContainer within sandbox \"eade2bf43bf4330a7dc83f34377d27e9ab3495595acf16961631be39e4f13301\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e329df25b515a598a89e20416e8250c5234b123e77a843341d1768fc61315cbd\"" Mar 12 04:48:26.696592 containerd[1510]: time="2026-03-12T04:48:26.695369935Z" level=info msg="StartContainer for \"e329df25b515a598a89e20416e8250c5234b123e77a843341d1768fc61315cbd\"" Mar 12 04:48:26.730546 systemd[1]: Started cri-containerd-64e13d366f974726bba34ce1b1ed81b5da9054b4dce1b1fd89a3382b7a2c6e92.scope - libcontainer container 64e13d366f974726bba34ce1b1ed81b5da9054b4dce1b1fd89a3382b7a2c6e92. 
Mar 12 04:48:26.743568 systemd[1]: Started cri-containerd-7e07e1ef12c82cdc9c6f86c52a70a37302ae58a48528a7c3af48246b0f952d77.scope - libcontainer container 7e07e1ef12c82cdc9c6f86c52a70a37302ae58a48528a7c3af48246b0f952d77. Mar 12 04:48:26.754279 kubelet[2308]: E0312 04:48:26.754199 2308 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.230.23.190:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-1ee83.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.23.190:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 12 04:48:26.764238 systemd[1]: Started cri-containerd-e329df25b515a598a89e20416e8250c5234b123e77a843341d1768fc61315cbd.scope - libcontainer container e329df25b515a598a89e20416e8250c5234b123e77a843341d1768fc61315cbd. Mar 12 04:48:26.849196 containerd[1510]: time="2026-03-12T04:48:26.848902880Z" level=info msg="StartContainer for \"64e13d366f974726bba34ce1b1ed81b5da9054b4dce1b1fd89a3382b7a2c6e92\" returns successfully" Mar 12 04:48:26.858417 kubelet[2308]: I0312 04:48:26.857992 2308 kubelet_node_status.go:75] "Attempting to register node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:26.858922 kubelet[2308]: E0312 04:48:26.858685 2308 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.230.23.190:6443/api/v1/nodes\": dial tcp 10.230.23.190:6443: connect: connection refused" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:26.887555 containerd[1510]: time="2026-03-12T04:48:26.886990358Z" level=info msg="StartContainer for \"7e07e1ef12c82cdc9c6f86c52a70a37302ae58a48528a7c3af48246b0f952d77\" returns successfully" Mar 12 04:48:26.905020 containerd[1510]: time="2026-03-12T04:48:26.904960358Z" level=info msg="StartContainer for \"e329df25b515a598a89e20416e8250c5234b123e77a843341d1768fc61315cbd\" returns successfully" Mar 12 04:48:27.278847 kubelet[2308]: E0312 04:48:27.278802 2308 kubelet.go:3216] "No 
need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:27.287480 kubelet[2308]: E0312 04:48:27.287437 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:27.288508 kubelet[2308]: E0312 04:48:27.288485 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:28.292143 kubelet[2308]: E0312 04:48:28.292093 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:28.292771 kubelet[2308]: E0312 04:48:28.292744 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:28.463510 kubelet[2308]: I0312 04:48:28.463458 2308 kubelet_node_status.go:75] "Attempting to register node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:29.295191 kubelet[2308]: E0312 04:48:29.295110 2308 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.055066 kubelet[2308]: E0312 04:48:30.054100 2308 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-1ee83.gb1.brightbox.com\" not found" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.150240 kubelet[2308]: I0312 04:48:30.150156 2308 kubelet_node_status.go:78] "Successfully registered node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.151832 
kubelet[2308]: E0312 04:48:30.150216 2308 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"srv-1ee83.gb1.brightbox.com\": node \"srv-1ee83.gb1.brightbox.com\" not found" Mar 12 04:48:30.186321 kubelet[2308]: I0312 04:48:30.186087 2308 apiserver.go:52] "Watching apiserver" Mar 12 04:48:30.208807 kubelet[2308]: I0312 04:48:30.208687 2308 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 04:48:30.216054 kubelet[2308]: I0312 04:48:30.214922 2308 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.299971 kubelet[2308]: E0312 04:48:30.299917 2308 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-1ee83.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.299971 kubelet[2308]: I0312 04:48:30.299966 2308 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.304699 kubelet[2308]: E0312 04:48:30.304665 2308 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.304699 kubelet[2308]: I0312 04:48:30.304697 2308 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:30.310173 kubelet[2308]: E0312 04:48:30.308214 2308 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-1ee83.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:32.633466 systemd[1]: 
Reloading requested from client PID 2591 ('systemctl') (unit session-11.scope)... Mar 12 04:48:32.633493 systemd[1]: Reloading... Mar 12 04:48:32.750116 zram_generator::config[2627]: No configuration found. Mar 12 04:48:32.984638 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 12 04:48:33.119602 systemd[1]: Reloading finished in 485 ms. Mar 12 04:48:33.186189 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 04:48:33.201199 systemd[1]: kubelet.service: Deactivated successfully. Mar 12 04:48:33.201522 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 04:48:33.201627 systemd[1]: kubelet.service: Consumed 1.717s CPU time, 125.7M memory peak, 0B memory swap peak. Mar 12 04:48:33.213562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 04:48:33.424927 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 04:48:33.436549 (kubelet)[2694]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 12 04:48:33.592685 kubelet[2694]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 12 04:48:33.592685 kubelet[2694]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 12 04:48:33.595685 kubelet[2694]: I0312 04:48:33.593653 2694 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 12 04:48:33.604785 kubelet[2694]: I0312 04:48:33.604740 2694 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 12 04:48:33.604977 kubelet[2694]: I0312 04:48:33.604958 2694 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 04:48:33.605178 kubelet[2694]: I0312 04:48:33.605156 2694 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 12 04:48:33.605322 kubelet[2694]: I0312 04:48:33.605299 2694 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 12 04:48:33.605697 kubelet[2694]: I0312 04:48:33.605674 2694 server.go:956] "Client rotation is on, will bootstrap in background" Mar 12 04:48:33.607995 kubelet[2694]: I0312 04:48:33.607972 2694 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 12 04:48:33.616962 kubelet[2694]: I0312 04:48:33.616888 2694 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 12 04:48:33.626848 kubelet[2694]: E0312 04:48:33.625945 2694 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 12 04:48:33.626848 kubelet[2694]: I0312 04:48:33.626074 2694 server.go:1400] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Mar 12 04:48:33.631612 kubelet[2694]: I0312 04:48:33.630860 2694 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 12 04:48:33.631901 kubelet[2694]: I0312 04:48:33.631836 2694 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 04:48:33.632794 kubelet[2694]: I0312 04:48:33.631900 2694 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-1ee83.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 04:48:33.632794 kubelet[2694]: I0312 04:48:33.632159 2694 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 
04:48:33.632794 kubelet[2694]: I0312 04:48:33.632175 2694 container_manager_linux.go:306] "Creating device plugin manager" Mar 12 04:48:33.632794 kubelet[2694]: I0312 04:48:33.632213 2694 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 12 04:48:33.632794 kubelet[2694]: I0312 04:48:33.632575 2694 state_mem.go:36] "Initialized new in-memory state store" Mar 12 04:48:33.633218 kubelet[2694]: I0312 04:48:33.632861 2694 kubelet.go:475] "Attempting to sync node with API server" Mar 12 04:48:33.633218 kubelet[2694]: I0312 04:48:33.632884 2694 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 04:48:33.636760 kubelet[2694]: I0312 04:48:33.633831 2694 kubelet.go:387] "Adding apiserver pod source" Mar 12 04:48:33.636760 kubelet[2694]: I0312 04:48:33.633891 2694 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 04:48:33.638159 kubelet[2694]: I0312 04:48:33.637941 2694 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 12 04:48:33.639350 kubelet[2694]: I0312 04:48:33.638867 2694 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 12 04:48:33.639350 kubelet[2694]: I0312 04:48:33.638913 2694 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 12 04:48:33.651225 kubelet[2694]: I0312 04:48:33.651161 2694 server.go:1262] "Started kubelet" Mar 12 04:48:33.655393 kubelet[2694]: I0312 04:48:33.655351 2694 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 04:48:33.657364 kubelet[2694]: I0312 04:48:33.656243 2694 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 04:48:33.657364 kubelet[2694]: I0312 04:48:33.656322 2694 
server_v1.go:49] "podresources" method="list" useActivePods=true Mar 12 04:48:33.657364 kubelet[2694]: I0312 04:48:33.656749 2694 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 04:48:33.659894 kubelet[2694]: I0312 04:48:33.659870 2694 server.go:310] "Adding debug handlers to kubelet server" Mar 12 04:48:33.666935 kubelet[2694]: I0312 04:48:33.666903 2694 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 04:48:33.679424 kubelet[2694]: I0312 04:48:33.679287 2694 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 12 04:48:33.689064 kubelet[2694]: I0312 04:48:33.686876 2694 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 12 04:48:33.693063 kubelet[2694]: I0312 04:48:33.691510 2694 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 12 04:48:33.694277 kubelet[2694]: I0312 04:48:33.694245 2694 reconciler.go:29] "Reconciler: start to sync state" Mar 12 04:48:33.704496 kubelet[2694]: I0312 04:48:33.704447 2694 factory.go:223] Registration of the systemd container factory successfully Mar 12 04:48:33.705283 kubelet[2694]: E0312 04:48:33.705212 2694 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 12 04:48:33.707143 kubelet[2694]: I0312 04:48:33.707088 2694 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 12 04:48:33.720166 kubelet[2694]: I0312 04:48:33.718966 2694 factory.go:223] Registration of the containerd container factory successfully Mar 12 04:48:33.727062 kubelet[2694]: I0312 04:48:33.725946 2694 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Mar 12 04:48:33.732100 kubelet[2694]: I0312 04:48:33.729856 2694 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 12 04:48:33.732100 kubelet[2694]: I0312 04:48:33.729893 2694 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 12 04:48:33.732100 kubelet[2694]: I0312 04:48:33.729944 2694 kubelet.go:2428] "Starting kubelet main sync loop" Mar 12 04:48:33.732100 kubelet[2694]: E0312 04:48:33.730677 2694 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 04:48:33.830922 kubelet[2694]: E0312 04:48:33.830880 2694 kubelet.go:2452] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 12 04:48:33.876284 kubelet[2694]: I0312 04:48:33.876229 2694 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 12 04:48:33.876613 kubelet[2694]: I0312 04:48:33.876588 2694 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 12 04:48:33.876765 kubelet[2694]: I0312 04:48:33.876747 2694 state_mem.go:36] "Initialized new in-memory state store" Mar 12 04:48:33.877183 kubelet[2694]: I0312 04:48:33.877158 2694 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 12 04:48:33.878457 kubelet[2694]: I0312 04:48:33.878415 2694 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 12 04:48:33.878577 kubelet[2694]: I0312 04:48:33.878559 2694 policy_none.go:49] "None policy: Start" Mar 12 04:48:33.878705 kubelet[2694]: I0312 04:48:33.878684 2694 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 12 04:48:33.878823 kubelet[2694]: I0312 04:48:33.878804 2694 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 12 04:48:33.880732 kubelet[2694]: I0312 04:48:33.879099 2694 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 12 04:48:33.880732 
kubelet[2694]: I0312 04:48:33.880626 2694 policy_none.go:47] "Start" Mar 12 04:48:33.895172 kubelet[2694]: E0312 04:48:33.893764 2694 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 12 04:48:33.895172 kubelet[2694]: I0312 04:48:33.894109 2694 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 04:48:33.895172 kubelet[2694]: I0312 04:48:33.894137 2694 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 04:48:33.902832 kubelet[2694]: I0312 04:48:33.902696 2694 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 04:48:33.910526 kubelet[2694]: E0312 04:48:33.908516 2694 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 12 04:48:34.025254 kubelet[2694]: I0312 04:48:34.023000 2694 kubelet_node_status.go:75] "Attempting to register node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.034413 kubelet[2694]: I0312 04:48:34.033570 2694 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.034808 kubelet[2694]: I0312 04:48:34.033733 2694 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.036059 kubelet[2694]: I0312 04:48:34.035203 2694 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.038352 kubelet[2694]: I0312 04:48:34.037908 2694 kubelet_node_status.go:124] "Node was previously registered" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.038352 kubelet[2694]: I0312 04:48:34.037994 2694 kubelet_node_status.go:78] "Successfully registered node" node="srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.049514 kubelet[2694]: I0312 04:48:34.049448 
2694 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 04:48:34.054505 kubelet[2694]: I0312 04:48:34.054124 2694 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 04:48:34.057557 kubelet[2694]: I0312 04:48:34.057437 2694 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 12 04:48:34.099606 kubelet[2694]: I0312 04:48:34.099363 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-flexvolume-dir\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.099606 kubelet[2694]: I0312 04:48:34.099556 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-k8s-certs\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.099849 kubelet[2694]: I0312 04:48:34.099643 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-kubeconfig\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.099849 
kubelet[2694]: I0312 04:48:34.099757 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/79c1d99ca5812254fdabfafee3ffa905-ca-certs\") pod \"kube-apiserver-srv-1ee83.gb1.brightbox.com\" (UID: \"79c1d99ca5812254fdabfafee3ffa905\") " pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.099849 kubelet[2694]: I0312 04:48:34.099794 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.100094 kubelet[2694]: I0312 04:48:34.099856 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/534d8e7711a0e5723c3e326081eebb8c-kubeconfig\") pod \"kube-scheduler-srv-1ee83.gb1.brightbox.com\" (UID: \"534d8e7711a0e5723c3e326081eebb8c\") " pod="kube-system/kube-scheduler-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.100094 kubelet[2694]: I0312 04:48:34.099936 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/79c1d99ca5812254fdabfafee3ffa905-k8s-certs\") pod \"kube-apiserver-srv-1ee83.gb1.brightbox.com\" (UID: \"79c1d99ca5812254fdabfafee3ffa905\") " pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.100094 kubelet[2694]: I0312 04:48:34.100078 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/79c1d99ca5812254fdabfafee3ffa905-usr-share-ca-certificates\") pod 
\"kube-apiserver-srv-1ee83.gb1.brightbox.com\" (UID: \"79c1d99ca5812254fdabfafee3ffa905\") " pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.100240 kubelet[2694]: I0312 04:48:34.100124 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0ee049e37276852334ff5f9ecaeb1d3a-ca-certs\") pod \"kube-controller-manager-srv-1ee83.gb1.brightbox.com\" (UID: \"0ee049e37276852334ff5f9ecaeb1d3a\") " pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" Mar 12 04:48:34.636915 kubelet[2694]: I0312 04:48:34.635338 2694 apiserver.go:52] "Watching apiserver" Mar 12 04:48:34.694299 kubelet[2694]: I0312 04:48:34.694198 2694 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 04:48:34.839570 kubelet[2694]: I0312 04:48:34.839187 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-1ee83.gb1.brightbox.com" podStartSLOduration=0.839147433 podStartE2EDuration="839.147433ms" podCreationTimestamp="2026-03-12 04:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:48:34.838636676 +0000 UTC m=+1.327575010" watchObservedRunningTime="2026-03-12 04:48:34.839147433 +0000 UTC m=+1.328085757" Mar 12 04:48:34.899758 kubelet[2694]: I0312 04:48:34.898646 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-1ee83.gb1.brightbox.com" podStartSLOduration=0.898599864 podStartE2EDuration="898.599864ms" podCreationTimestamp="2026-03-12 04:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:48:34.856771395 +0000 UTC m=+1.345709746" watchObservedRunningTime="2026-03-12 04:48:34.898599864 +0000 UTC 
m=+1.387538186" Mar 12 04:48:34.918575 kubelet[2694]: I0312 04:48:34.917347 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-1ee83.gb1.brightbox.com" podStartSLOduration=0.917311743 podStartE2EDuration="917.311743ms" podCreationTimestamp="2026-03-12 04:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:48:34.898976942 +0000 UTC m=+1.387915264" watchObservedRunningTime="2026-03-12 04:48:34.917311743 +0000 UTC m=+1.406250073" Mar 12 04:48:37.998347 kubelet[2694]: I0312 04:48:37.998287 2694 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 12 04:48:37.999138 containerd[1510]: time="2026-03-12T04:48:37.998844859Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 12 04:48:37.999627 kubelet[2694]: I0312 04:48:37.999386 2694 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 12 04:48:38.967092 systemd[1]: Created slice kubepods-besteffort-pod27f2eb8e_ec33_43ff_879d_0b6447c0d686.slice - libcontainer container kubepods-besteffort-pod27f2eb8e_ec33_43ff_879d_0b6447c0d686.slice. 
Mar 12 04:48:39.033770 kubelet[2694]: I0312 04:48:39.033525 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/27f2eb8e-ec33-43ff-879d-0b6447c0d686-kube-proxy\") pod \"kube-proxy-g55p2\" (UID: \"27f2eb8e-ec33-43ff-879d-0b6447c0d686\") " pod="kube-system/kube-proxy-g55p2" Mar 12 04:48:39.033770 kubelet[2694]: I0312 04:48:39.033599 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/27f2eb8e-ec33-43ff-879d-0b6447c0d686-xtables-lock\") pod \"kube-proxy-g55p2\" (UID: \"27f2eb8e-ec33-43ff-879d-0b6447c0d686\") " pod="kube-system/kube-proxy-g55p2" Mar 12 04:48:39.033770 kubelet[2694]: I0312 04:48:39.033630 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grs72\" (UniqueName: \"kubernetes.io/projected/27f2eb8e-ec33-43ff-879d-0b6447c0d686-kube-api-access-grs72\") pod \"kube-proxy-g55p2\" (UID: \"27f2eb8e-ec33-43ff-879d-0b6447c0d686\") " pod="kube-system/kube-proxy-g55p2" Mar 12 04:48:39.033770 kubelet[2694]: I0312 04:48:39.033666 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/27f2eb8e-ec33-43ff-879d-0b6447c0d686-lib-modules\") pod \"kube-proxy-g55p2\" (UID: \"27f2eb8e-ec33-43ff-879d-0b6447c0d686\") " pod="kube-system/kube-proxy-g55p2" Mar 12 04:48:39.207855 systemd[1]: Created slice kubepods-besteffort-pod170c16d6_8bc8_41c2_aee6_59bcf8b9567a.slice - libcontainer container kubepods-besteffort-pod170c16d6_8bc8_41c2_aee6_59bcf8b9567a.slice. 
Mar 12 04:48:39.209101 kubelet[2694]: E0312 04:48:39.208325 2694 status_manager.go:1018] "Failed to get status for pod" err="pods \"tigera-operator-5588576f44-4vvz9\" is forbidden: User \"system:node:srv-1ee83.gb1.brightbox.com\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'srv-1ee83.gb1.brightbox.com' and this object" podUID="170c16d6-8bc8-41c2-aee6-59bcf8b9567a" pod="tigera-operator/tigera-operator-5588576f44-4vvz9" Mar 12 04:48:39.209447 kubelet[2694]: E0312 04:48:39.209367 2694 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:srv-1ee83.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'srv-1ee83.gb1.brightbox.com' and this object" logger="UnhandledError" reflector="object-\"tigera-operator\"/\"kubernetes-services-endpoint\"" type="*v1.ConfigMap" Mar 12 04:48:39.209783 kubelet[2694]: E0312 04:48:39.209696 2694 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:srv-1ee83.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'srv-1ee83.gb1.brightbox.com' and this object" logger="UnhandledError" reflector="object-\"tigera-operator\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Mar 12 04:48:39.235739 kubelet[2694]: I0312 04:48:39.235543 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/170c16d6-8bc8-41c2-aee6-59bcf8b9567a-var-lib-calico\") pod \"tigera-operator-5588576f44-4vvz9\" (UID: \"170c16d6-8bc8-41c2-aee6-59bcf8b9567a\") " pod="tigera-operator/tigera-operator-5588576f44-4vvz9" Mar 12 04:48:39.235739 kubelet[2694]: 
I0312 04:48:39.235632 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrhk\" (UniqueName: \"kubernetes.io/projected/170c16d6-8bc8-41c2-aee6-59bcf8b9567a-kube-api-access-6rrhk\") pod \"tigera-operator-5588576f44-4vvz9\" (UID: \"170c16d6-8bc8-41c2-aee6-59bcf8b9567a\") " pod="tigera-operator/tigera-operator-5588576f44-4vvz9" Mar 12 04:48:39.281998 containerd[1510]: time="2026-03-12T04:48:39.281919155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-g55p2,Uid:27f2eb8e-ec33-43ff-879d-0b6447c0d686,Namespace:kube-system,Attempt:0,}" Mar 12 04:48:39.334090 containerd[1510]: time="2026-03-12T04:48:39.333825720Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:48:39.334535 containerd[1510]: time="2026-03-12T04:48:39.333999590Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:48:39.334535 containerd[1510]: time="2026-03-12T04:48:39.334229116Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:39.334535 containerd[1510]: time="2026-03-12T04:48:39.334475711Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:39.372836 systemd[1]: run-containerd-runc-k8s.io-16a8112ea5f5b6c45f421c1770eb3a7ead89b7d47e4f01c3eac4ca98a48b0be2-runc.rvKHnA.mount: Deactivated successfully. Mar 12 04:48:39.387493 systemd[1]: Started cri-containerd-16a8112ea5f5b6c45f421c1770eb3a7ead89b7d47e4f01c3eac4ca98a48b0be2.scope - libcontainer container 16a8112ea5f5b6c45f421c1770eb3a7ead89b7d47e4f01c3eac4ca98a48b0be2. 
Mar 12 04:48:39.441700 containerd[1510]: time="2026-03-12T04:48:39.441608552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-g55p2,Uid:27f2eb8e-ec33-43ff-879d-0b6447c0d686,Namespace:kube-system,Attempt:0,} returns sandbox id \"16a8112ea5f5b6c45f421c1770eb3a7ead89b7d47e4f01c3eac4ca98a48b0be2\"" Mar 12 04:48:39.460526 containerd[1510]: time="2026-03-12T04:48:39.460450556Z" level=info msg="CreateContainer within sandbox \"16a8112ea5f5b6c45f421c1770eb3a7ead89b7d47e4f01c3eac4ca98a48b0be2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 12 04:48:39.502900 containerd[1510]: time="2026-03-12T04:48:39.502708246Z" level=info msg="CreateContainer within sandbox \"16a8112ea5f5b6c45f421c1770eb3a7ead89b7d47e4f01c3eac4ca98a48b0be2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"223e8e635be0cc62d064c8e8e20381a9907cdc53dd316d706e29ebb6f66b68ae\"" Mar 12 04:48:39.505024 containerd[1510]: time="2026-03-12T04:48:39.504190223Z" level=info msg="StartContainer for \"223e8e635be0cc62d064c8e8e20381a9907cdc53dd316d706e29ebb6f66b68ae\"" Mar 12 04:48:39.561324 systemd[1]: Started cri-containerd-223e8e635be0cc62d064c8e8e20381a9907cdc53dd316d706e29ebb6f66b68ae.scope - libcontainer container 223e8e635be0cc62d064c8e8e20381a9907cdc53dd316d706e29ebb6f66b68ae. 
Mar 12 04:48:39.629547 containerd[1510]: time="2026-03-12T04:48:39.629278068Z" level=info msg="StartContainer for \"223e8e635be0cc62d064c8e8e20381a9907cdc53dd316d706e29ebb6f66b68ae\" returns successfully" Mar 12 04:48:39.857155 kubelet[2694]: I0312 04:48:39.856205 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-g55p2" podStartSLOduration=1.8561245579999999 podStartE2EDuration="1.856124558s" podCreationTimestamp="2026-03-12 04:48:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:48:39.855611055 +0000 UTC m=+6.344549396" watchObservedRunningTime="2026-03-12 04:48:39.856124558 +0000 UTC m=+6.345062886" Mar 12 04:48:40.350106 kubelet[2694]: E0312 04:48:40.349818 2694 projected.go:291] Couldn't get configMap tigera-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 04:48:40.350106 kubelet[2694]: E0312 04:48:40.349876 2694 projected.go:196] Error preparing data for projected volume kube-api-access-6rrhk for pod tigera-operator/tigera-operator-5588576f44-4vvz9: failed to sync configmap cache: timed out waiting for the condition Mar 12 04:48:40.350106 kubelet[2694]: E0312 04:48:40.350016 2694 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/170c16d6-8bc8-41c2-aee6-59bcf8b9567a-kube-api-access-6rrhk podName:170c16d6-8bc8-41c2-aee6-59bcf8b9567a nodeName:}" failed. No retries permitted until 2026-03-12 04:48:40.849970618 +0000 UTC m=+7.338908936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6rrhk" (UniqueName: "kubernetes.io/projected/170c16d6-8bc8-41c2-aee6-59bcf8b9567a-kube-api-access-6rrhk") pod "tigera-operator-5588576f44-4vvz9" (UID: "170c16d6-8bc8-41c2-aee6-59bcf8b9567a") : failed to sync configmap cache: timed out waiting for the condition Mar 12 04:48:41.029739 containerd[1510]: time="2026-03-12T04:48:41.029488612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-4vvz9,Uid:170c16d6-8bc8-41c2-aee6-59bcf8b9567a,Namespace:tigera-operator,Attempt:0,}" Mar 12 04:48:41.082263 containerd[1510]: time="2026-03-12T04:48:41.082104700Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:48:41.082263 containerd[1510]: time="2026-03-12T04:48:41.082205636Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:48:41.082263 containerd[1510]: time="2026-03-12T04:48:41.082231559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:41.085100 containerd[1510]: time="2026-03-12T04:48:41.082547566Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:41.116352 systemd[1]: Started cri-containerd-e581f5dde9789f9e7f6a9a29ac962cc99cdea86467c93e11dd58dd28e17dfbf4.scope - libcontainer container e581f5dde9789f9e7f6a9a29ac962cc99cdea86467c93e11dd58dd28e17dfbf4. 
Mar 12 04:48:41.180535 containerd[1510]: time="2026-03-12T04:48:41.180473581Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-4vvz9,Uid:170c16d6-8bc8-41c2-aee6-59bcf8b9567a,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e581f5dde9789f9e7f6a9a29ac962cc99cdea86467c93e11dd58dd28e17dfbf4\"" Mar 12 04:48:41.184580 containerd[1510]: time="2026-03-12T04:48:41.184466050Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 12 04:48:41.188236 systemd[1]: run-containerd-runc-k8s.io-e581f5dde9789f9e7f6a9a29ac962cc99cdea86467c93e11dd58dd28e17dfbf4-runc.r5CcFz.mount: Deactivated successfully. Mar 12 04:48:43.826455 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1288639492.mount: Deactivated successfully. Mar 12 04:48:46.183516 containerd[1510]: time="2026-03-12T04:48:46.182812148Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:48:46.184324 containerd[1510]: time="2026-03-12T04:48:46.184155219Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 12 04:48:46.185188 containerd[1510]: time="2026-03-12T04:48:46.185124252Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:48:46.189932 containerd[1510]: time="2026-03-12T04:48:46.189781584Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:48:46.191122 containerd[1510]: time="2026-03-12T04:48:46.191076884Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo 
digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 5.006460604s" Mar 12 04:48:46.192065 containerd[1510]: time="2026-03-12T04:48:46.191130149Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 12 04:48:46.200661 containerd[1510]: time="2026-03-12T04:48:46.200607911Z" level=info msg="CreateContainer within sandbox \"e581f5dde9789f9e7f6a9a29ac962cc99cdea86467c93e11dd58dd28e17dfbf4\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 12 04:48:46.216956 containerd[1510]: time="2026-03-12T04:48:46.216729180Z" level=info msg="CreateContainer within sandbox \"e581f5dde9789f9e7f6a9a29ac962cc99cdea86467c93e11dd58dd28e17dfbf4\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b843303832a051ebaa59b529caf8908b6432f462b331ac6330f1c9403ff37061\"" Mar 12 04:48:46.218885 containerd[1510]: time="2026-03-12T04:48:46.217950700Z" level=info msg="StartContainer for \"b843303832a051ebaa59b529caf8908b6432f462b331ac6330f1c9403ff37061\"" Mar 12 04:48:46.264290 systemd[1]: Started cri-containerd-b843303832a051ebaa59b529caf8908b6432f462b331ac6330f1c9403ff37061.scope - libcontainer container b843303832a051ebaa59b529caf8908b6432f462b331ac6330f1c9403ff37061. Mar 12 04:48:46.313015 containerd[1510]: time="2026-03-12T04:48:46.312945402Z" level=info msg="StartContainer for \"b843303832a051ebaa59b529caf8908b6432f462b331ac6330f1c9403ff37061\" returns successfully" Mar 12 04:48:52.514117 sudo[1776]: pam_unix(sudo:session): session closed for user root Mar 12 04:48:52.614193 sshd[1773]: pam_unix(sshd:session): session closed for user core Mar 12 04:48:52.624830 systemd[1]: sshd@8-10.230.23.190:22-20.161.92.111:49490.service: Deactivated successfully. Mar 12 04:48:52.632313 systemd[1]: session-11.scope: Deactivated successfully. 
Mar 12 04:48:52.633582 systemd[1]: session-11.scope: Consumed 8.139s CPU time, 155.0M memory peak, 0B memory swap peak.
Mar 12 04:48:52.639499 systemd-logind[1487]: Session 11 logged out. Waiting for processes to exit.
Mar 12 04:48:52.643777 systemd-logind[1487]: Removed session 11.
Mar 12 04:48:56.724458 kubelet[2694]: I0312 04:48:56.724345 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-4vvz9" podStartSLOduration=12.71407761 podStartE2EDuration="17.724317693s" podCreationTimestamp="2026-03-12 04:48:39 +0000 UTC" firstStartedPulling="2026-03-12 04:48:41.183683709 +0000 UTC m=+7.672622025" lastFinishedPulling="2026-03-12 04:48:46.193923785 +0000 UTC m=+12.682862108" observedRunningTime="2026-03-12 04:48:46.868003201 +0000 UTC m=+13.356941546" watchObservedRunningTime="2026-03-12 04:48:56.724317693 +0000 UTC m=+23.213256020"
Mar 12 04:48:56.743266 systemd[1]: Created slice kubepods-besteffort-pod7b377add_b2bb_4984_bb57_fdc2d8e7a7ae.slice - libcontainer container kubepods-besteffort-pod7b377add_b2bb_4984_bb57_fdc2d8e7a7ae.slice.
Mar 12 04:48:56.764220 kubelet[2694]: I0312 04:48:56.763430 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b377add-b2bb-4984-bb57-fdc2d8e7a7ae-tigera-ca-bundle\") pod \"calico-typha-f5587448-9q6kj\" (UID: \"7b377add-b2bb-4984-bb57-fdc2d8e7a7ae\") " pod="calico-system/calico-typha-f5587448-9q6kj"
Mar 12 04:48:56.764220 kubelet[2694]: I0312 04:48:56.763642 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7b377add-b2bb-4984-bb57-fdc2d8e7a7ae-typha-certs\") pod \"calico-typha-f5587448-9q6kj\" (UID: \"7b377add-b2bb-4984-bb57-fdc2d8e7a7ae\") " pod="calico-system/calico-typha-f5587448-9q6kj"
Mar 12 04:48:56.764220 kubelet[2694]: I0312 04:48:56.763888 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kscf\" (UniqueName: \"kubernetes.io/projected/7b377add-b2bb-4984-bb57-fdc2d8e7a7ae-kube-api-access-2kscf\") pod \"calico-typha-f5587448-9q6kj\" (UID: \"7b377add-b2bb-4984-bb57-fdc2d8e7a7ae\") " pod="calico-system/calico-typha-f5587448-9q6kj"
Mar 12 04:48:57.048374 systemd[1]: Created slice kubepods-besteffort-pod66efc577_1b61_4931_a1c1_bc59ceeac9cf.slice - libcontainer container kubepods-besteffort-pod66efc577_1b61_4931_a1c1_bc59ceeac9cf.slice.
Mar 12 04:48:57.057560 containerd[1510]: time="2026-03-12T04:48:57.057437915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f5587448-9q6kj,Uid:7b377add-b2bb-4984-bb57-fdc2d8e7a7ae,Namespace:calico-system,Attempt:0,}"
Mar 12 04:48:57.066912 kubelet[2694]: I0312 04:48:57.066214 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-bpffs\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.066912 kubelet[2694]: I0312 04:48:57.066271 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-cni-net-dir\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.066912 kubelet[2694]: I0312 04:48:57.066302 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-nodeproc\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.066912 kubelet[2694]: I0312 04:48:57.066333 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-flexvol-driver-host\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.066912 kubelet[2694]: I0312 04:48:57.066376 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-cni-log-dir\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067262 kubelet[2694]: I0312 04:48:57.066405 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/66efc577-1b61-4931-a1c1-bc59ceeac9cf-node-certs\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067262 kubelet[2694]: I0312 04:48:57.066436 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-cni-bin-dir\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067262 kubelet[2694]: I0312 04:48:57.066475 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-var-lib-calico\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067262 kubelet[2694]: I0312 04:48:57.066503 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-sys-fs\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067262 kubelet[2694]: I0312 04:48:57.066527 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66efc577-1b61-4931-a1c1-bc59ceeac9cf-tigera-ca-bundle\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067875 kubelet[2694]: I0312 04:48:57.066551 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-var-run-calico\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067875 kubelet[2694]: I0312 04:48:57.066582 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-lib-modules\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067875 kubelet[2694]: I0312 04:48:57.066607 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-policysync\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067875 kubelet[2694]: I0312 04:48:57.066650 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/66efc577-1b61-4931-a1c1-bc59ceeac9cf-xtables-lock\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.067875 kubelet[2694]: I0312 04:48:57.066681 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf8zt\" (UniqueName: \"kubernetes.io/projected/66efc577-1b61-4931-a1c1-bc59ceeac9cf-kube-api-access-wf8zt\") pod \"calico-node-hk2n4\" (UID: \"66efc577-1b61-4931-a1c1-bc59ceeac9cf\") " pod="calico-system/calico-node-hk2n4"
Mar 12 04:48:57.157109 containerd[1510]: time="2026-03-12T04:48:57.156880672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:48:57.157109 containerd[1510]: time="2026-03-12T04:48:57.157006873Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:48:57.157109 containerd[1510]: time="2026-03-12T04:48:57.157059036Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:48:57.158688 containerd[1510]: time="2026-03-12T04:48:57.158598193Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:48:57.178447 kubelet[2694]: E0312 04:48:57.177306 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 04:48:57.178447 kubelet[2694]: W0312 04:48:57.177364 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 04:48:57.178447 kubelet[2694]: E0312 04:48:57.177420 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 04:48:57.244681 kubelet[2694]: E0312 04:48:57.243658 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:48:57.266312 systemd[1]: Started cri-containerd-8f1df9d52748eb3cd39b3b439107a8955272793429498fd42032b4e6b4a042df.scope - libcontainer container 8f1df9d52748eb3cd39b3b439107a8955272793429498fd42032b4e6b4a042df.
Mar 12 04:48:57.277650 kubelet[2694]: E0312 04:48:57.275352 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 04:48:57.277650 kubelet[2694]: W0312 04:48:57.275365 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 04:48:57.277650 kubelet[2694]: E0312 04:48:57.275379 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.277650 kubelet[2694]: E0312 04:48:57.275617 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.277650 kubelet[2694]: W0312 04:48:57.275631 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.277650 kubelet[2694]: E0312 04:48:57.275645 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.277650 kubelet[2694]: E0312 04:48:57.275903 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.277650 kubelet[2694]: W0312 04:48:57.275918 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.275934 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.276192 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.281100 kubelet[2694]: W0312 04:48:57.276218 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.276232 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.278867 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.281100 kubelet[2694]: W0312 04:48:57.278949 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.278969 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.279277 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.281100 kubelet[2694]: W0312 04:48:57.279324 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.281100 kubelet[2694]: E0312 04:48:57.279346 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.286725 kubelet[2694]: E0312 04:48:57.286217 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.286725 kubelet[2694]: W0312 04:48:57.286244 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.286725 kubelet[2694]: E0312 04:48:57.286373 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.286725 kubelet[2694]: I0312 04:48:57.286675 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa7a451e-0708-48d2-97cd-fdb82e83df3d-kubelet-dir\") pod \"csi-node-driver-xdpsf\" (UID: \"fa7a451e-0708-48d2-97cd-fdb82e83df3d\") " pod="calico-system/csi-node-driver-xdpsf" Mar 12 04:48:57.288311 kubelet[2694]: E0312 04:48:57.287891 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.288311 kubelet[2694]: W0312 04:48:57.287913 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.288311 kubelet[2694]: E0312 04:48:57.287931 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.288311 kubelet[2694]: I0312 04:48:57.287968 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fa7a451e-0708-48d2-97cd-fdb82e83df3d-registration-dir\") pod \"csi-node-driver-xdpsf\" (UID: \"fa7a451e-0708-48d2-97cd-fdb82e83df3d\") " pod="calico-system/csi-node-driver-xdpsf" Mar 12 04:48:57.290557 kubelet[2694]: E0312 04:48:57.290223 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.290557 kubelet[2694]: W0312 04:48:57.290258 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.290557 kubelet[2694]: E0312 04:48:57.290277 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.291494 kubelet[2694]: E0312 04:48:57.291227 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.291635 kubelet[2694]: W0312 04:48:57.291610 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.291950 kubelet[2694]: E0312 04:48:57.291741 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.293776 kubelet[2694]: E0312 04:48:57.292509 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.293776 kubelet[2694]: W0312 04:48:57.292545 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.293776 kubelet[2694]: E0312 04:48:57.292564 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.294703 kubelet[2694]: E0312 04:48:57.294499 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.294703 kubelet[2694]: W0312 04:48:57.294521 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.294703 kubelet[2694]: E0312 04:48:57.294540 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.295420 kubelet[2694]: E0312 04:48:57.295227 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.295420 kubelet[2694]: W0312 04:48:57.295264 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.295420 kubelet[2694]: E0312 04:48:57.295283 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.295420 kubelet[2694]: I0312 04:48:57.295347 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvfn4\" (UniqueName: \"kubernetes.io/projected/fa7a451e-0708-48d2-97cd-fdb82e83df3d-kube-api-access-hvfn4\") pod \"csi-node-driver-xdpsf\" (UID: \"fa7a451e-0708-48d2-97cd-fdb82e83df3d\") " pod="calico-system/csi-node-driver-xdpsf" Mar 12 04:48:57.296296 kubelet[2694]: E0312 04:48:57.295995 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.296296 kubelet[2694]: W0312 04:48:57.296015 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.296296 kubelet[2694]: E0312 04:48:57.296067 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.296296 kubelet[2694]: I0312 04:48:57.296095 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fa7a451e-0708-48d2-97cd-fdb82e83df3d-varrun\") pod \"csi-node-driver-xdpsf\" (UID: \"fa7a451e-0708-48d2-97cd-fdb82e83df3d\") " pod="calico-system/csi-node-driver-xdpsf" Mar 12 04:48:57.297377 kubelet[2694]: E0312 04:48:57.296984 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.297377 kubelet[2694]: W0312 04:48:57.297005 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.297377 kubelet[2694]: E0312 04:48:57.297214 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.298380 kubelet[2694]: E0312 04:48:57.297954 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.298380 kubelet[2694]: W0312 04:48:57.297973 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.298380 kubelet[2694]: E0312 04:48:57.298079 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.300210 kubelet[2694]: E0312 04:48:57.298982 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.300210 kubelet[2694]: W0312 04:48:57.299001 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.300671 kubelet[2694]: E0312 04:48:57.299019 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.300671 kubelet[2694]: I0312 04:48:57.300485 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fa7a451e-0708-48d2-97cd-fdb82e83df3d-socket-dir\") pod \"csi-node-driver-xdpsf\" (UID: \"fa7a451e-0708-48d2-97cd-fdb82e83df3d\") " pod="calico-system/csi-node-driver-xdpsf" Mar 12 04:48:57.303488 kubelet[2694]: E0312 04:48:57.303334 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.304939 kubelet[2694]: W0312 04:48:57.303363 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.310733 kubelet[2694]: E0312 04:48:57.308659 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.313532 kubelet[2694]: E0312 04:48:57.313502 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.313899 kubelet[2694]: W0312 04:48:57.313859 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.314094 kubelet[2694]: E0312 04:48:57.314028 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.316066 kubelet[2694]: E0312 04:48:57.316027 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.316362 kubelet[2694]: W0312 04:48:57.316338 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.317201 kubelet[2694]: E0312 04:48:57.317178 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.319081 kubelet[2694]: E0312 04:48:57.318519 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.319081 kubelet[2694]: W0312 04:48:57.318956 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.319081 kubelet[2694]: E0312 04:48:57.318977 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.320621 kubelet[2694]: E0312 04:48:57.320408 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.320621 kubelet[2694]: W0312 04:48:57.320433 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.320621 kubelet[2694]: E0312 04:48:57.320452 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.357321 containerd[1510]: time="2026-03-12T04:48:57.357258019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hk2n4,Uid:66efc577-1b61-4931-a1c1-bc59ceeac9cf,Namespace:calico-system,Attempt:0,}" Mar 12 04:48:57.401883 kubelet[2694]: E0312 04:48:57.401696 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.401883 kubelet[2694]: W0312 04:48:57.401732 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.401883 kubelet[2694]: E0312 04:48:57.401817 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.405285 kubelet[2694]: E0312 04:48:57.404123 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.405285 kubelet[2694]: W0312 04:48:57.404146 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.405285 kubelet[2694]: E0312 04:48:57.404165 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.406413 kubelet[2694]: E0312 04:48:57.405881 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.406413 kubelet[2694]: W0312 04:48:57.405912 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.406413 kubelet[2694]: E0312 04:48:57.405930 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.406626 kubelet[2694]: E0312 04:48:57.406500 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.406626 kubelet[2694]: W0312 04:48:57.406516 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.406626 kubelet[2694]: E0312 04:48:57.406532 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.409590 kubelet[2694]: E0312 04:48:57.408719 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.409590 kubelet[2694]: W0312 04:48:57.408747 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.409590 kubelet[2694]: E0312 04:48:57.408770 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.409590 kubelet[2694]: E0312 04:48:57.409358 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.409590 kubelet[2694]: W0312 04:48:57.409375 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.409590 kubelet[2694]: E0312 04:48:57.409393 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.410714 kubelet[2694]: E0312 04:48:57.409918 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.410714 kubelet[2694]: W0312 04:48:57.409933 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.410714 kubelet[2694]: E0312 04:48:57.409953 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.410837 kubelet[2694]: E0312 04:48:57.410823 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.410915 kubelet[2694]: W0312 04:48:57.410839 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.410915 kubelet[2694]: E0312 04:48:57.410865 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.412726 kubelet[2694]: E0312 04:48:57.411622 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.412726 kubelet[2694]: W0312 04:48:57.411646 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.412726 kubelet[2694]: E0312 04:48:57.411663 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.413330 kubelet[2694]: E0312 04:48:57.412870 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.413330 kubelet[2694]: W0312 04:48:57.412885 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.413330 kubelet[2694]: E0312 04:48:57.412916 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.414572 kubelet[2694]: E0312 04:48:57.414072 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.414572 kubelet[2694]: W0312 04:48:57.414107 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.414572 kubelet[2694]: E0312 04:48:57.414126 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.415262 kubelet[2694]: E0312 04:48:57.414971 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.415262 kubelet[2694]: W0312 04:48:57.415002 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.415262 kubelet[2694]: E0312 04:48:57.415020 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.417617 kubelet[2694]: E0312 04:48:57.417583 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.417617 kubelet[2694]: W0312 04:48:57.417606 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.417759 kubelet[2694]: E0312 04:48:57.417625 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.418434 kubelet[2694]: E0312 04:48:57.418387 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.418434 kubelet[2694]: W0312 04:48:57.418408 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.418434 kubelet[2694]: E0312 04:48:57.418426 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.420352 kubelet[2694]: E0312 04:48:57.419942 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.420352 kubelet[2694]: W0312 04:48:57.419964 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.420352 kubelet[2694]: E0312 04:48:57.419982 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.421331 kubelet[2694]: E0312 04:48:57.420355 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.421331 kubelet[2694]: W0312 04:48:57.420370 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.421331 kubelet[2694]: E0312 04:48:57.420386 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.421331 kubelet[2694]: E0312 04:48:57.420719 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.421331 kubelet[2694]: W0312 04:48:57.420734 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.421331 kubelet[2694]: E0312 04:48:57.420750 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.423415 kubelet[2694]: E0312 04:48:57.422532 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.423415 kubelet[2694]: W0312 04:48:57.422551 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.423415 kubelet[2694]: E0312 04:48:57.422567 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.424611 kubelet[2694]: E0312 04:48:57.424064 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.424611 kubelet[2694]: W0312 04:48:57.424088 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.424611 kubelet[2694]: E0312 04:48:57.424107 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.425399 kubelet[2694]: E0312 04:48:57.424814 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.425399 kubelet[2694]: W0312 04:48:57.424842 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.425399 kubelet[2694]: E0312 04:48:57.424860 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.427012 kubelet[2694]: E0312 04:48:57.426142 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.427012 kubelet[2694]: W0312 04:48:57.426164 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.427012 kubelet[2694]: E0312 04:48:57.426180 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.427012 kubelet[2694]: E0312 04:48:57.426873 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.427012 kubelet[2694]: W0312 04:48:57.426888 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.427012 kubelet[2694]: E0312 04:48:57.426918 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.428797 kubelet[2694]: E0312 04:48:57.427990 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.428797 kubelet[2694]: W0312 04:48:57.428007 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.428797 kubelet[2694]: E0312 04:48:57.428023 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.429392 kubelet[2694]: E0312 04:48:57.429199 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.429392 kubelet[2694]: W0312 04:48:57.429215 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.429392 kubelet[2694]: E0312 04:48:57.429231 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:48:57.431667 kubelet[2694]: E0312 04:48:57.430215 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.431667 kubelet[2694]: W0312 04:48:57.430235 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.431667 kubelet[2694]: E0312 04:48:57.430253 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.451296 containerd[1510]: time="2026-03-12T04:48:57.450737675Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:48:57.451296 containerd[1510]: time="2026-03-12T04:48:57.450844127Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:48:57.451296 containerd[1510]: time="2026-03-12T04:48:57.450880609Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:57.451296 containerd[1510]: time="2026-03-12T04:48:57.451146256Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:48:57.463470 kubelet[2694]: E0312 04:48:57.462446 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:48:57.463470 kubelet[2694]: W0312 04:48:57.462482 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:48:57.463470 kubelet[2694]: E0312 04:48:57.462517 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:48:57.499246 systemd[1]: Started cri-containerd-02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82.scope - libcontainer container 02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82. Mar 12 04:48:57.557883 containerd[1510]: time="2026-03-12T04:48:57.556513026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-f5587448-9q6kj,Uid:7b377add-b2bb-4984-bb57-fdc2d8e7a7ae,Namespace:calico-system,Attempt:0,} returns sandbox id \"8f1df9d52748eb3cd39b3b439107a8955272793429498fd42032b4e6b4a042df\"" Mar 12 04:48:57.584231 containerd[1510]: time="2026-03-12T04:48:57.584143339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hk2n4,Uid:66efc577-1b61-4931-a1c1-bc59ceeac9cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\"" Mar 12 04:48:57.587563 containerd[1510]: time="2026-03-12T04:48:57.587531223Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 12 04:48:58.732525 kubelet[2694]: E0312 04:48:58.731637 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d" Mar 12 04:48:59.861211 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3622233392.mount: Deactivated successfully. Mar 12 04:49:00.733711 kubelet[2694]: E0312 04:49:00.731815 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d" Mar 12 04:49:00.988816 containerd[1510]: time="2026-03-12T04:49:00.988456631Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:00.990287 containerd[1510]: time="2026-03-12T04:49:00.990023963Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 12 04:49:00.991060 containerd[1510]: time="2026-03-12T04:49:00.990706637Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:01.001876 containerd[1510]: time="2026-03-12T04:49:01.001656913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:01.004587 containerd[1510]: time="2026-03-12T04:49:01.003483130Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.415903038s" Mar 12 04:49:01.004587 containerd[1510]: time="2026-03-12T04:49:01.003534800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 12 04:49:01.008517 containerd[1510]: time="2026-03-12T04:49:01.008108586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 12 04:49:01.046674 containerd[1510]: time="2026-03-12T04:49:01.046511866Z" level=info msg="CreateContainer within sandbox \"8f1df9d52748eb3cd39b3b439107a8955272793429498fd42032b4e6b4a042df\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 12 04:49:01.063017 containerd[1510]: time="2026-03-12T04:49:01.062796386Z" level=info msg="CreateContainer within sandbox \"8f1df9d52748eb3cd39b3b439107a8955272793429498fd42032b4e6b4a042df\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"adc2587a534a7080aa1f0db4b1a974706368bf6c1c91e492cff6f2afccb0aa60\"" Mar 12 04:49:01.065463 containerd[1510]: time="2026-03-12T04:49:01.065418141Z" level=info msg="StartContainer for \"adc2587a534a7080aa1f0db4b1a974706368bf6c1c91e492cff6f2afccb0aa60\"" Mar 12 04:49:01.119316 systemd[1]: Started cri-containerd-adc2587a534a7080aa1f0db4b1a974706368bf6c1c91e492cff6f2afccb0aa60.scope - libcontainer container adc2587a534a7080aa1f0db4b1a974706368bf6c1c91e492cff6f2afccb0aa60. 
Mar 12 04:49:01.197606 containerd[1510]: time="2026-03-12T04:49:01.197309776Z" level=info msg="StartContainer for \"adc2587a534a7080aa1f0db4b1a974706368bf6c1c91e492cff6f2afccb0aa60\" returns successfully" Mar 12 04:49:02.022250 kubelet[2694]: E0312 04:49:02.021932 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.022250 kubelet[2694]: W0312 04:49:02.021978 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.022250 kubelet[2694]: E0312 04:49:02.022015 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.023197 kubelet[2694]: E0312 04:49:02.023141 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.023294 kubelet[2694]: W0312 04:49:02.023202 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.023294 kubelet[2694]: E0312 04:49:02.023223 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.023636 kubelet[2694]: E0312 04:49:02.023595 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.023636 kubelet[2694]: W0312 04:49:02.023628 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.023750 kubelet[2694]: E0312 04:49:02.023647 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.024159 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.027546 kubelet[2694]: W0312 04:49:02.024187 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.024228 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.024920 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.027546 kubelet[2694]: W0312 04:49:02.024937 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.024954 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.025341 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.027546 kubelet[2694]: W0312 04:49:02.025356 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.025372 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.027546 kubelet[2694]: E0312 04:49:02.025710 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029088 kubelet[2694]: W0312 04:49:02.025725 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029088 kubelet[2694]: E0312 04:49:02.025740 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.029088 kubelet[2694]: E0312 04:49:02.026047 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029088 kubelet[2694]: W0312 04:49:02.026063 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029088 kubelet[2694]: E0312 04:49:02.026078 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.029088 kubelet[2694]: E0312 04:49:02.026457 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029088 kubelet[2694]: W0312 04:49:02.026473 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029088 kubelet[2694]: E0312 04:49:02.026488 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.029088 kubelet[2694]: E0312 04:49:02.026785 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029088 kubelet[2694]: W0312 04:49:02.026800 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.026815 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.027138 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029593 kubelet[2694]: W0312 04:49:02.027153 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.027176 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.027446 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029593 kubelet[2694]: W0312 04:49:02.027461 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.027475 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.027751 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.029593 kubelet[2694]: W0312 04:49:02.027766 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.029593 kubelet[2694]: E0312 04:49:02.027780 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.030212 kubelet[2694]: E0312 04:49:02.028088 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.030212 kubelet[2694]: W0312 04:49:02.028103 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.030212 kubelet[2694]: E0312 04:49:02.028117 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.030212 kubelet[2694]: E0312 04:49:02.028476 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.030212 kubelet[2694]: W0312 04:49:02.028492 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.030212 kubelet[2694]: E0312 04:49:02.028508 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.054635 kubelet[2694]: E0312 04:49:02.054580 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.054945 kubelet[2694]: W0312 04:49:02.054622 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.054945 kubelet[2694]: E0312 04:49:02.054685 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.057342 kubelet[2694]: E0312 04:49:02.055127 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.057342 kubelet[2694]: W0312 04:49:02.055151 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.057342 kubelet[2694]: E0312 04:49:02.055169 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.057812 kubelet[2694]: E0312 04:49:02.057788 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.058695 kubelet[2694]: W0312 04:49:02.057919 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.058695 kubelet[2694]: E0312 04:49:02.057982 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.058695 kubelet[2694]: E0312 04:49:02.058538 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.058695 kubelet[2694]: W0312 04:49:02.058564 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.058695 kubelet[2694]: E0312 04:49:02.058583 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.059185 kubelet[2694]: E0312 04:49:02.059165 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.059574 kubelet[2694]: W0312 04:49:02.059311 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.059574 kubelet[2694]: E0312 04:49:02.059333 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.060327 kubelet[2694]: E0312 04:49:02.060288 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.060484 kubelet[2694]: W0312 04:49:02.060460 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.060634 kubelet[2694]: E0312 04:49:02.060612 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.061189 kubelet[2694]: E0312 04:49:02.061170 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.061548 kubelet[2694]: W0312 04:49:02.061289 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.061548 kubelet[2694]: E0312 04:49:02.061315 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.062154 kubelet[2694]: E0312 04:49:02.062134 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.062335 kubelet[2694]: W0312 04:49:02.062267 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.062335 kubelet[2694]: E0312 04:49:02.062292 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.063053 kubelet[2694]: E0312 04:49:02.062885 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.063053 kubelet[2694]: W0312 04:49:02.062904 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.063053 kubelet[2694]: E0312 04:49:02.062920 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.063452 kubelet[2694]: E0312 04:49:02.063297 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.063452 kubelet[2694]: W0312 04:49:02.063316 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.063452 kubelet[2694]: E0312 04:49:02.063352 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.064161 kubelet[2694]: E0312 04:49:02.063947 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.064161 kubelet[2694]: W0312 04:49:02.063966 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.064161 kubelet[2694]: E0312 04:49:02.063982 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.064770 kubelet[2694]: E0312 04:49:02.064514 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.064770 kubelet[2694]: W0312 04:49:02.064528 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.064770 kubelet[2694]: E0312 04:49:02.064544 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.065235 kubelet[2694]: E0312 04:49:02.065181 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.065235 kubelet[2694]: W0312 04:49:02.065195 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.065235 kubelet[2694]: E0312 04:49:02.065210 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:49:02.065722 kubelet[2694]: E0312 04:49:02.065685 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.065722 kubelet[2694]: W0312 04:49:02.065713 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.065849 kubelet[2694]: E0312 04:49:02.065731 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:49:02.066090 kubelet[2694]: E0312 04:49:02.066019 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:49:02.066090 kubelet[2694]: W0312 04:49:02.066074 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:49:02.066194 kubelet[2694]: E0312 04:49:02.066094 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 12 04:49:02.067868 kubelet[2694]: E0312 04:49:02.066486 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 04:49:02.067868 kubelet[2694]: W0312 04:49:02.066503 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 04:49:02.067868 kubelet[2694]: E0312 04:49:02.066520 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 04:49:02.067868 kubelet[2694]: E0312 04:49:02.066806 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 04:49:02.067868 kubelet[2694]: W0312 04:49:02.066820 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 04:49:02.067868 kubelet[2694]: E0312 04:49:02.066835 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 04:49:02.067868 kubelet[2694]: E0312 04:49:02.067623 2694 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 04:49:02.067868 kubelet[2694]: W0312 04:49:02.067638 2694 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 04:49:02.067868 kubelet[2694]: E0312 04:49:02.067653 2694 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 04:49:02.558079 containerd[1510]: time="2026-03-12T04:49:02.557792387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:02.559253 containerd[1510]: time="2026-03-12T04:49:02.559063397Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250"
Mar 12 04:49:02.560133 containerd[1510]: time="2026-03-12T04:49:02.560094684Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:02.563860 containerd[1510]: time="2026-03-12T04:49:02.563783435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:02.565069 containerd[1510]: time="2026-03-12T04:49:02.564845794Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.556652429s"
Mar 12 04:49:02.565214 containerd[1510]: time="2026-03-12T04:49:02.565177011Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\""
Mar 12 04:49:02.573055 containerd[1510]: time="2026-03-12T04:49:02.572914578Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 12 04:49:02.600383 containerd[1510]: time="2026-03-12T04:49:02.600211071Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78\""
Mar 12 04:49:02.603759 containerd[1510]: time="2026-03-12T04:49:02.602306087Z" level=info msg="StartContainer for \"795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78\""
Mar 12 04:49:02.657286 systemd[1]: Started cri-containerd-795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78.scope - libcontainer container 795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78.
Mar 12 04:49:02.712534 containerd[1510]: time="2026-03-12T04:49:02.712134601Z" level=info msg="StartContainer for \"795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78\" returns successfully"
Mar 12 04:49:02.731266 kubelet[2694]: E0312 04:49:02.731200 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:02.739709 systemd[1]: cri-containerd-795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78.scope: Deactivated successfully.
Mar 12 04:49:02.815355 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78-rootfs.mount: Deactivated successfully.
Mar 12 04:49:02.924784 kubelet[2694]: I0312 04:49:02.924733 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 04:49:02.967832 kubelet[2694]: I0312 04:49:02.965655 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-f5587448-9q6kj" podStartSLOduration=3.54639954 podStartE2EDuration="6.965604501s" podCreationTimestamp="2026-03-12 04:48:56 +0000 UTC" firstStartedPulling="2026-03-12 04:48:57.586106524 +0000 UTC m=+24.075044845" lastFinishedPulling="2026-03-12 04:49:01.005311478 +0000 UTC m=+27.494249806" observedRunningTime="2026-03-12 04:49:01.939433185 +0000 UTC m=+28.428371523" watchObservedRunningTime="2026-03-12 04:49:02.965604501 +0000 UTC m=+29.454542830"
Mar 12 04:49:03.013621 containerd[1510]: time="2026-03-12T04:49:02.967512779Z" level=info msg="shim disconnected" id=795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78 namespace=k8s.io
Mar 12 04:49:03.013621 containerd[1510]: time="2026-03-12T04:49:03.013252276Z" level=warning msg="cleaning up after shim disconnected" id=795910d24c1123ad267cdf064c10817e356c96992be33c0cb7dd035d9100ed78 namespace=k8s.io
Mar 12 04:49:03.013621 containerd[1510]: time="2026-03-12T04:49:03.013291535Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 12 04:49:03.931461 containerd[1510]: time="2026-03-12T04:49:03.930971370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Mar 12 04:49:04.732318 kubelet[2694]: E0312 04:49:04.731971 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:05.674645 kubelet[2694]: I0312 04:49:05.673898 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 04:49:06.731670 kubelet[2694]: E0312 04:49:06.731579 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:08.731902 kubelet[2694]: E0312 04:49:08.731388 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:10.731207 kubelet[2694]: E0312 04:49:10.731097 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:12.732467 kubelet[2694]: E0312 04:49:12.731844 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:14.132866 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2707536592.mount: Deactivated successfully.
Mar 12 04:49:14.189722 containerd[1510]: time="2026-03-12T04:49:14.189450883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:14.191471 containerd[1510]: time="2026-03-12T04:49:14.191359369Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564"
Mar 12 04:49:14.192779 containerd[1510]: time="2026-03-12T04:49:14.192700018Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:14.196838 containerd[1510]: time="2026-03-12T04:49:14.196694493Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:14.198184 containerd[1510]: time="2026-03-12T04:49:14.197723300Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 10.266687372s"
Mar 12 04:49:14.198184 containerd[1510]: time="2026-03-12T04:49:14.197785793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\""
Mar 12 04:49:14.211940 containerd[1510]: time="2026-03-12T04:49:14.211864727Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 12 04:49:14.239707 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2118781741.mount: Deactivated successfully.
Mar 12 04:49:14.242437 containerd[1510]: time="2026-03-12T04:49:14.242332234Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad\""
Mar 12 04:49:14.246068 containerd[1510]: time="2026-03-12T04:49:14.245288495Z" level=info msg="StartContainer for \"54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad\""
Mar 12 04:49:14.308649 systemd[1]: Started cri-containerd-54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad.scope - libcontainer container 54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad.
Mar 12 04:49:14.376436 containerd[1510]: time="2026-03-12T04:49:14.376351496Z" level=info msg="StartContainer for \"54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad\" returns successfully"
Mar 12 04:49:14.574493 systemd[1]: cri-containerd-54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad.scope: Deactivated successfully.
Mar 12 04:49:14.722156 containerd[1510]: time="2026-03-12T04:49:14.721625719Z" level=info msg="shim disconnected" id=54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad namespace=k8s.io
Mar 12 04:49:14.722156 containerd[1510]: time="2026-03-12T04:49:14.721739115Z" level=warning msg="cleaning up after shim disconnected" id=54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad namespace=k8s.io
Mar 12 04:49:14.722156 containerd[1510]: time="2026-03-12T04:49:14.721760550Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 12 04:49:14.731759 kubelet[2694]: E0312 04:49:14.731595 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:14.975255 containerd[1510]: time="2026-03-12T04:49:14.973865114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 12 04:49:15.132507 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-54ca03a73b303d01a4ca55b3ec9bc0ada0163d01c58dfa3608640124cc70e2ad-rootfs.mount: Deactivated successfully.
Mar 12 04:49:16.732611 kubelet[2694]: E0312 04:49:16.731978 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:18.731372 kubelet[2694]: E0312 04:49:18.731257 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:19.653624 containerd[1510]: time="2026-03-12T04:49:19.653532294Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:19.656397 containerd[1510]: time="2026-03-12T04:49:19.656343863Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671"
Mar 12 04:49:19.658103 containerd[1510]: time="2026-03-12T04:49:19.657753520Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:19.660680 containerd[1510]: time="2026-03-12T04:49:19.660618358Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:49:19.662199 containerd[1510]: time="2026-03-12T04:49:19.661986027Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.686881605s"
Mar 12 04:49:19.662199 containerd[1510]: time="2026-03-12T04:49:19.662046865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\""
Mar 12 04:49:19.670368 containerd[1510]: time="2026-03-12T04:49:19.670320763Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 12 04:49:19.695653 containerd[1510]: time="2026-03-12T04:49:19.695571060Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6\""
Mar 12 04:49:19.698856 containerd[1510]: time="2026-03-12T04:49:19.698733388Z" level=info msg="StartContainer for \"12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6\""
Mar 12 04:49:19.759359 systemd[1]: Started cri-containerd-12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6.scope - libcontainer container 12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6.
Mar 12 04:49:19.821785 containerd[1510]: time="2026-03-12T04:49:19.821655507Z" level=info msg="StartContainer for \"12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6\" returns successfully"
Mar 12 04:49:20.731822 kubelet[2694]: E0312 04:49:20.731666 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xdpsf" podUID="fa7a451e-0708-48d2-97cd-fdb82e83df3d"
Mar 12 04:49:20.816588 systemd[1]: cri-containerd-12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6.scope: Deactivated successfully.
Mar 12 04:49:20.858535 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6-rootfs.mount: Deactivated successfully.
Mar 12 04:49:20.936371 kubelet[2694]: I0312 04:49:20.878163 2694 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Mar 12 04:49:20.941068 containerd[1510]: time="2026-03-12T04:49:20.940962261Z" level=info msg="shim disconnected" id=12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6 namespace=k8s.io
Mar 12 04:49:20.942309 containerd[1510]: time="2026-03-12T04:49:20.941068421Z" level=warning msg="cleaning up after shim disconnected" id=12ef8b6630962397fea87aaca23fb8b4afa51b25a3cc9489ce2fdae9d46cd8f6 namespace=k8s.io
Mar 12 04:49:20.942309 containerd[1510]: time="2026-03-12T04:49:20.941088054Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 12 04:49:21.036237 systemd[1]: Created slice kubepods-burstable-poddd0fb06c_f33e_47f9_b47a_58a6d0dd5bf9.slice - libcontainer container kubepods-burstable-poddd0fb06c_f33e_47f9_b47a_58a6d0dd5bf9.slice.
Mar 12 04:49:21.084525 systemd[1]: Created slice kubepods-besteffort-pod5d03062c_e112_45ee_991a_aaa27fc24b6c.slice - libcontainer container kubepods-besteffort-pod5d03062c_e112_45ee_991a_aaa27fc24b6c.slice.
Mar 12 04:49:21.087114 containerd[1510]: time="2026-03-12T04:49:21.087025396Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 12 04:49:21.100783 systemd[1]: Created slice kubepods-burstable-podb9bb1a7d_4427_4ab5_9819_9fb3c34a6da8.slice - libcontainer container kubepods-burstable-podb9bb1a7d_4427_4ab5_9819_9fb3c34a6da8.slice.
Mar 12 04:49:21.109281 kubelet[2694]: I0312 04:49:21.109010 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8-config-volume\") pod \"coredns-66bc5c9577-tpm4s\" (UID: \"b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8\") " pod="kube-system/coredns-66bc5c9577-tpm4s"
Mar 12 04:49:21.109281 kubelet[2694]: I0312 04:49:21.109089 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n2jj\" (UniqueName: \"kubernetes.io/projected/2e788078-9f19-4645-87ab-03575ffe2f01-kube-api-access-5n2jj\") pod \"calico-apiserver-6fcb6485c6-4gx67\" (UID: \"2e788078-9f19-4645-87ab-03575ffe2f01\") " pod="calico-system/calico-apiserver-6fcb6485c6-4gx67"
Mar 12 04:49:21.110074 kubelet[2694]: I0312 04:49:21.109733 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzkx\" (UniqueName: \"kubernetes.io/projected/2d76263c-902f-4005-a7fd-9d164ca77df9-kube-api-access-crzkx\") pod \"goldmane-cccfbd5cf-fn29j\" (UID: \"2d76263c-902f-4005-a7fd-9d164ca77df9\") " pod="calico-system/goldmane-cccfbd5cf-fn29j"
Mar 12 04:49:21.110074 kubelet[2694]: I0312 04:49:21.109793 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59gk6\" (UniqueName: \"kubernetes.io/projected/020ffbbb-c522-41bc-be16-1670cd72b7a8-kube-api-access-59gk6\") pod \"whisker-6c56c56549-c7w56\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " pod="calico-system/whisker-6c56c56549-c7w56"
Mar 12 04:49:21.110074 kubelet[2694]: I0312 04:49:21.109895 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-backend-key-pair\") pod \"whisker-6c56c56549-c7w56\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " pod="calico-system/whisker-6c56c56549-c7w56"
Mar 12 04:49:21.110074 kubelet[2694]: I0312 04:49:21.109946 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9-config-volume\") pod \"coredns-66bc5c9577-wd9b5\" (UID: \"dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9\") " pod="kube-system/coredns-66bc5c9577-wd9b5"
Mar 12 04:49:21.110074 kubelet[2694]: I0312 04:49:21.109983 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2e788078-9f19-4645-87ab-03575ffe2f01-calico-apiserver-certs\") pod \"calico-apiserver-6fcb6485c6-4gx67\" (UID: \"2e788078-9f19-4645-87ab-03575ffe2f01\") " pod="calico-system/calico-apiserver-6fcb6485c6-4gx67"
Mar 12 04:49:21.110319 kubelet[2694]: I0312 04:49:21.110012 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d76263c-902f-4005-a7fd-9d164ca77df9-config\") pod \"goldmane-cccfbd5cf-fn29j\" (UID: \"2d76263c-902f-4005-a7fd-9d164ca77df9\") " pod="calico-system/goldmane-cccfbd5cf-fn29j"
Mar 12 04:49:21.110319 kubelet[2694]: I0312 04:49:21.110060 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5847ada8-9805-4f41-8520-9af174205760-calico-apiserver-certs\") pod \"calico-apiserver-6fcb6485c6-mlm75\" (UID: \"5847ada8-9805-4f41-8520-9af174205760\") " pod="calico-system/calico-apiserver-6fcb6485c6-mlm75"
Mar 12 04:49:21.110319 kubelet[2694]: I0312 04:49:21.110095 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d03062c-e112-45ee-991a-aaa27fc24b6c-tigera-ca-bundle\") pod \"calico-kube-controllers-98bbf8757-vp827\" (UID: \"5d03062c-e112-45ee-991a-aaa27fc24b6c\") " pod="calico-system/calico-kube-controllers-98bbf8757-vp827"
Mar 12 04:49:21.110319 kubelet[2694]: I0312 04:49:21.110127 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqnt\" (UniqueName: \"kubernetes.io/projected/5d03062c-e112-45ee-991a-aaa27fc24b6c-kube-api-access-6mqnt\") pod \"calico-kube-controllers-98bbf8757-vp827\" (UID: \"5d03062c-e112-45ee-991a-aaa27fc24b6c\") " pod="calico-system/calico-kube-controllers-98bbf8757-vp827"
Mar 12 04:49:21.110319 kubelet[2694]: I0312 04:49:21.110173 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nltwz\" (UniqueName: \"kubernetes.io/projected/b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8-kube-api-access-nltwz\") pod \"coredns-66bc5c9577-tpm4s\" (UID: \"b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8\") " pod="kube-system/coredns-66bc5c9577-tpm4s"
Mar 12 04:49:21.110570 kubelet[2694]: I0312 04:49:21.110211 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2d76263c-902f-4005-a7fd-9d164ca77df9-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-fn29j\" (UID: \"2d76263c-902f-4005-a7fd-9d164ca77df9\") " pod="calico-system/goldmane-cccfbd5cf-fn29j"
Mar 12 04:49:21.110570 kubelet[2694]: I0312 04:49:21.110256 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5mw7\" (UniqueName: \"kubernetes.io/projected/dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9-kube-api-access-v5mw7\") pod \"coredns-66bc5c9577-wd9b5\" (UID: \"dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9\") " pod="kube-system/coredns-66bc5c9577-wd9b5"
Mar 12 04:49:21.110570 kubelet[2694]: I0312 04:49:21.110316 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gqqp\" (UniqueName: \"kubernetes.io/projected/5847ada8-9805-4f41-8520-9af174205760-kube-api-access-8gqqp\") pod \"calico-apiserver-6fcb6485c6-mlm75\" (UID: \"5847ada8-9805-4f41-8520-9af174205760\") " pod="calico-system/calico-apiserver-6fcb6485c6-mlm75"
Mar 12 04:49:21.110570 kubelet[2694]: I0312 04:49:21.110346 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d76263c-902f-4005-a7fd-9d164ca77df9-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-fn29j\" (UID: \"2d76263c-902f-4005-a7fd-9d164ca77df9\") " pod="calico-system/goldmane-cccfbd5cf-fn29j"
Mar 12 04:49:21.110570 kubelet[2694]: I0312 04:49:21.110374 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-nginx-config\") pod \"whisker-6c56c56549-c7w56\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " pod="calico-system/whisker-6c56c56549-c7w56"
Mar 12 04:49:21.110814 kubelet[2694]: I0312 04:49:21.110424 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-ca-bundle\") pod \"whisker-6c56c56549-c7w56\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " pod="calico-system/whisker-6c56c56549-c7w56"
Mar 12 04:49:21.118709 containerd[1510]: time="2026-03-12T04:49:21.118249168Z" level=info msg="CreateContainer within sandbox \"02b422c290978330f6149a72d111a38271d2d403c669e653fce6e99fc37e3f82\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cf01efacb05384b922df5f15da45ba93bb3a162a3789a53c1e10458be67b3402\""
Mar 12 04:49:21.121928 containerd[1510]: time="2026-03-12T04:49:21.119886240Z" level=info msg="StartContainer for \"cf01efacb05384b922df5f15da45ba93bb3a162a3789a53c1e10458be67b3402\""
Mar 12 04:49:21.127493 systemd[1]: Created slice kubepods-besteffort-pod5847ada8_9805_4f41_8520_9af174205760.slice - libcontainer container kubepods-besteffort-pod5847ada8_9805_4f41_8520_9af174205760.slice.
Mar 12 04:49:21.145210 systemd[1]: Created slice kubepods-besteffort-pod020ffbbb_c522_41bc_be16_1670cd72b7a8.slice - libcontainer container kubepods-besteffort-pod020ffbbb_c522_41bc_be16_1670cd72b7a8.slice.
Mar 12 04:49:21.158483 systemd[1]: Created slice kubepods-besteffort-pod2e788078_9f19_4645_87ab_03575ffe2f01.slice - libcontainer container kubepods-besteffort-pod2e788078_9f19_4645_87ab_03575ffe2f01.slice.
Mar 12 04:49:21.177656 systemd[1]: Created slice kubepods-besteffort-pod2d76263c_902f_4005_a7fd_9d164ca77df9.slice - libcontainer container kubepods-besteffort-pod2d76263c_902f_4005_a7fd_9d164ca77df9.slice.
Mar 12 04:49:21.219286 systemd[1]: Started cri-containerd-cf01efacb05384b922df5f15da45ba93bb3a162a3789a53c1e10458be67b3402.scope - libcontainer container cf01efacb05384b922df5f15da45ba93bb3a162a3789a53c1e10458be67b3402.
Mar 12 04:49:21.373578 containerd[1510]: time="2026-03-12T04:49:21.373371577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wd9b5,Uid:dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9,Namespace:kube-system,Attempt:0,}"
Mar 12 04:49:21.385226 containerd[1510]: time="2026-03-12T04:49:21.385167184Z" level=info msg="StartContainer for \"cf01efacb05384b922df5f15da45ba93bb3a162a3789a53c1e10458be67b3402\" returns successfully"
Mar 12 04:49:21.402116 containerd[1510]: time="2026-03-12T04:49:21.401449031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98bbf8757-vp827,Uid:5d03062c-e112-45ee-991a-aaa27fc24b6c,Namespace:calico-system,Attempt:0,}"
Mar 12 04:49:21.432322 containerd[1510]: time="2026-03-12T04:49:21.431773192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-tpm4s,Uid:b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8,Namespace:kube-system,Attempt:0,}"
Mar 12 04:49:21.451855 containerd[1510]: time="2026-03-12T04:49:21.451188834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-mlm75,Uid:5847ada8-9805-4f41-8520-9af174205760,Namespace:calico-system,Attempt:0,}"
Mar 12 04:49:21.458293 containerd[1510]: time="2026-03-12T04:49:21.458247864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c56c56549-c7w56,Uid:020ffbbb-c522-41bc-be16-1670cd72b7a8,Namespace:calico-system,Attempt:0,}"
Mar 12 04:49:21.491201 containerd[1510]: time="2026-03-12T04:49:21.491149827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-4gx67,Uid:2e788078-9f19-4645-87ab-03575ffe2f01,Namespace:calico-system,Attempt:0,}"
Mar 12 04:49:21.500367 containerd[1510]: time="2026-03-12T04:49:21.500316517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fn29j,Uid:2d76263c-902f-4005-a7fd-9d164ca77df9,Namespace:calico-system,Attempt:0,}"
Mar 12 04:49:22.103780 kubelet[2694]: I0312 04:49:22.103672 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hk2n4" podStartSLOduration=3.011293049 podStartE2EDuration="25.088112422s" podCreationTimestamp="2026-03-12 04:48:57 +0000 UTC" firstStartedPulling="2026-03-12 04:48:57.58707723 +0000 UTC m=+24.076015546" lastFinishedPulling="2026-03-12 04:49:19.66389659 +0000 UTC m=+46.152834919" observedRunningTime="2026-03-12 04:49:22.087201173 +0000 UTC m=+48.576139515" watchObservedRunningTime="2026-03-12 04:49:22.088112422 +0000 UTC m=+48.577050752"
Mar 12 04:49:22.240214 containerd[1510]: time="2026-03-12T04:49:22.239088046Z" level=error msg="Failed to destroy network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.240214 containerd[1510]: time="2026-03-12T04:49:22.240024085Z" level=error msg="Failed to destroy network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.244397 containerd[1510]: time="2026-03-12T04:49:22.244342271Z" level=error msg="Failed to destroy network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.247748 containerd[1510]: time="2026-03-12T04:49:22.244964739Z" level=error msg="encountered an error cleaning up failed sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.247978 containerd[1510]: time="2026-03-12T04:49:22.247928283Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c56c56549-c7w56,Uid:020ffbbb-c522-41bc-be16-1670cd72b7a8,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.250003 containerd[1510]: time="2026-03-12T04:49:22.247335893Z" level=error msg="Failed to destroy network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.249883 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13-shm.mount: Deactivated successfully.
Mar 12 04:49:22.255216 containerd[1510]: time="2026-03-12T04:49:22.254740931Z" level=error msg="encountered an error cleaning up failed sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.255216 containerd[1510]: time="2026-03-12T04:49:22.254861498Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fn29j,Uid:2d76263c-902f-4005-a7fd-9d164ca77df9,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.262708 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6-shm.mount: Deactivated successfully.
Mar 12 04:49:22.262890 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159-shm.mount: Deactivated successfully.
Mar 12 04:49:22.270431 containerd[1510]: time="2026-03-12T04:49:22.270378574Z" level=error msg="encountered an error cleaning up failed sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.270642 containerd[1510]: time="2026-03-12T04:49:22.270593308Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-4gx67,Uid:2e788078-9f19-4645-87ab-03575ffe2f01,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.273106 containerd[1510]: time="2026-03-12T04:49:22.271766928Z" level=error msg="encountered an error cleaning up failed sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.273609 containerd[1510]: time="2026-03-12T04:49:22.273295658Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wd9b5,Uid:dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.279759 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf-shm.mount: Deactivated successfully.
Mar 12 04:49:22.284862 kubelet[2694]: E0312 04:49:22.283265 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.284862 kubelet[2694]: E0312 04:49:22.284204 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.284862 kubelet[2694]: E0312 04:49:22.284397 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.284862 kubelet[2694]: E0312 04:49:22.284484 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 12 04:49:22.300995 kubelet[2694]: E0312
04:49:22.300905 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c56c56549-c7w56" Mar 12 04:49:22.301933 kubelet[2694]: E0312 04:49:22.301281 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c56c56549-c7w56" Mar 12 04:49:22.301933 kubelet[2694]: E0312 04:49:22.300974 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-wd9b5" Mar 12 04:49:22.301933 kubelet[2694]: E0312 04:49:22.301424 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6c56c56549-c7w56_calico-system(020ffbbb-c522-41bc-be16-1670cd72b7a8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6c56c56549-c7w56_calico-system(020ffbbb-c522-41bc-be16-1670cd72b7a8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\\\": plugin type=\\\"calico\\\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6c56c56549-c7w56" podUID="020ffbbb-c522-41bc-be16-1670cd72b7a8" Mar 12 04:49:22.302303 kubelet[2694]: E0312 04:49:22.301099 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6fcb6485c6-4gx67" Mar 12 04:49:22.302303 kubelet[2694]: E0312 04:49:22.301515 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6fcb6485c6-4gx67" Mar 12 04:49:22.302303 kubelet[2694]: E0312 04:49:22.301598 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6fcb6485c6-4gx67_calico-system(2e788078-9f19-4645-87ab-03575ffe2f01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6fcb6485c6-4gx67_calico-system(2e788078-9f19-4645-87ab-03575ffe2f01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-apiserver-6fcb6485c6-4gx67" podUID="2e788078-9f19-4645-87ab-03575ffe2f01" Mar 12 04:49:22.302509 kubelet[2694]: E0312 04:49:22.301454 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-wd9b5" Mar 12 04:49:22.302509 kubelet[2694]: E0312 04:49:22.301672 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-wd9b5_kube-system(dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-wd9b5_kube-system(dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-wd9b5" podUID="dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9" Mar 12 04:49:22.302509 kubelet[2694]: E0312 04:49:22.301175 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-fn29j" Mar 12 04:49:22.302684 kubelet[2694]: E0312 04:49:22.301725 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for 
pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-fn29j" Mar 12 04:49:22.302684 kubelet[2694]: E0312 04:49:22.301775 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-fn29j_calico-system(2d76263c-902f-4005-a7fd-9d164ca77df9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-fn29j_calico-system(2d76263c-902f-4005-a7fd-9d164ca77df9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-fn29j" podUID="2d76263c-902f-4005-a7fd-9d164ca77df9" Mar 12 04:49:22.331532 containerd[1510]: time="2026-03-12T04:49:22.329689397Z" level=error msg="Failed to destroy network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.332361 containerd[1510]: time="2026-03-12T04:49:22.332305205Z" level=error msg="encountered an error cleaning up failed sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 12 04:49:22.333217 containerd[1510]: time="2026-03-12T04:49:22.332504015Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-mlm75,Uid:5847ada8-9805-4f41-8520-9af174205760,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.334685 kubelet[2694]: E0312 04:49:22.334420 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.334685 kubelet[2694]: E0312 04:49:22.334494 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6fcb6485c6-mlm75" Mar 12 04:49:22.334685 kubelet[2694]: E0312 04:49:22.334526 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-6fcb6485c6-mlm75" Mar 12 04:49:22.334898 kubelet[2694]: E0312 04:49:22.334616 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6fcb6485c6-mlm75_calico-system(5847ada8-9805-4f41-8520-9af174205760)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6fcb6485c6-mlm75_calico-system(5847ada8-9805-4f41-8520-9af174205760)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6fcb6485c6-mlm75" podUID="5847ada8-9805-4f41-8520-9af174205760" Mar 12 04:49:22.337858 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117-shm.mount: Deactivated successfully. 
Mar 12 04:49:22.347075 containerd[1510]: time="2026-03-12T04:49:22.346548748Z" level=error msg="Failed to destroy network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.347836 containerd[1510]: time="2026-03-12T04:49:22.346899233Z" level=error msg="Failed to destroy network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.349738 containerd[1510]: time="2026-03-12T04:49:22.349519393Z" level=error msg="encountered an error cleaning up failed sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.349738 containerd[1510]: time="2026-03-12T04:49:22.349609849Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98bbf8757-vp827,Uid:5d03062c-e112-45ee-991a-aaa27fc24b6c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.350658 containerd[1510]: time="2026-03-12T04:49:22.350236190Z" level=error msg="encountered an error cleaning up failed sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\", 
marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.350658 containerd[1510]: time="2026-03-12T04:49:22.350299165Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-tpm4s,Uid:b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.350802 kubelet[2694]: E0312 04:49:22.350295 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.350802 kubelet[2694]: E0312 04:49:22.350393 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-98bbf8757-vp827" Mar 12 04:49:22.350802 kubelet[2694]: E0312 04:49:22.350436 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-98bbf8757-vp827" Mar 12 04:49:22.350989 kubelet[2694]: E0312 04:49:22.350535 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-98bbf8757-vp827_calico-system(5d03062c-e112-45ee-991a-aaa27fc24b6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-98bbf8757-vp827_calico-system(5d03062c-e112-45ee-991a-aaa27fc24b6c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-98bbf8757-vp827" podUID="5d03062c-e112-45ee-991a-aaa27fc24b6c" Mar 12 04:49:22.350989 kubelet[2694]: E0312 04:49:22.350664 2694 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:49:22.350989 kubelet[2694]: E0312 04:49:22.350700 2694 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-66bc5c9577-tpm4s" Mar 12 04:49:22.352309 kubelet[2694]: E0312 04:49:22.350727 2694 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-tpm4s" Mar 12 04:49:22.352309 kubelet[2694]: E0312 04:49:22.350773 2694 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-tpm4s_kube-system(b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-tpm4s_kube-system(b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-tpm4s" podUID="b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8" Mar 12 04:49:22.740224 systemd[1]: Created slice kubepods-besteffort-podfa7a451e_0708_48d2_97cd_fdb82e83df3d.slice - libcontainer container kubepods-besteffort-podfa7a451e_0708_48d2_97cd_fdb82e83df3d.slice. Mar 12 04:49:22.747370 containerd[1510]: time="2026-03-12T04:49:22.747314303Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xdpsf,Uid:fa7a451e-0708-48d2-97cd-fdb82e83df3d,Namespace:calico-system,Attempt:0,}" Mar 12 04:49:22.862511 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8-shm.mount: Deactivated successfully. 
Mar 12 04:49:22.862941 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9-shm.mount: Deactivated successfully. Mar 12 04:49:23.022386 systemd-networkd[1425]: cali4fea10897ea: Link UP Mar 12 04:49:23.024309 systemd-networkd[1425]: cali4fea10897ea: Gained carrier Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.792 [ERROR][3826] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.840 [INFO][3826] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0 csi-node-driver- calico-system fa7a451e-0708-48d2-97cd-fdb82e83df3d 743 0 2026-03-12 04:48:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com csi-node-driver-xdpsf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4fea10897ea [] [] }} ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.840 [INFO][3826] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" Mar 12 04:49:23.044312 
containerd[1510]: 2026-03-12 04:49:22.914 [INFO][3837] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" HandleID="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Workload="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.925 [INFO][3837] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" HandleID="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Workload="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005e8350), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"csi-node-driver-xdpsf", "timestamp":"2026-03-12 04:49:22.914226378 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002dcf20)} Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.925 [INFO][3837] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.925 [INFO][3837] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.925 [INFO][3837] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.930 [INFO][3837] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.942 [INFO][3837] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.948 [INFO][3837] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.951 [INFO][3837] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.954 [INFO][3837] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.954 [INFO][3837] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.957 [INFO][3837] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.964 [INFO][3837] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 handle="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.972 [INFO][3837] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.129/26] block=192.168.54.128/26 handle="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.972 [INFO][3837] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.129/26] handle="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.972 [INFO][3837] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:23.044312 containerd[1510]: 2026-03-12 04:49:22.972 [INFO][3837] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.129/26] IPv6=[] ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" HandleID="k8s-pod-network.20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Workload="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" Mar 12 04:49:23.048339 containerd[1510]: 2026-03-12 04:49:22.977 [INFO][3826] cni-plugin/k8s.go 418: Populated endpoint ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fa7a451e-0708-48d2-97cd-fdb82e83df3d", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-xdpsf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.54.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4fea10897ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:23.048339 containerd[1510]: 2026-03-12 04:49:22.978 [INFO][3826] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.129/32] ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" Mar 12 04:49:23.048339 containerd[1510]: 2026-03-12 04:49:22.978 [INFO][3826] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4fea10897ea ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" Mar 12 04:49:23.048339 containerd[1510]: 2026-03-12 04:49:23.021 [INFO][3826] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" 
Mar 12 04:49:23.048339 containerd[1510]: 2026-03-12 04:49:23.022 [INFO][3826] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fa7a451e-0708-48d2-97cd-fdb82e83df3d", ResourceVersion:"743", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a", Pod:"csi-node-driver-xdpsf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.54.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4fea10897ea", MAC:"d6:71:ce:2e:70:ab", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 
04:49:23.048339 containerd[1510]: 2026-03-12 04:49:23.038 [INFO][3826] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a" Namespace="calico-system" Pod="csi-node-driver-xdpsf" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-csi--node--driver--xdpsf-eth0" Mar 12 04:49:23.067914 kubelet[2694]: I0312 04:49:23.067549 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:23.069986 kubelet[2694]: I0312 04:49:23.069845 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:23.082120 kubelet[2694]: I0312 04:49:23.080921 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:23.086332 kubelet[2694]: I0312 04:49:23.086302 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:23.092963 kubelet[2694]: I0312 04:49:23.092791 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:23.099402 kubelet[2694]: I0312 04:49:23.099243 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:23.112065 kubelet[2694]: I0312 04:49:23.109426 2694 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:23.164007 containerd[1510]: time="2026-03-12T04:49:23.135589800Z" level=info msg="loading plugin 
\"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:23.164007 containerd[1510]: time="2026-03-12T04:49:23.135756836Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:23.164007 containerd[1510]: time="2026-03-12T04:49:23.135796510Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:23.164007 containerd[1510]: time="2026-03-12T04:49:23.135947234Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:23.184106 containerd[1510]: time="2026-03-12T04:49:23.179861129Z" level=info msg="StopPodSandbox for \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\"" Mar 12 04:49:23.196771 containerd[1510]: time="2026-03-12T04:49:23.195677599Z" level=info msg="StopPodSandbox for \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\"" Mar 12 04:49:23.196771 containerd[1510]: time="2026-03-12T04:49:23.195992111Z" level=info msg="Ensure that sandbox 52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9 in task-service has been cleanup successfully" Mar 12 04:49:23.201620 containerd[1510]: time="2026-03-12T04:49:23.201158854Z" level=info msg="StopPodSandbox for \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\"" Mar 12 04:49:23.201620 containerd[1510]: time="2026-03-12T04:49:23.201460199Z" level=info msg="Ensure that sandbox 55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf in task-service has been cleanup successfully" Mar 12 04:49:23.205142 containerd[1510]: time="2026-03-12T04:49:23.205102047Z" level=info msg="StopPodSandbox for \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\"" Mar 12 04:49:23.212511 systemd[1]: 
run-containerd-runc-k8s.io-20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a-runc.MQHp5Q.mount: Deactivated successfully. Mar 12 04:49:23.214505 containerd[1510]: time="2026-03-12T04:49:23.205325494Z" level=info msg="StopPodSandbox for \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\"" Mar 12 04:49:23.216565 containerd[1510]: time="2026-03-12T04:49:23.215513048Z" level=info msg="Ensure that sandbox 993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117 in task-service has been cleanup successfully" Mar 12 04:49:23.219792 containerd[1510]: time="2026-03-12T04:49:23.210109016Z" level=info msg="StopPodSandbox for \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\"" Mar 12 04:49:23.219792 containerd[1510]: time="2026-03-12T04:49:23.219380434Z" level=info msg="Ensure that sandbox 4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13 in task-service has been cleanup successfully" Mar 12 04:49:23.221500 containerd[1510]: time="2026-03-12T04:49:23.210147727Z" level=info msg="StopPodSandbox for \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\"" Mar 12 04:49:23.221808 containerd[1510]: time="2026-03-12T04:49:23.221777422Z" level=info msg="Ensure that sandbox 16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159 in task-service has been cleanup successfully" Mar 12 04:49:23.223420 systemd[1]: Started cri-containerd-20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a.scope - libcontainer container 20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a. 
Mar 12 04:49:23.224566 containerd[1510]: time="2026-03-12T04:49:23.210207334Z" level=info msg="Ensure that sandbox 8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8 in task-service has been cleanup successfully" Mar 12 04:49:23.240668 containerd[1510]: time="2026-03-12T04:49:23.237369835Z" level=info msg="Ensure that sandbox 447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6 in task-service has been cleanup successfully" Mar 12 04:49:23.435253 containerd[1510]: time="2026-03-12T04:49:23.435167800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xdpsf,Uid:fa7a451e-0708-48d2-97cd-fdb82e83df3d,Namespace:calico-system,Attempt:0,} returns sandbox id \"20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a\"" Mar 12 04:49:23.458114 containerd[1510]: time="2026-03-12T04:49:23.458054839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.378 [INFO][3942] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.378 [INFO][3942] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" iface="eth0" netns="/var/run/netns/cni-cbf38194-43c8-c524-6c4c-ad93037ff1ab" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.384 [INFO][3942] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" iface="eth0" netns="/var/run/netns/cni-cbf38194-43c8-c524-6c4c-ad93037ff1ab" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.386 [INFO][3942] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" iface="eth0" netns="/var/run/netns/cni-cbf38194-43c8-c524-6c4c-ad93037ff1ab" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.386 [INFO][3942] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.386 [INFO][3942] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.612 [INFO][3986] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.612 [INFO][3986] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.612 [INFO][3986] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.632 [WARNING][3986] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.633 [INFO][3986] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.642 [INFO][3986] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:23.680562 containerd[1510]: 2026-03-12 04:49:23.663 [INFO][3942] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:23.684482 containerd[1510]: time="2026-03-12T04:49:23.683883617Z" level=info msg="TearDown network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\" successfully" Mar 12 04:49:23.684913 containerd[1510]: time="2026-03-12T04:49:23.684611697Z" level=info msg="StopPodSandbox for \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\" returns successfully" Mar 12 04:49:23.691851 containerd[1510]: time="2026-03-12T04:49:23.690932225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wd9b5,Uid:dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9,Namespace:kube-system,Attempt:1,}" Mar 12 04:49:23.864635 systemd[1]: run-containerd-runc-k8s.io-cf01efacb05384b922df5f15da45ba93bb3a162a3789a53c1e10458be67b3402-runc.fCtPBw.mount: Deactivated successfully. Mar 12 04:49:23.867684 systemd[1]: run-netns-cni\x2dcbf38194\x2d43c8\x2dc524\x2d6c4c\x2dad93037ff1ab.mount: Deactivated successfully. 
Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.566 [INFO][3957] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.568 [INFO][3957] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" iface="eth0" netns="/var/run/netns/cni-e19690d2-6ef0-b8e0-d546-4fc3e424cc3b" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.568 [INFO][3957] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" iface="eth0" netns="/var/run/netns/cni-e19690d2-6ef0-b8e0-d546-4fc3e424cc3b" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.569 [INFO][3957] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" iface="eth0" netns="/var/run/netns/cni-e19690d2-6ef0-b8e0-d546-4fc3e424cc3b" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.569 [INFO][3957] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.569 [INFO][3957] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.974 [INFO][4010] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 
04:49:23.974 [INFO][4010] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:23.974 [INFO][4010] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:24.044 [WARNING][4010] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:24.044 [INFO][4010] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:24.052 [INFO][4010] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:24.081927 containerd[1510]: 2026-03-12 04:49:24.063 [INFO][3957] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:24.088082 containerd[1510]: time="2026-03-12T04:49:24.086324893Z" level=info msg="TearDown network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\" successfully" Mar 12 04:49:24.088082 containerd[1510]: time="2026-03-12T04:49:24.086380408Z" level=info msg="StopPodSandbox for \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\" returns successfully" Mar 12 04:49:24.088892 systemd[1]: run-netns-cni\x2de19690d2\x2d6ef0\x2db8e0\x2dd546\x2d4fc3e424cc3b.mount: Deactivated successfully. 
Mar 12 04:49:24.103294 containerd[1510]: time="2026-03-12T04:49:24.102670381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98bbf8757-vp827,Uid:5d03062c-e112-45ee-991a-aaa27fc24b6c,Namespace:calico-system,Attempt:1,}" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:23.602 [INFO][3953] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:23.602 [INFO][3953] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" iface="eth0" netns="/var/run/netns/cni-03439d94-7967-4335-7b24-c10102d84f27" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:23.603 [INFO][3953] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" iface="eth0" netns="/var/run/netns/cni-03439d94-7967-4335-7b24-c10102d84f27" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:23.608 [INFO][3953] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" iface="eth0" netns="/var/run/netns/cni-03439d94-7967-4335-7b24-c10102d84f27" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:23.608 [INFO][3953] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:23.608 [INFO][3953] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.065 [INFO][4018] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.065 [INFO][4018] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.065 [INFO][4018] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.113 [WARNING][4018] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.113 [INFO][4018] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.119 [INFO][4018] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:24.177009 containerd[1510]: 2026-03-12 04:49:24.141 [INFO][3953] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:24.183485 containerd[1510]: time="2026-03-12T04:49:24.179251405Z" level=info msg="TearDown network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\" successfully" Mar 12 04:49:24.183485 containerd[1510]: time="2026-03-12T04:49:24.179318473Z" level=info msg="StopPodSandbox for \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\" returns successfully" Mar 12 04:49:24.187833 systemd[1]: run-netns-cni\x2d03439d94\x2d7967\x2d4335\x2d7b24\x2dc10102d84f27.mount: Deactivated successfully. Mar 12 04:49:24.283881 systemd-networkd[1425]: cali4fea10897ea: Gained IPv6LL Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:23.728 [INFO][3948] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:23.755 [INFO][3948] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" iface="eth0" netns="/var/run/netns/cni-4ea800c4-74ca-62b0-6e51-4c1be9e62612" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:23.755 [INFO][3948] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" iface="eth0" netns="/var/run/netns/cni-4ea800c4-74ca-62b0-6e51-4c1be9e62612" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:23.756 [INFO][3948] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" iface="eth0" netns="/var/run/netns/cni-4ea800c4-74ca-62b0-6e51-4c1be9e62612" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:23.757 [INFO][3948] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:23.757 [INFO][3948] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.069 [INFO][4054] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.071 [INFO][4054] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.120 [INFO][4054] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.187 [WARNING][4054] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.187 [INFO][4054] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.213 [INFO][4054] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:24.293643 containerd[1510]: 2026-03-12 04:49:24.263 [INFO][3948] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:24.301287 containerd[1510]: time="2026-03-12T04:49:24.293878637Z" level=info msg="TearDown network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\" successfully" Mar 12 04:49:24.301287 containerd[1510]: time="2026-03-12T04:49:24.293936548Z" level=info msg="StopPodSandbox for \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\" returns successfully" Mar 12 04:49:24.322532 containerd[1510]: time="2026-03-12T04:49:24.320025494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-mlm75,Uid:5847ada8-9805-4f41-8520-9af174205760,Namespace:calico-system,Attempt:1,}" Mar 12 04:49:24.359086 kubelet[2694]: I0312 04:49:24.356367 2694 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-ca-bundle\") pod \"020ffbbb-c522-41bc-be16-1670cd72b7a8\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " Mar 12 04:49:24.359086 kubelet[2694]: I0312 04:49:24.356532 2694 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-nginx-config\") pod \"020ffbbb-c522-41bc-be16-1670cd72b7a8\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " Mar 12 04:49:24.359086 kubelet[2694]: I0312 04:49:24.356592 2694 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59gk6\" (UniqueName: \"kubernetes.io/projected/020ffbbb-c522-41bc-be16-1670cd72b7a8-kube-api-access-59gk6\") pod \"020ffbbb-c522-41bc-be16-1670cd72b7a8\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " Mar 12 04:49:24.359086 kubelet[2694]: I0312 04:49:24.356644 2694 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-backend-key-pair\") pod \"020ffbbb-c522-41bc-be16-1670cd72b7a8\" (UID: \"020ffbbb-c522-41bc-be16-1670cd72b7a8\") " Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:23.828 [INFO][3972] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:23.828 [INFO][3972] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" iface="eth0" netns="/var/run/netns/cni-f1ebd437-7e88-c155-7672-3dd2b04e04c1" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:23.828 [INFO][3972] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" iface="eth0" netns="/var/run/netns/cni-f1ebd437-7e88-c155-7672-3dd2b04e04c1" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:23.832 [INFO][3972] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" iface="eth0" netns="/var/run/netns/cni-f1ebd437-7e88-c155-7672-3dd2b04e04c1" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:23.834 [INFO][3972] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:23.835 [INFO][3972] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.268 [INFO][4074] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.270 [INFO][4074] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.271 [INFO][4074] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.327 [WARNING][4074] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.327 [INFO][4074] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.335 [INFO][4074] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:24.377078 containerd[1510]: 2026-03-12 04:49:24.349 [INFO][3972] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:24.379378 containerd[1510]: time="2026-03-12T04:49:24.377883101Z" level=info msg="TearDown network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\" successfully" Mar 12 04:49:24.379378 containerd[1510]: time="2026-03-12T04:49:24.377932375Z" level=info msg="StopPodSandbox for \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\" returns successfully" Mar 12 04:49:24.380489 kubelet[2694]: I0312 04:49:24.371581 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "020ffbbb-c522-41bc-be16-1670cd72b7a8" (UID: "020ffbbb-c522-41bc-be16-1670cd72b7a8"). InnerVolumeSpecName "nginx-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 04:49:24.402883 kubelet[2694]: I0312 04:49:24.373471 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "020ffbbb-c522-41bc-be16-1670cd72b7a8" (UID: "020ffbbb-c522-41bc-be16-1670cd72b7a8"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:23.716 [INFO][3959] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:23.716 [INFO][3959] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" iface="eth0" netns="/var/run/netns/cni-f07924d0-6d73-bd42-8636-5201392e892e" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:23.718 [INFO][3959] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" iface="eth0" netns="/var/run/netns/cni-f07924d0-6d73-bd42-8636-5201392e892e" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:23.718 [INFO][3959] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" iface="eth0" netns="/var/run/netns/cni-f07924d0-6d73-bd42-8636-5201392e892e" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:23.721 [INFO][3959] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:23.721 [INFO][3959] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.293 [INFO][4047] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.296 [INFO][4047] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.335 [INFO][4047] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.374 [WARNING][4047] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.375 [INFO][4047] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.385 [INFO][4047] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:24.416416 containerd[1510]: 2026-03-12 04:49:24.404 [INFO][3959] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:24.429104 containerd[1510]: time="2026-03-12T04:49:24.421968365Z" level=info msg="TearDown network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\" successfully" Mar 12 04:49:24.429259 containerd[1510]: time="2026-03-12T04:49:24.429151460Z" level=info msg="StopPodSandbox for \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\" returns successfully" Mar 12 04:49:24.447806 kubelet[2694]: I0312 04:49:24.447693 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020ffbbb-c522-41bc-be16-1670cd72b7a8-kube-api-access-59gk6" (OuterVolumeSpecName: "kube-api-access-59gk6") pod "020ffbbb-c522-41bc-be16-1670cd72b7a8" (UID: "020ffbbb-c522-41bc-be16-1670cd72b7a8"). InnerVolumeSpecName "kube-api-access-59gk6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 04:49:24.452128 kubelet[2694]: I0312 04:49:24.451109 2694 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "020ffbbb-c522-41bc-be16-1670cd72b7a8" (UID: "020ffbbb-c522-41bc-be16-1670cd72b7a8"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 04:49:24.457352 containerd[1510]: time="2026-03-12T04:49:24.455364218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-tpm4s,Uid:b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8,Namespace:kube-system,Attempt:1,}" Mar 12 04:49:24.488095 kubelet[2694]: I0312 04:49:24.486524 2694 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-ca-bundle\") on node \"srv-1ee83.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:49:24.493171 kubelet[2694]: I0312 04:49:24.488325 2694 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/020ffbbb-c522-41bc-be16-1670cd72b7a8-nginx-config\") on node \"srv-1ee83.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:49:24.493171 kubelet[2694]: I0312 04:49:24.488420 2694 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-59gk6\" (UniqueName: \"kubernetes.io/projected/020ffbbb-c522-41bc-be16-1670cd72b7a8-kube-api-access-59gk6\") on node \"srv-1ee83.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:49:24.493171 kubelet[2694]: I0312 04:49:24.488447 2694 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/020ffbbb-c522-41bc-be16-1670cd72b7a8-whisker-backend-key-pair\") on node \"srv-1ee83.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:49:24.521142 containerd[1510]: 
time="2026-03-12T04:49:24.520648854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-4gx67,Uid:2e788078-9f19-4645-87ab-03575ffe2f01,Namespace:calico-system,Attempt:1,}" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:23.852 [INFO][3975] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:23.861 [INFO][3975] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" iface="eth0" netns="/var/run/netns/cni-36d26fb7-ffb0-7ce3-fb9e-2c57f9ec6e8c" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:23.861 [INFO][3975] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" iface="eth0" netns="/var/run/netns/cni-36d26fb7-ffb0-7ce3-fb9e-2c57f9ec6e8c" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:23.874 [INFO][3975] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" iface="eth0" netns="/var/run/netns/cni-36d26fb7-ffb0-7ce3-fb9e-2c57f9ec6e8c" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:23.874 [INFO][3975] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:23.874 [INFO][3975] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.442 [INFO][4080] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.452 [INFO][4080] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.452 [INFO][4080] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.534 [WARNING][4080] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.541 [INFO][4080] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.548 [INFO][4080] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:24.577201 containerd[1510]: 2026-03-12 04:49:24.559 [INFO][3975] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:24.579488 containerd[1510]: time="2026-03-12T04:49:24.579445807Z" level=info msg="TearDown network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\" successfully" Mar 12 04:49:24.579658 containerd[1510]: time="2026-03-12T04:49:24.579574449Z" level=info msg="StopPodSandbox for \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\" returns successfully" Mar 12 04:49:24.590472 containerd[1510]: time="2026-03-12T04:49:24.589864466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fn29j,Uid:2d76263c-902f-4005-a7fd-9d164ca77df9,Namespace:calico-system,Attempt:1,}" Mar 12 04:49:24.876992 systemd[1]: run-netns-cni\x2d36d26fb7\x2dffb0\x2d7ce3\x2dfb9e\x2d2c57f9ec6e8c.mount: Deactivated successfully. Mar 12 04:49:24.877976 systemd[1]: run-netns-cni\x2df07924d0\x2d6d73\x2dbd42\x2d8636\x2d5201392e892e.mount: Deactivated successfully. 
Mar 12 04:49:24.878119 systemd[1]: run-netns-cni\x2d4ea800c4\x2d74ca\x2d62b0\x2d6e51\x2d4c1be9e62612.mount: Deactivated successfully. Mar 12 04:49:24.878227 systemd[1]: run-netns-cni\x2df1ebd437\x2d7e88\x2dc155\x2d7672\x2d3dd2b04e04c1.mount: Deactivated successfully. Mar 12 04:49:24.878332 systemd[1]: var-lib-kubelet-pods-020ffbbb\x2dc522\x2d41bc\x2dbe16\x2d1670cd72b7a8-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d59gk6.mount: Deactivated successfully. Mar 12 04:49:24.879160 systemd[1]: var-lib-kubelet-pods-020ffbbb\x2dc522\x2d41bc\x2dbe16\x2d1670cd72b7a8-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 12 04:49:24.988262 systemd-networkd[1425]: cali8200a6d65fa: Link UP Mar 12 04:49:24.999378 systemd-networkd[1425]: cali8200a6d65fa: Gained carrier Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.254 [ERROR][4067] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.357 [INFO][4067] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0 coredns-66bc5c9577- kube-system dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9 944 0 2026-03-12 04:48:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com coredns-66bc5c9577-wd9b5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8200a6d65fa [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" 
Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.357 [INFO][4067] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.688 [INFO][4164] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" HandleID="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.730 [INFO][4164] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" HandleID="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000102800), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"coredns-66bc5c9577-wd9b5", "timestamp":"2026-03-12 04:49:24.688255134 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003391e0)} Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.730 [INFO][4164] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.731 [INFO][4164] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.731 [INFO][4164] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.741 [INFO][4164] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.760 [INFO][4164] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.786 [INFO][4164] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.815 [INFO][4164] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.830 [INFO][4164] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.830 [INFO][4164] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.845 [INFO][4164] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7 Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.885 [INFO][4164] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 
handle="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.905 [INFO][4164] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.130/26] block=192.168.54.128/26 handle="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.926 [INFO][4164] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.130/26] handle="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.926 [INFO][4164] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:25.144492 containerd[1510]: 2026-03-12 04:49:24.926 [INFO][4164] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.130/26] IPv6=[] ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" HandleID="k8s-pod-network.b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.146520 containerd[1510]: 2026-03-12 04:49:24.954 [INFO][4067] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9", ResourceVersion:"944", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 
48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"coredns-66bc5c9577-wd9b5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8200a6d65fa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:25.146520 containerd[1510]: 2026-03-12 04:49:24.956 [INFO][4067] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.130/32] ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" 
WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.146520 containerd[1510]: 2026-03-12 04:49:24.957 [INFO][4067] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8200a6d65fa ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.146520 containerd[1510]: 2026-03-12 04:49:24.994 [INFO][4067] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.146520 containerd[1510]: 2026-03-12 04:49:24.998 [INFO][4067] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9", ResourceVersion:"944", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7", Pod:"coredns-66bc5c9577-wd9b5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8200a6d65fa", MAC:"4a:30:c8:07:b6:65", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:25.147025 containerd[1510]: 2026-03-12 04:49:25.092 [INFO][4067] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7" Namespace="kube-system" Pod="coredns-66bc5c9577-wd9b5" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:25.269369 systemd[1]: Removed slice kubepods-besteffort-pod020ffbbb_c522_41bc_be16_1670cd72b7a8.slice - libcontainer container kubepods-besteffort-pod020ffbbb_c522_41bc_be16_1670cd72b7a8.slice. 
Mar 12 04:49:25.378773 systemd-networkd[1425]: calie6cbce439d2: Link UP Mar 12 04:49:25.380640 systemd-networkd[1425]: calie6cbce439d2: Gained carrier Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.524 [ERROR][4130] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.554 [INFO][4130] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0 calico-kube-controllers-98bbf8757- calico-system 5d03062c-e112-45ee-991a-aaa27fc24b6c 946 0 2026-03-12 04:48:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:98bbf8757 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com calico-kube-controllers-98bbf8757-vp827 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie6cbce439d2 [] [] }} ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.554 [INFO][4130] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.872 [INFO][4215] ipam/ipam_plugin.go 235: 
Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" HandleID="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.973 [INFO][4215] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" HandleID="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005f42a0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"calico-kube-controllers-98bbf8757-vp827", "timestamp":"2026-03-12 04:49:24.872840434 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000172000)} Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.982 [INFO][4215] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.982 [INFO][4215] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.982 [INFO][4215] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:24.998 [INFO][4215] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.058 [INFO][4215] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.130 [INFO][4215] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.164 [INFO][4215] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.186 [INFO][4215] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.186 [INFO][4215] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.205 [INFO][4215] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.224 [INFO][4215] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 handle="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.249 [INFO][4215] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.131/26] block=192.168.54.128/26 handle="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.249 [INFO][4215] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.131/26] handle="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.249 [INFO][4215] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:25.492194 containerd[1510]: 2026-03-12 04:49:25.249 [INFO][4215] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.131/26] IPv6=[] ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" HandleID="k8s-pod-network.cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.497591 containerd[1510]: 2026-03-12 04:49:25.288 [INFO][4130] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0", GenerateName:"calico-kube-controllers-98bbf8757-", Namespace:"calico-system", SelfLink:"", UID:"5d03062c-e112-45ee-991a-aaa27fc24b6c", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"98bbf8757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-98bbf8757-vp827", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.54.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6cbce439d2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:25.497591 containerd[1510]: 2026-03-12 04:49:25.288 [INFO][4130] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.131/32] ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.497591 containerd[1510]: 2026-03-12 04:49:25.288 [INFO][4130] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie6cbce439d2 ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.497591 containerd[1510]: 2026-03-12 04:49:25.387 [INFO][4130] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.497591 containerd[1510]: 2026-03-12 04:49:25.389 [INFO][4130] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0", GenerateName:"calico-kube-controllers-98bbf8757-", Namespace:"calico-system", SelfLink:"", UID:"5d03062c-e112-45ee-991a-aaa27fc24b6c", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"98bbf8757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c", Pod:"calico-kube-controllers-98bbf8757-vp827", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.54.131/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6cbce439d2", MAC:"5e:c1:27:f8:70:58", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:25.497591 containerd[1510]: 2026-03-12 04:49:25.458 [INFO][4130] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c" Namespace="calico-system" Pod="calico-kube-controllers-98bbf8757-vp827" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:25.630946 containerd[1510]: time="2026-03-12T04:49:25.628651889Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:25.630946 containerd[1510]: time="2026-03-12T04:49:25.628771338Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:25.630946 containerd[1510]: time="2026-03-12T04:49:25.628791380Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:25.630946 containerd[1510]: time="2026-03-12T04:49:25.628941926Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:25.677288 containerd[1510]: time="2026-03-12T04:49:25.672416799Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:25.693008 containerd[1510]: time="2026-03-12T04:49:25.692752033Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:25.694365 containerd[1510]: time="2026-03-12T04:49:25.694279975Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:25.695099 systemd[1]: Created slice kubepods-besteffort-poda6cbf177_8d80_4028_83be_45013223c63d.slice - libcontainer container kubepods-besteffort-poda6cbf177_8d80_4028_83be_45013223c63d.slice. Mar 12 04:49:25.706009 kubelet[2694]: I0312 04:49:25.704284 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a6cbf177-8d80-4028-83be-45013223c63d-whisker-backend-key-pair\") pod \"whisker-66b7b7bdbb-xpt8w\" (UID: \"a6cbf177-8d80-4028-83be-45013223c63d\") " pod="calico-system/whisker-66b7b7bdbb-xpt8w" Mar 12 04:49:25.706009 kubelet[2694]: I0312 04:49:25.704366 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkvw\" (UniqueName: \"kubernetes.io/projected/a6cbf177-8d80-4028-83be-45013223c63d-kube-api-access-sdkvw\") pod \"whisker-66b7b7bdbb-xpt8w\" (UID: \"a6cbf177-8d80-4028-83be-45013223c63d\") " pod="calico-system/whisker-66b7b7bdbb-xpt8w" Mar 12 04:49:25.706009 kubelet[2694]: I0312 04:49:25.704403 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/a6cbf177-8d80-4028-83be-45013223c63d-nginx-config\") pod \"whisker-66b7b7bdbb-xpt8w\" (UID: \"a6cbf177-8d80-4028-83be-45013223c63d\") " pod="calico-system/whisker-66b7b7bdbb-xpt8w" Mar 12 04:49:25.706009 kubelet[2694]: I0312 04:49:25.704448 2694 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6cbf177-8d80-4028-83be-45013223c63d-whisker-ca-bundle\") pod 
\"whisker-66b7b7bdbb-xpt8w\" (UID: \"a6cbf177-8d80-4028-83be-45013223c63d\") " pod="calico-system/whisker-66b7b7bdbb-xpt8w" Mar 12 04:49:25.711256 containerd[1510]: time="2026-03-12T04:49:25.703385975Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:25.784239 kubelet[2694]: I0312 04:49:25.783960 2694 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ffbbb-c522-41bc-be16-1670cd72b7a8" path="/var/lib/kubelet/pods/020ffbbb-c522-41bc-be16-1670cd72b7a8/volumes" Mar 12 04:49:25.808550 systemd-networkd[1425]: calib00f6146ae1: Link UP Mar 12 04:49:25.844018 systemd-networkd[1425]: calib00f6146ae1: Gained carrier Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:24.661 [ERROR][4171] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:24.710 [INFO][4171] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0 calico-apiserver-6fcb6485c6- calico-system 5847ada8-9805-4f41-8520-9af174205760 949 0 2026-03-12 04:48:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6fcb6485c6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com calico-apiserver-6fcb6485c6-mlm75 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calib00f6146ae1 [] [] }} ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" 
WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:24.710 [INFO][4171] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.072 [INFO][4250] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" HandleID="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.104 [INFO][4250] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" HandleID="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030d1a0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"calico-apiserver-6fcb6485c6-mlm75", "timestamp":"2026-03-12 04:49:25.072609781 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000230580)} Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.106 [INFO][4250] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.263 [INFO][4250] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.263 [INFO][4250] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.283 [INFO][4250] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.363 [INFO][4250] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.410 [INFO][4250] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.449 [INFO][4250] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.553 [INFO][4250] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.553 [INFO][4250] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.574 [INFO][4250] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889 Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.596 [INFO][4250] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 
handle="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.680 [INFO][4250] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.132/26] block=192.168.54.128/26 handle="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.680 [INFO][4250] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.132/26] handle="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.680 [INFO][4250] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:25.974066 containerd[1510]: 2026-03-12 04:49:25.680 [INFO][4250] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.132/26] IPv6=[] ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" HandleID="k8s-pod-network.a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:25.976229 containerd[1510]: 2026-03-12 04:49:25.749 [INFO][4171] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"5847ada8-9805-4f41-8520-9af174205760", ResourceVersion:"949", Generation:0, 
CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-6fcb6485c6-mlm75", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib00f6146ae1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:25.976229 containerd[1510]: 2026-03-12 04:49:25.750 [INFO][4171] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.132/32] ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:25.976229 containerd[1510]: 2026-03-12 04:49:25.750 [INFO][4171] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib00f6146ae1 ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:25.976229 containerd[1510]: 2026-03-12 04:49:25.849 [INFO][4171] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:25.976229 containerd[1510]: 2026-03-12 04:49:25.864 [INFO][4171] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"5847ada8-9805-4f41-8520-9af174205760", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889", Pod:"calico-apiserver-6fcb6485c6-mlm75", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.132/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib00f6146ae1", MAC:"7a:ab:d9:c9:b0:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:25.976229 containerd[1510]: 2026-03-12 04:49:25.967 [INFO][4171] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-mlm75" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:26.000337 systemd[1]: Started cri-containerd-b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7.scope - libcontainer container b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7. Mar 12 04:49:26.005673 systemd[1]: Started cri-containerd-cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c.scope - libcontainer container cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c. Mar 12 04:49:26.018907 containerd[1510]: time="2026-03-12T04:49:26.018821223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-66b7b7bdbb-xpt8w,Uid:a6cbf177-8d80-4028-83be-45013223c63d,Namespace:calico-system,Attempt:0,}" Mar 12 04:49:26.187983 systemd-networkd[1425]: cali195415107d1: Link UP Mar 12 04:49:26.190299 systemd-networkd[1425]: cali195415107d1: Gained carrier Mar 12 04:49:26.339468 systemd-networkd[1425]: cali17c3c54c9e4: Link UP Mar 12 04:49:26.344569 systemd-networkd[1425]: cali17c3c54c9e4: Gained carrier Mar 12 04:49:26.384390 containerd[1510]: time="2026-03-12T04:49:26.349592527Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:26.384390 containerd[1510]: time="2026-03-12T04:49:26.349692237Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:26.384390 containerd[1510]: time="2026-03-12T04:49:26.349742271Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.384390 containerd[1510]: time="2026-03-12T04:49:26.349911577Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.389576 containerd[1510]: time="2026-03-12T04:49:26.386407485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wd9b5,Uid:dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9,Namespace:kube-system,Attempt:1,} returns sandbox id \"b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7\"" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:24.809 [ERROR][4209] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:24.848 [INFO][4209] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0 calico-apiserver-6fcb6485c6- calico-system 2e788078-9f19-4645-87ab-03575ffe2f01 948 0 2026-03-12 04:48:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6fcb6485c6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com calico-apiserver-6fcb6485c6-4gx67 eth0 calico-apiserver [] [] 
[kns.calico-system ksa.calico-system.calico-apiserver] cali195415107d1 [] [] }} ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:24.849 [INFO][4209] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.226 [INFO][4259] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" HandleID="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.291 [INFO][4259] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" HandleID="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f670), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"calico-apiserver-6fcb6485c6-4gx67", "timestamp":"2026-03-12 04:49:25.226265461 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", 
Namespace:(*v1.Namespace)(0xc000470c60)} Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.291 [INFO][4259] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.685 [INFO][4259] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.686 [INFO][4259] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.773 [INFO][4259] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.904 [INFO][4259] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.961 [INFO][4259] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.975 [INFO][4259] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.990 [INFO][4259] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:25.991 [INFO][4259] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:26.005 [INFO][4259] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1 Mar 12 04:49:26.395150 containerd[1510]: 
2026-03-12 04:49:26.050 [INFO][4259] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 handle="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:26.109 [INFO][4259] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.133/26] block=192.168.54.128/26 handle="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:26.109 [INFO][4259] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.133/26] handle="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:26.109 [INFO][4259] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:26.395150 containerd[1510]: 2026-03-12 04:49:26.109 [INFO][4259] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.133/26] IPv6=[] ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" HandleID="k8s-pod-network.4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.403069 containerd[1510]: 2026-03-12 04:49:26.150 [INFO][4209] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", 
Namespace:"calico-system", SelfLink:"", UID:"2e788078-9f19-4645-87ab-03575ffe2f01", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-6fcb6485c6-4gx67", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali195415107d1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:26.403069 containerd[1510]: 2026-03-12 04:49:26.154 [INFO][4209] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.133/32] ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.403069 containerd[1510]: 2026-03-12 04:49:26.155 [INFO][4209] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali195415107d1 ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" 
WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.403069 containerd[1510]: 2026-03-12 04:49:26.193 [INFO][4209] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.403069 containerd[1510]: 2026-03-12 04:49:26.195 [INFO][4209] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"2e788078-9f19-4645-87ab-03575ffe2f01", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", 
ContainerID:"4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1", Pod:"calico-apiserver-6fcb6485c6-4gx67", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali195415107d1", MAC:"26:cb:4b:9c:cb:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:26.403069 containerd[1510]: 2026-03-12 04:49:26.270 [INFO][4209] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1" Namespace="calico-system" Pod="calico-apiserver-6fcb6485c6-4gx67" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:26.407271 containerd[1510]: time="2026-03-12T04:49:26.406743494Z" level=info msg="CreateContainer within sandbox \"b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:25.026 [ERROR][4235] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:25.138 [INFO][4235] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0 goldmane-cccfbd5cf- calico-system 2d76263c-902f-4005-a7fd-9d164ca77df9 951 0 2026-03-12 04:48:55 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s 
srv-1ee83.gb1.brightbox.com goldmane-cccfbd5cf-fn29j eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali17c3c54c9e4 [] [] }} ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:25.138 [INFO][4235] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:25.456 [INFO][4281] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" HandleID="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:25.552 [INFO][4281] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" HandleID="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001227a0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"goldmane-cccfbd5cf-fn29j", "timestamp":"2026-03-12 04:49:25.456641281 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", 
Namespace:(*v1.Namespace)(0xc0003e86e0)} Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:25.552 [INFO][4281] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.109 [INFO][4281] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.109 [INFO][4281] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.117 [INFO][4281] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.128 [INFO][4281] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.141 [INFO][4281] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.148 [INFO][4281] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.157 [INFO][4281] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.157 [INFO][4281] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.167 [INFO][4281] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a Mar 12 04:49:26.431877 containerd[1510]: 
2026-03-12 04:49:26.211 [INFO][4281] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 handle="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.240 [INFO][4281] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.134/26] block=192.168.54.128/26 handle="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.241 [INFO][4281] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.134/26] handle="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.244 [INFO][4281] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:26.431877 containerd[1510]: 2026-03-12 04:49:26.249 [INFO][4281] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.134/26] IPv6=[] ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" HandleID="k8s-pod-network.780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.437333 containerd[1510]: 2026-03-12 04:49:26.294 [INFO][4235] cni-plugin/k8s.go 418: Populated endpoint ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", 
UID:"2d76263c-902f-4005-a7fd-9d164ca77df9", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-cccfbd5cf-fn29j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.54.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali17c3c54c9e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:26.437333 containerd[1510]: 2026-03-12 04:49:26.294 [INFO][4235] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.134/32] ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.437333 containerd[1510]: 2026-03-12 04:49:26.294 [INFO][4235] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali17c3c54c9e4 ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.437333 containerd[1510]: 2026-03-12 04:49:26.362 [INFO][4235] cni-plugin/dataplane_linux.go 508: 
Disabling IPv4 forwarding ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.437333 containerd[1510]: 2026-03-12 04:49:26.366 [INFO][4235] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2d76263c-902f-4005-a7fd-9d164ca77df9", ResourceVersion:"951", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a", Pod:"goldmane-cccfbd5cf-fn29j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.54.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali17c3c54c9e4", 
MAC:"12:83:53:3b:1b:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:26.437333 containerd[1510]: 2026-03-12 04:49:26.424 [INFO][4235] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-fn29j" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:26.512285 containerd[1510]: time="2026-03-12T04:49:26.511573214Z" level=info msg="CreateContainer within sandbox \"b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ce0cf992913ad860954def8c85eaaf1ca6d617fcdc72174b82f713551a12bcf3\"" Mar 12 04:49:26.531488 containerd[1510]: time="2026-03-12T04:49:26.531289650Z" level=info msg="StartContainer for \"ce0cf992913ad860954def8c85eaaf1ca6d617fcdc72174b82f713551a12bcf3\"" Mar 12 04:49:26.571468 systemd[1]: Started cri-containerd-a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889.scope - libcontainer container a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889. Mar 12 04:49:26.593385 systemd-networkd[1425]: cali294cc709920: Link UP Mar 12 04:49:26.598797 systemd-networkd[1425]: cali294cc709920: Gained carrier Mar 12 04:49:26.633160 containerd[1510]: time="2026-03-12T04:49:26.629374438Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:26.633160 containerd[1510]: time="2026-03-12T04:49:26.629523506Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:26.633160 containerd[1510]: time="2026-03-12T04:49:26.629550889Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.633160 containerd[1510]: time="2026-03-12T04:49:26.629707986Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.635377 containerd[1510]: time="2026-03-12T04:49:26.634757025Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:26.635377 containerd[1510]: time="2026-03-12T04:49:26.634845711Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:26.635377 containerd[1510]: time="2026-03-12T04:49:26.634901452Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.636597 containerd[1510]: time="2026-03-12T04:49:26.635470249Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:24.864 [ERROR][4216] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:25.002 [INFO][4216] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0 coredns-66bc5c9577- kube-system b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8 950 0 2026-03-12 04:48:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com coredns-66bc5c9577-tpm4s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali294cc709920 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:25.003 [INFO][4216] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:25.441 [INFO][4274] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" 
HandleID="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:25.554 [INFO][4274] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" HandleID="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000122f90), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"coredns-66bc5c9577-tpm4s", "timestamp":"2026-03-12 04:49:25.441189388 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000310420)} Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:25.555 [INFO][4274] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.242 [INFO][4274] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.242 [INFO][4274] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.259 [INFO][4274] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.351 [INFO][4274] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.406 [INFO][4274] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.435 [INFO][4274] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.460 [INFO][4274] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.460 [INFO][4274] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.473 [INFO][4274] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392 Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.497 [INFO][4274] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 handle="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.533 [INFO][4274] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.135/26] block=192.168.54.128/26 handle="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.539 [INFO][4274] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.135/26] handle="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.539 [INFO][4274] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:26.681336 containerd[1510]: 2026-03-12 04:49:26.539 [INFO][4274] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.135/26] IPv6=[] ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" HandleID="k8s-pod-network.f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.684528 containerd[1510]: 2026-03-12 04:49:26.558 [INFO][4216] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8", ResourceVersion:"950", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"coredns-66bc5c9577-tpm4s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali294cc709920", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:26.684528 containerd[1510]: 2026-03-12 04:49:26.559 [INFO][4216] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.135/32] ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.684528 containerd[1510]: 2026-03-12 04:49:26.559 [INFO][4216] cni-plugin/dataplane_linux.go 69: Setting 
the host side veth name to cali294cc709920 ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.684528 containerd[1510]: 2026-03-12 04:49:26.606 [INFO][4216] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.684528 containerd[1510]: 2026-03-12 04:49:26.618 [INFO][4216] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8", ResourceVersion:"950", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", 
ContainerID:"f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392", Pod:"coredns-66bc5c9577-tpm4s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali294cc709920", MAC:"be:a1:d7:9e:52:7c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:26.684888 containerd[1510]: 2026-03-12 04:49:26.661 [INFO][4216] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392" Namespace="kube-system" Pod="coredns-66bc5c9577-tpm4s" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:26.754288 containerd[1510]: time="2026-03-12T04:49:26.753244227Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98bbf8757-vp827,Uid:5d03062c-e112-45ee-991a-aaa27fc24b6c,Namespace:calico-system,Attempt:1,} returns sandbox id \"cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c\"" Mar 12 04:49:26.772360 systemd-networkd[1425]: cali8200a6d65fa: Gained IPv6LL Mar 
12 04:49:26.822455 systemd[1]: Started cri-containerd-780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a.scope - libcontainer container 780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a. Mar 12 04:49:26.877710 systemd[1]: Started cri-containerd-4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1.scope - libcontainer container 4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1. Mar 12 04:49:26.981600 containerd[1510]: time="2026-03-12T04:49:26.979572856Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:26.981600 containerd[1510]: time="2026-03-12T04:49:26.979685345Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:26.981600 containerd[1510]: time="2026-03-12T04:49:26.979731174Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:26.981600 containerd[1510]: time="2026-03-12T04:49:26.979933164Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:27.060736 containerd[1510]: time="2026-03-12T04:49:27.059264861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-mlm75,Uid:5847ada8-9805-4f41-8520-9af174205760,Namespace:calico-system,Attempt:1,} returns sandbox id \"a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889\"" Mar 12 04:49:27.092678 systemd[1]: run-containerd-runc-k8s.io-ce0cf992913ad860954def8c85eaaf1ca6d617fcdc72174b82f713551a12bcf3-runc.PgjzDb.mount: Deactivated successfully. Mar 12 04:49:27.106394 systemd[1]: Started cri-containerd-ce0cf992913ad860954def8c85eaaf1ca6d617fcdc72174b82f713551a12bcf3.scope - libcontainer container ce0cf992913ad860954def8c85eaaf1ca6d617fcdc72174b82f713551a12bcf3. 
Mar 12 04:49:27.159322 systemd[1]: Started cri-containerd-f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392.scope - libcontainer container f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392. Mar 12 04:49:27.249242 containerd[1510]: time="2026-03-12T04:49:27.248978477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6fcb6485c6-4gx67,Uid:2e788078-9f19-4645-87ab-03575ffe2f01,Namespace:calico-system,Attempt:1,} returns sandbox id \"4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1\"" Mar 12 04:49:27.284731 systemd-networkd[1425]: calie6cbce439d2: Gained IPv6LL Mar 12 04:49:27.325733 containerd[1510]: time="2026-03-12T04:49:27.325565450Z" level=info msg="StartContainer for \"ce0cf992913ad860954def8c85eaaf1ca6d617fcdc72174b82f713551a12bcf3\" returns successfully" Mar 12 04:49:27.346250 systemd-networkd[1425]: cali195415107d1: Gained IPv6LL Mar 12 04:49:27.441215 systemd-networkd[1425]: calic12f0ca0dbc: Link UP Mar 12 04:49:27.442760 systemd-networkd[1425]: calic12f0ca0dbc: Gained carrier Mar 12 04:49:27.468876 containerd[1510]: time="2026-03-12T04:49:27.466908819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-tpm4s,Uid:b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8,Namespace:kube-system,Attempt:1,} returns sandbox id \"f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392\"" Mar 12 04:49:27.512762 containerd[1510]: time="2026-03-12T04:49:27.512681217Z" level=info msg="CreateContainer within sandbox \"f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:26.577 [ERROR][4385] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:26.670 [INFO][4385] 
cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0 whisker-66b7b7bdbb- calico-system a6cbf177-8d80-4028-83be-45013223c63d 977 0 2026-03-12 04:49:25 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:66b7b7bdbb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-1ee83.gb1.brightbox.com whisker-66b7b7bdbb-xpt8w eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calic12f0ca0dbc [] [] }} ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:26.670 [INFO][4385] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.205 [INFO][4510] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" HandleID="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.266 [INFO][4510] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" HandleID="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001021b0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-1ee83.gb1.brightbox.com", "pod":"whisker-66b7b7bdbb-xpt8w", "timestamp":"2026-03-12 04:49:27.205674566 +0000 UTC"}, Hostname:"srv-1ee83.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000448b00)} Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.266 [INFO][4510] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.267 [INFO][4510] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.267 [INFO][4510] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-1ee83.gb1.brightbox.com' Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.278 [INFO][4510] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.303 [INFO][4510] ipam/ipam.go 409: Looking up existing affinities for host host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.338 [INFO][4510] ipam/ipam.go 526: Trying affinity for 192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.344 [INFO][4510] ipam/ipam.go 160: Attempting to load block cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.356 [INFO][4510] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.54.128/26 host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 
containerd[1510]: 2026-03-12 04:49:27.359 [INFO][4510] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.54.128/26 handle="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.363 [INFO][4510] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691 Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.384 [INFO][4510] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.54.128/26 handle="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.400 [INFO][4510] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.54.136/26] block=192.168.54.128/26 handle="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.401 [INFO][4510] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.54.136/26] handle="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" host="srv-1ee83.gb1.brightbox.com" Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.401 [INFO][4510] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 04:49:27.529502 containerd[1510]: 2026-03-12 04:49:27.402 [INFO][4510] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.54.136/26] IPv6=[] ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" HandleID="k8s-pod-network.3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.533025 containerd[1510]: 2026-03-12 04:49:27.420 [INFO][4385] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0", GenerateName:"whisker-66b7b7bdbb-", Namespace:"calico-system", SelfLink:"", UID:"a6cbf177-8d80-4028-83be-45013223c63d", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 49, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"66b7b7bdbb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"", Pod:"whisker-66b7b7bdbb-xpt8w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.54.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calic12f0ca0dbc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:27.533025 containerd[1510]: 2026-03-12 04:49:27.420 [INFO][4385] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.54.136/32] ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.533025 containerd[1510]: 2026-03-12 04:49:27.422 [INFO][4385] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic12f0ca0dbc ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.533025 containerd[1510]: 2026-03-12 04:49:27.444 [INFO][4385] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.533025 containerd[1510]: 2026-03-12 04:49:27.451 [INFO][4385] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0", GenerateName:"whisker-66b7b7bdbb-", Namespace:"calico-system", SelfLink:"", 
UID:"a6cbf177-8d80-4028-83be-45013223c63d", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 49, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"66b7b7bdbb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691", Pod:"whisker-66b7b7bdbb-xpt8w", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.54.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic12f0ca0dbc", MAC:"d2:ed:d9:bc:03:5e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:27.533025 containerd[1510]: 2026-03-12 04:49:27.498 [INFO][4385] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691" Namespace="calico-system" Pod="whisker-66b7b7bdbb-xpt8w" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--66b7b7bdbb--xpt8w-eth0" Mar 12 04:49:27.538255 systemd-networkd[1425]: calib00f6146ae1: Gained IPv6LL Mar 12 04:49:27.601948 containerd[1510]: time="2026-03-12T04:49:27.601247344Z" level=info msg="CreateContainer within sandbox \"f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6fbc97d49460f9364f8c32f47aba8128cb0e28ca085c9ef9fcef9b170f0904d8\"" Mar 12 
04:49:27.617074 containerd[1510]: time="2026-03-12T04:49:27.615961530Z" level=info msg="StartContainer for \"6fbc97d49460f9364f8c32f47aba8128cb0e28ca085c9ef9fcef9b170f0904d8\"" Mar 12 04:49:27.630581 containerd[1510]: time="2026-03-12T04:49:27.630516728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-fn29j,Uid:2d76263c-902f-4005-a7fd-9d164ca77df9,Namespace:calico-system,Attempt:1,} returns sandbox id \"780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a\"" Mar 12 04:49:27.694981 containerd[1510]: time="2026-03-12T04:49:27.694865404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 12 04:49:27.716070 containerd[1510]: time="2026-03-12T04:49:27.715970176Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 4.254028494s" Mar 12 04:49:27.719630 containerd[1510]: time="2026-03-12T04:49:27.719594979Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 12 04:49:27.725504 containerd[1510]: time="2026-03-12T04:49:27.716591624Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:49:27.725504 containerd[1510]: time="2026-03-12T04:49:27.716701492Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:49:27.725504 containerd[1510]: time="2026-03-12T04:49:27.716720773Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:27.725504 containerd[1510]: time="2026-03-12T04:49:27.716867554Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:49:27.730775 systemd-networkd[1425]: cali294cc709920: Gained IPv6LL Mar 12 04:49:27.734441 containerd[1510]: time="2026-03-12T04:49:27.734389291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 12 04:49:27.738355 containerd[1510]: time="2026-03-12T04:49:27.737370474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:27.742846 containerd[1510]: time="2026-03-12T04:49:27.742797033Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:27.748450 containerd[1510]: time="2026-03-12T04:49:27.748377019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:27.757412 containerd[1510]: time="2026-03-12T04:49:27.756589288Z" level=info msg="CreateContainer within sandbox \"20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 12 04:49:27.795152 systemd-networkd[1425]: cali17c3c54c9e4: Gained IPv6LL Mar 12 04:49:27.866784 systemd[1]: Started cri-containerd-3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691.scope - libcontainer container 3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691. 
Mar 12 04:49:27.873462 containerd[1510]: time="2026-03-12T04:49:27.872792974Z" level=info msg="CreateContainer within sandbox \"20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"536b6c3bde31af7e6fc22091e92c620c3d83664ecef1732a2fe95f24b8caf0bc\"" Mar 12 04:49:27.877475 systemd[1]: Started cri-containerd-6fbc97d49460f9364f8c32f47aba8128cb0e28ca085c9ef9fcef9b170f0904d8.scope - libcontainer container 6fbc97d49460f9364f8c32f47aba8128cb0e28ca085c9ef9fcef9b170f0904d8. Mar 12 04:49:27.897265 containerd[1510]: time="2026-03-12T04:49:27.896879665Z" level=info msg="StartContainer for \"536b6c3bde31af7e6fc22091e92c620c3d83664ecef1732a2fe95f24b8caf0bc\"" Mar 12 04:49:27.917234 systemd[1]: run-containerd-runc-k8s.io-f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392-runc.omInvU.mount: Deactivated successfully. Mar 12 04:49:28.037824 containerd[1510]: time="2026-03-12T04:49:28.036003441Z" level=info msg="StartContainer for \"6fbc97d49460f9364f8c32f47aba8128cb0e28ca085c9ef9fcef9b170f0904d8\" returns successfully" Mar 12 04:49:28.074373 systemd[1]: Started cri-containerd-536b6c3bde31af7e6fc22091e92c620c3d83664ecef1732a2fe95f24b8caf0bc.scope - libcontainer container 536b6c3bde31af7e6fc22091e92c620c3d83664ecef1732a2fe95f24b8caf0bc. 
Mar 12 04:49:28.241239 containerd[1510]: time="2026-03-12T04:49:28.241109592Z" level=info msg="StartContainer for \"536b6c3bde31af7e6fc22091e92c620c3d83664ecef1732a2fe95f24b8caf0bc\" returns successfully" Mar 12 04:49:28.362568 containerd[1510]: time="2026-03-12T04:49:28.362493887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-66b7b7bdbb-xpt8w,Uid:a6cbf177-8d80-4028-83be-45013223c63d,Namespace:calico-system,Attempt:0,} returns sandbox id \"3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691\"" Mar 12 04:49:28.418447 kubelet[2694]: I0312 04:49:28.418362 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-tpm4s" podStartSLOduration=49.416581546 podStartE2EDuration="49.416581546s" podCreationTimestamp="2026-03-12 04:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:49:28.413208382 +0000 UTC m=+54.902146724" watchObservedRunningTime="2026-03-12 04:49:28.416581546 +0000 UTC m=+54.905519868" Mar 12 04:49:28.492997 kubelet[2694]: I0312 04:49:28.492154 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-wd9b5" podStartSLOduration=49.492112545 podStartE2EDuration="49.492112545s" podCreationTimestamp="2026-03-12 04:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:49:28.461600195 +0000 UTC m=+54.950538543" watchObservedRunningTime="2026-03-12 04:49:28.492112545 +0000 UTC m=+54.981050879" Mar 12 04:49:28.650144 kernel: calico-node[4100]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 12 04:49:29.074436 systemd-networkd[1425]: calic12f0ca0dbc: Gained IPv6LL Mar 12 04:49:29.964875 systemd-networkd[1425]: vxlan.calico: Link UP Mar 12 04:49:29.964889 systemd-networkd[1425]: vxlan.calico: Gained carrier Mar 
12 04:49:31.571531 systemd-networkd[1425]: vxlan.calico: Gained IPv6LL Mar 12 04:49:32.463746 containerd[1510]: time="2026-03-12T04:49:32.463650045Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:32.492332 containerd[1510]: time="2026-03-12T04:49:32.464906969Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 12 04:49:32.508923 containerd[1510]: time="2026-03-12T04:49:32.508817235Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 4.774218334s" Mar 12 04:49:32.508923 containerd[1510]: time="2026-03-12T04:49:32.508909659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 12 04:49:32.509243 containerd[1510]: time="2026-03-12T04:49:32.509103426Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:32.510345 containerd[1510]: time="2026-03-12T04:49:32.510293199Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:32.525211 containerd[1510]: time="2026-03-12T04:49:32.524927789Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 04:49:32.586338 containerd[1510]: 
time="2026-03-12T04:49:32.586064859Z" level=info msg="CreateContainer within sandbox \"cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 12 04:49:32.656533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2314818507.mount: Deactivated successfully. Mar 12 04:49:32.687198 containerd[1510]: time="2026-03-12T04:49:32.687105224Z" level=info msg="CreateContainer within sandbox \"cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"290354f6144cd564845ed5d6e9f271dcc2fe6635c6c109465c6cadc19426e2e5\"" Mar 12 04:49:32.688534 containerd[1510]: time="2026-03-12T04:49:32.688403598Z" level=info msg="StartContainer for \"290354f6144cd564845ed5d6e9f271dcc2fe6635c6c109465c6cadc19426e2e5\"" Mar 12 04:49:32.783760 systemd[1]: Started cri-containerd-290354f6144cd564845ed5d6e9f271dcc2fe6635c6c109465c6cadc19426e2e5.scope - libcontainer container 290354f6144cd564845ed5d6e9f271dcc2fe6635c6c109465c6cadc19426e2e5. 
Mar 12 04:49:32.894060 containerd[1510]: time="2026-03-12T04:49:32.893775451Z" level=info msg="StartContainer for \"290354f6144cd564845ed5d6e9f271dcc2fe6635c6c109465c6cadc19426e2e5\" returns successfully" Mar 12 04:49:33.624073 kubelet[2694]: I0312 04:49:33.623628 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-98bbf8757-vp827" podStartSLOduration=30.865225382 podStartE2EDuration="36.623600955s" podCreationTimestamp="2026-03-12 04:48:57 +0000 UTC" firstStartedPulling="2026-03-12 04:49:26.764439594 +0000 UTC m=+53.253377911" lastFinishedPulling="2026-03-12 04:49:32.522815156 +0000 UTC m=+59.011753484" observedRunningTime="2026-03-12 04:49:33.531604815 +0000 UTC m=+60.020543149" watchObservedRunningTime="2026-03-12 04:49:33.623600955 +0000 UTC m=+60.112539282" Mar 12 04:49:33.745192 containerd[1510]: time="2026-03-12T04:49:33.744313789Z" level=info msg="StopPodSandbox for \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\"" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.132 [WARNING][5010] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.134 [INFO][5010] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.134 [INFO][5010] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" iface="eth0" netns="" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.134 [INFO][5010] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.134 [INFO][5010] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.299 [INFO][5019] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.303 [INFO][5019] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.303 [INFO][5019] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.329 [WARNING][5019] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.329 [INFO][5019] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.331 [INFO][5019] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:34.338373 containerd[1510]: 2026-03-12 04:49:34.335 [INFO][5010] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.341170 containerd[1510]: time="2026-03-12T04:49:34.339996124Z" level=info msg="TearDown network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\" successfully" Mar 12 04:49:34.341170 containerd[1510]: time="2026-03-12T04:49:34.340388152Z" level=info msg="StopPodSandbox for \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\" returns successfully" Mar 12 04:49:34.404105 containerd[1510]: time="2026-03-12T04:49:34.403874980Z" level=info msg="RemovePodSandbox for \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\"" Mar 12 04:49:34.407897 containerd[1510]: time="2026-03-12T04:49:34.407820929Z" level=info msg="Forcibly stopping sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\"" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.482 [WARNING][5034] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" WorkloadEndpoint="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.482 [INFO][5034] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.482 [INFO][5034] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" iface="eth0" netns="" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.482 [INFO][5034] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.482 [INFO][5034] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.539 [INFO][5042] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.540 [INFO][5042] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.540 [INFO][5042] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.580 [WARNING][5042] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.581 [INFO][5042] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" HandleID="k8s-pod-network.4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Workload="srv--1ee83.gb1.brightbox.com-k8s-whisker--6c56c56549--c7w56-eth0" Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.584 [INFO][5042] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:34.597994 containerd[1510]: 2026-03-12 04:49:34.590 [INFO][5034] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13" Mar 12 04:49:34.597994 containerd[1510]: time="2026-03-12T04:49:34.595453568Z" level=info msg="TearDown network for sandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\" successfully" Mar 12 04:49:34.636682 containerd[1510]: time="2026-03-12T04:49:34.635462636Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:34.650685 containerd[1510]: time="2026-03-12T04:49:34.650550418Z" level=info msg="RemovePodSandbox \"4eceaa07510a1912a75db34da08818a0a3fda55f0d095bb214d107279ab2ff13\" returns successfully" Mar 12 04:49:34.659077 containerd[1510]: time="2026-03-12T04:49:34.658838223Z" level=info msg="StopPodSandbox for \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\"" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.753 [WARNING][5056] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8", ResourceVersion:"1021", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392", Pod:"coredns-66bc5c9577-tpm4s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali294cc709920", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.753 [INFO][5056] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.753 [INFO][5056] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" iface="eth0" netns="" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.753 [INFO][5056] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.753 [INFO][5056] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.829 [INFO][5065] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.830 [INFO][5065] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.830 [INFO][5065] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.862 [WARNING][5065] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.867 [INFO][5065] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.871 [INFO][5065] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:34.883607 containerd[1510]: 2026-03-12 04:49:34.877 [INFO][5056] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:34.904993 containerd[1510]: time="2026-03-12T04:49:34.902519788Z" level=info msg="TearDown network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\" successfully" Mar 12 04:49:34.904993 containerd[1510]: time="2026-03-12T04:49:34.902583198Z" level=info msg="StopPodSandbox for \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\" returns successfully" Mar 12 04:49:34.970919 containerd[1510]: time="2026-03-12T04:49:34.970824539Z" level=info msg="RemovePodSandbox for \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\"" Mar 12 04:49:34.971561 containerd[1510]: time="2026-03-12T04:49:34.971434404Z" level=info msg="Forcibly stopping sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\"" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.132 [WARNING][5080] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b9bb1a7d-4427-4ab5-9819-9fb3c34a6da8", ResourceVersion:"1021", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"f3c977fef75827ef0de18ac86f28613cef1ac703b472301e87e43d32bdf6c392", Pod:"coredns-66bc5c9577-tpm4s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali294cc709920", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.133 [INFO][5080] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.133 [INFO][5080] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" iface="eth0" netns="" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.133 [INFO][5080] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.133 [INFO][5080] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.188 [INFO][5087] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.188 [INFO][5087] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.188 [INFO][5087] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.202 [WARNING][5087] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.203 [INFO][5087] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" HandleID="k8s-pod-network.8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--tpm4s-eth0" Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.205 [INFO][5087] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:35.210961 containerd[1510]: 2026-03-12 04:49:35.208 [INFO][5080] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8" Mar 12 04:49:35.212144 containerd[1510]: time="2026-03-12T04:49:35.211013120Z" level=info msg="TearDown network for sandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\" successfully" Mar 12 04:49:35.215640 containerd[1510]: time="2026-03-12T04:49:35.215597570Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:35.215803 containerd[1510]: time="2026-03-12T04:49:35.215712071Z" level=info msg="RemovePodSandbox \"8260722896fa45afd25b2d52de71fabddc3dce9242930b1560083a653d8433d8\" returns successfully" Mar 12 04:49:35.216733 containerd[1510]: time="2026-03-12T04:49:35.216675230Z" level=info msg="StopPodSandbox for \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\"" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.294 [WARNING][5102] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2d76263c-902f-4005-a7fd-9d164ca77df9", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a", Pod:"goldmane-cccfbd5cf-fn29j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.54.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali17c3c54c9e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.296 [INFO][5102] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.296 [INFO][5102] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" iface="eth0" netns="" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.296 [INFO][5102] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.296 [INFO][5102] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.338 [INFO][5109] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.338 [INFO][5109] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.338 [INFO][5109] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.367 [WARNING][5109] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.367 [INFO][5109] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.371 [INFO][5109] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:35.378564 containerd[1510]: 2026-03-12 04:49:35.374 [INFO][5102] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.378564 containerd[1510]: time="2026-03-12T04:49:35.378236816Z" level=info msg="TearDown network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\" successfully" Mar 12 04:49:35.378564 containerd[1510]: time="2026-03-12T04:49:35.378309298Z" level=info msg="StopPodSandbox for \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\" returns successfully" Mar 12 04:49:35.382362 containerd[1510]: time="2026-03-12T04:49:35.379674282Z" level=info msg="RemovePodSandbox for \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\"" Mar 12 04:49:35.382362 containerd[1510]: time="2026-03-12T04:49:35.379815990Z" level=info msg="Forcibly stopping sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\"" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.457 [WARNING][5124] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"2d76263c-902f-4005-a7fd-9d164ca77df9", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a", Pod:"goldmane-cccfbd5cf-fn29j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.54.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali17c3c54c9e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.458 [INFO][5124] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.458 [INFO][5124] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" iface="eth0" netns="" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.458 [INFO][5124] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.459 [INFO][5124] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.533 [INFO][5131] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.533 [INFO][5131] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.534 [INFO][5131] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.549 [WARNING][5131] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.549 [INFO][5131] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" HandleID="k8s-pod-network.447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Workload="srv--1ee83.gb1.brightbox.com-k8s-goldmane--cccfbd5cf--fn29j-eth0" Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.553 [INFO][5131] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:35.576659 containerd[1510]: 2026-03-12 04:49:35.568 [INFO][5124] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6" Mar 12 04:49:35.576659 containerd[1510]: time="2026-03-12T04:49:35.575306032Z" level=info msg="TearDown network for sandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\" successfully" Mar 12 04:49:35.582184 containerd[1510]: time="2026-03-12T04:49:35.582122098Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:35.582316 containerd[1510]: time="2026-03-12T04:49:35.582225109Z" level=info msg="RemovePodSandbox \"447432f3168f4e88bf4397122a80943201d70ececf32a6182e9141a4a676adf6\" returns successfully" Mar 12 04:49:35.584355 containerd[1510]: time="2026-03-12T04:49:35.584318075Z" level=info msg="StopPodSandbox for \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\"" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.689 [WARNING][5145] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9", ResourceVersion:"1024", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7", Pod:"coredns-66bc5c9577-wd9b5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8200a6d65fa", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.689 [INFO][5145] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.689 [INFO][5145] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" iface="eth0" netns="" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.689 [INFO][5145] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.689 [INFO][5145] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.851 [INFO][5152] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.854 [INFO][5152] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.854 [INFO][5152] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.872 [WARNING][5152] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.873 [INFO][5152] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.882 [INFO][5152] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:35.890664 containerd[1510]: 2026-03-12 04:49:35.886 [INFO][5145] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:35.893370 containerd[1510]: time="2026-03-12T04:49:35.890780347Z" level=info msg="TearDown network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\" successfully" Mar 12 04:49:35.893370 containerd[1510]: time="2026-03-12T04:49:35.890831255Z" level=info msg="StopPodSandbox for \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\" returns successfully" Mar 12 04:49:35.908572 containerd[1510]: time="2026-03-12T04:49:35.907850146Z" level=info msg="RemovePodSandbox for \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\"" Mar 12 04:49:35.908572 containerd[1510]: time="2026-03-12T04:49:35.907996279Z" level=info msg="Forcibly stopping sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\"" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.012 [WARNING][5166] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"dd0fb06c-f33e-47f9-b47a-58a6d0dd5bf9", ResourceVersion:"1024", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"b8e3bd130889b031d0a8ccee7eebb1f72d66745f3e681a42756414a31558f1d7", Pod:"coredns-66bc5c9577-wd9b5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.54.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8200a6d65fa", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.012 [INFO][5166] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.012 [INFO][5166] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" iface="eth0" netns="" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.012 [INFO][5166] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.012 [INFO][5166] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.076 [INFO][5173] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.080 [INFO][5173] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.080 [INFO][5173] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.106 [WARNING][5173] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.106 [INFO][5173] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" HandleID="k8s-pod-network.55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Workload="srv--1ee83.gb1.brightbox.com-k8s-coredns--66bc5c9577--wd9b5-eth0" Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.111 [INFO][5173] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:36.124628 containerd[1510]: 2026-03-12 04:49:36.116 [INFO][5166] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf" Mar 12 04:49:36.128411 containerd[1510]: time="2026-03-12T04:49:36.127116864Z" level=info msg="TearDown network for sandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\" successfully" Mar 12 04:49:36.138727 containerd[1510]: time="2026-03-12T04:49:36.138663521Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:36.139972 containerd[1510]: time="2026-03-12T04:49:36.139936850Z" level=info msg="RemovePodSandbox \"55e158798a44cb11d388ff1dfb3114bf422edf7aa4fe154d2fc636849a359cdf\" returns successfully" Mar 12 04:49:36.151393 containerd[1510]: time="2026-03-12T04:49:36.151247061Z" level=info msg="StopPodSandbox for \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\"" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.286 [WARNING][5203] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"5847ada8-9805-4f41-8520-9af174205760", ResourceVersion:"982", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889", Pod:"calico-apiserver-6fcb6485c6-mlm75", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib00f6146ae1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.287 [INFO][5203] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.287 [INFO][5203] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" iface="eth0" netns="" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.287 [INFO][5203] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.287 [INFO][5203] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.342 [INFO][5210] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.342 [INFO][5210] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.342 [INFO][5210] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.358 [WARNING][5210] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.358 [INFO][5210] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.360 [INFO][5210] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:36.370156 containerd[1510]: 2026-03-12 04:49:36.366 [INFO][5203] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.370156 containerd[1510]: time="2026-03-12T04:49:36.370076246Z" level=info msg="TearDown network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\" successfully" Mar 12 04:49:36.370156 containerd[1510]: time="2026-03-12T04:49:36.370157663Z" level=info msg="StopPodSandbox for \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\" returns successfully" Mar 12 04:49:36.382680 containerd[1510]: time="2026-03-12T04:49:36.382594451Z" level=info msg="RemovePodSandbox for \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\"" Mar 12 04:49:36.382680 containerd[1510]: time="2026-03-12T04:49:36.382659738Z" level=info msg="Forcibly stopping sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\"" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.492 [WARNING][5224] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"5847ada8-9805-4f41-8520-9af174205760", ResourceVersion:"982", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889", Pod:"calico-apiserver-6fcb6485c6-mlm75", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib00f6146ae1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.492 [INFO][5224] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.492 [INFO][5224] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" iface="eth0" netns="" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.492 [INFO][5224] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.492 [INFO][5224] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.569 [INFO][5231] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.569 [INFO][5231] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.569 [INFO][5231] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.587 [WARNING][5231] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.587 [INFO][5231] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" HandleID="k8s-pod-network.993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--mlm75-eth0" Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.591 [INFO][5231] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:36.598912 containerd[1510]: 2026-03-12 04:49:36.595 [INFO][5224] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117" Mar 12 04:49:36.599758 containerd[1510]: time="2026-03-12T04:49:36.598927446Z" level=info msg="TearDown network for sandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\" successfully" Mar 12 04:49:36.609782 containerd[1510]: time="2026-03-12T04:49:36.609247582Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:36.609782 containerd[1510]: time="2026-03-12T04:49:36.609352082Z" level=info msg="RemovePodSandbox \"993632f7227606b2b908836493df0b953e6690d8d39aacc8653bfefeccaf5117\" returns successfully" Mar 12 04:49:36.618456 containerd[1510]: time="2026-03-12T04:49:36.618401651Z" level=info msg="StopPodSandbox for \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\"" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.720 [WARNING][5245] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0", GenerateName:"calico-kube-controllers-98bbf8757-", Namespace:"calico-system", SelfLink:"", UID:"5d03062c-e112-45ee-991a-aaa27fc24b6c", ResourceVersion:"1048", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"98bbf8757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c", Pod:"calico-kube-controllers-98bbf8757-vp827", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.54.131/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6cbce439d2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.721 [INFO][5245] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.721 [INFO][5245] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" iface="eth0" netns="" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.721 [INFO][5245] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.721 [INFO][5245] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.779 [INFO][5252] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.780 [INFO][5252] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.780 [INFO][5252] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.804 [WARNING][5252] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.805 [INFO][5252] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.811 [INFO][5252] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:36.818620 containerd[1510]: 2026-03-12 04:49:36.815 [INFO][5245] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:36.820021 containerd[1510]: time="2026-03-12T04:49:36.818708422Z" level=info msg="TearDown network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\" successfully" Mar 12 04:49:36.820021 containerd[1510]: time="2026-03-12T04:49:36.818756220Z" level=info msg="StopPodSandbox for \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\" returns successfully" Mar 12 04:49:36.822086 containerd[1510]: time="2026-03-12T04:49:36.821970276Z" level=info msg="RemovePodSandbox for \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\"" Mar 12 04:49:36.822086 containerd[1510]: time="2026-03-12T04:49:36.822017838Z" level=info msg="Forcibly stopping sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\"" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:36.945 [WARNING][5267] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0", GenerateName:"calico-kube-controllers-98bbf8757-", Namespace:"calico-system", SelfLink:"", UID:"5d03062c-e112-45ee-991a-aaa27fc24b6c", ResourceVersion:"1048", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"98bbf8757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"cccb386a3f14c6623361b25c528b898641735514679da79a487ae5765c49236c", Pod:"calico-kube-controllers-98bbf8757-vp827", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.54.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie6cbce439d2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:36.945 [INFO][5267] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:36.945 [INFO][5267] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" iface="eth0" netns="" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:36.945 [INFO][5267] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:36.945 [INFO][5267] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.029 [INFO][5274] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.029 [INFO][5274] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.029 [INFO][5274] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.045 [WARNING][5274] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.045 [INFO][5274] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" HandleID="k8s-pod-network.52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--kube--controllers--98bbf8757--vp827-eth0" Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.048 [INFO][5274] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:37.055893 containerd[1510]: 2026-03-12 04:49:37.051 [INFO][5267] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9" Mar 12 04:49:37.058027 containerd[1510]: time="2026-03-12T04:49:37.056174260Z" level=info msg="TearDown network for sandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\" successfully" Mar 12 04:49:37.061649 containerd[1510]: time="2026-03-12T04:49:37.061610100Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:37.061835 containerd[1510]: time="2026-03-12T04:49:37.061790926Z" level=info msg="RemovePodSandbox \"52ed560d114488f0edc633c361768ba5ab1413d3850c331103d1f50d6200cff9\" returns successfully" Mar 12 04:49:37.071086 containerd[1510]: time="2026-03-12T04:49:37.070993065Z" level=info msg="StopPodSandbox for \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\"" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.165 [WARNING][5288] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"2e788078-9f19-4645-87ab-03575ffe2f01", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1", Pod:"calico-apiserver-6fcb6485c6-4gx67", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali195415107d1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.166 [INFO][5288] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.166 [INFO][5288] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" iface="eth0" netns="" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.166 [INFO][5288] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.166 [INFO][5288] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.229 [INFO][5295] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.232 [INFO][5295] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.232 [INFO][5295] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.245 [WARNING][5295] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.245 [INFO][5295] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.249 [INFO][5295] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:37.257411 containerd[1510]: 2026-03-12 04:49:37.252 [INFO][5288] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.260134 containerd[1510]: time="2026-03-12T04:49:37.260061158Z" level=info msg="TearDown network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\" successfully" Mar 12 04:49:37.260269 containerd[1510]: time="2026-03-12T04:49:37.260242335Z" level=info msg="StopPodSandbox for \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\" returns successfully" Mar 12 04:49:37.266441 containerd[1510]: time="2026-03-12T04:49:37.266375435Z" level=info msg="RemovePodSandbox for \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\"" Mar 12 04:49:37.266665 containerd[1510]: time="2026-03-12T04:49:37.266637854Z" level=info msg="Forcibly stopping sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\"" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.374 [WARNING][5310] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0", GenerateName:"calico-apiserver-6fcb6485c6-", Namespace:"calico-system", SelfLink:"", UID:"2e788078-9f19-4645-87ab-03575ffe2f01", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 48, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6fcb6485c6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-1ee83.gb1.brightbox.com", ContainerID:"4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1", Pod:"calico-apiserver-6fcb6485c6-4gx67", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.54.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali195415107d1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.374 [INFO][5310] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.374 [INFO][5310] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" iface="eth0" netns="" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.374 [INFO][5310] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.374 [INFO][5310] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.469 [INFO][5318] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.471 [INFO][5318] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.471 [INFO][5318] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.489 [WARNING][5318] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.489 [INFO][5318] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" HandleID="k8s-pod-network.16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Workload="srv--1ee83.gb1.brightbox.com-k8s-calico--apiserver--6fcb6485c6--4gx67-eth0" Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.496 [INFO][5318] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:49:37.502815 containerd[1510]: 2026-03-12 04:49:37.500 [INFO][5310] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159" Mar 12 04:49:37.507075 containerd[1510]: time="2026-03-12T04:49:37.506152348Z" level=info msg="TearDown network for sandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\" successfully" Mar 12 04:49:37.511893 containerd[1510]: time="2026-03-12T04:49:37.511830020Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:49:37.512206 containerd[1510]: time="2026-03-12T04:49:37.512175263Z" level=info msg="RemovePodSandbox \"16062534a495e33057120aaa70edfaaa70ef7fd1ce3c579998934fa3b14cd159\" returns successfully" Mar 12 04:49:38.119519 containerd[1510]: time="2026-03-12T04:49:38.119456875Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:38.121084 containerd[1510]: time="2026-03-12T04:49:38.120179824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 12 04:49:38.121705 containerd[1510]: time="2026-03-12T04:49:38.121670573Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:38.126753 containerd[1510]: time="2026-03-12T04:49:38.126709758Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:38.140507 containerd[1510]: time="2026-03-12T04:49:38.140441903Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 5.615444795s" Mar 12 04:49:38.140680 containerd[1510]: time="2026-03-12T04:49:38.140651229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 12 04:49:38.172882 containerd[1510]: time="2026-03-12T04:49:38.171659745Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 04:49:38.241476 containerd[1510]: time="2026-03-12T04:49:38.241386119Z" level=info msg="CreateContainer within sandbox \"a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 04:49:38.260408 containerd[1510]: time="2026-03-12T04:49:38.259992806Z" level=info msg="CreateContainer within sandbox \"a4c380ba879e8dc99b46da81e6e09c7cd5f83bb13ed27b89039823cfc65d8889\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bc33862f0aa526fb09af1fa34ec350f279f6f7e8218347911187872a5368a6bb\"" Mar 12 04:49:38.263122 containerd[1510]: time="2026-03-12T04:49:38.262874983Z" level=info msg="StartContainer for \"bc33862f0aa526fb09af1fa34ec350f279f6f7e8218347911187872a5368a6bb\"" Mar 12 04:49:38.396074 systemd[1]: run-containerd-runc-k8s.io-bc33862f0aa526fb09af1fa34ec350f279f6f7e8218347911187872a5368a6bb-runc.HSE3kl.mount: Deactivated successfully. Mar 12 04:49:38.415912 systemd[1]: Started cri-containerd-bc33862f0aa526fb09af1fa34ec350f279f6f7e8218347911187872a5368a6bb.scope - libcontainer container bc33862f0aa526fb09af1fa34ec350f279f6f7e8218347911187872a5368a6bb. 
Mar 12 04:49:38.516899 containerd[1510]: time="2026-03-12T04:49:38.516548552Z" level=info msg="StartContainer for \"bc33862f0aa526fb09af1fa34ec350f279f6f7e8218347911187872a5368a6bb\" returns successfully" Mar 12 04:49:38.545525 containerd[1510]: time="2026-03-12T04:49:38.545418141Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:38.547860 containerd[1510]: time="2026-03-12T04:49:38.547391056Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 12 04:49:38.552224 containerd[1510]: time="2026-03-12T04:49:38.552165019Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 380.447959ms" Mar 12 04:49:38.552362 containerd[1510]: time="2026-03-12T04:49:38.552335311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 12 04:49:38.557884 containerd[1510]: time="2026-03-12T04:49:38.557152181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 12 04:49:38.572836 containerd[1510]: time="2026-03-12T04:49:38.572326617Z" level=info msg="CreateContainer within sandbox \"4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 04:49:38.621989 containerd[1510]: time="2026-03-12T04:49:38.620025187Z" level=info msg="CreateContainer within sandbox \"4b05ba9ce115d56f03371661f837ef25fe4aaaf74a4c62de19090b86ef8f78f1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns 
container id \"c1b0aaccec565d63a148d9fecb7d6c1329893988aa09992e2ced0030b8274253\"" Mar 12 04:49:38.627426 containerd[1510]: time="2026-03-12T04:49:38.626642940Z" level=info msg="StartContainer for \"c1b0aaccec565d63a148d9fecb7d6c1329893988aa09992e2ced0030b8274253\"" Mar 12 04:49:38.774555 systemd[1]: Started cri-containerd-c1b0aaccec565d63a148d9fecb7d6c1329893988aa09992e2ced0030b8274253.scope - libcontainer container c1b0aaccec565d63a148d9fecb7d6c1329893988aa09992e2ced0030b8274253. Mar 12 04:49:38.807085 kubelet[2694]: I0312 04:49:38.749179 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-6fcb6485c6-mlm75" podStartSLOduration=32.639907017 podStartE2EDuration="43.733930586s" podCreationTimestamp="2026-03-12 04:48:55 +0000 UTC" firstStartedPulling="2026-03-12 04:49:27.070223705 +0000 UTC m=+53.559162032" lastFinishedPulling="2026-03-12 04:49:38.164247278 +0000 UTC m=+64.653185601" observedRunningTime="2026-03-12 04:49:38.688600221 +0000 UTC m=+65.177538557" watchObservedRunningTime="2026-03-12 04:49:38.733930586 +0000 UTC m=+65.222868916" Mar 12 04:49:38.943156 containerd[1510]: time="2026-03-12T04:49:38.942362278Z" level=info msg="StartContainer for \"c1b0aaccec565d63a148d9fecb7d6c1329893988aa09992e2ced0030b8274253\" returns successfully" Mar 12 04:49:39.631375 kubelet[2694]: I0312 04:49:39.631153 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:49:40.651760 kubelet[2694]: I0312 04:49:40.648837 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:49:42.852785 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2417767794.mount: Deactivated successfully. 
Mar 12 04:49:43.692424 containerd[1510]: time="2026-03-12T04:49:43.692301966Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:43.694305 containerd[1510]: time="2026-03-12T04:49:43.693652448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 12 04:49:43.695659 containerd[1510]: time="2026-03-12T04:49:43.695561591Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:43.703703 containerd[1510]: time="2026-03-12T04:49:43.703582185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:43.705507 containerd[1510]: time="2026-03-12T04:49:43.705232407Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 5.146824707s" Mar 12 04:49:43.705507 containerd[1510]: time="2026-03-12T04:49:43.705316073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 12 04:49:43.799715 containerd[1510]: time="2026-03-12T04:49:43.799653671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 12 04:49:43.952089 containerd[1510]: time="2026-03-12T04:49:43.951888755Z" level=info msg="CreateContainer within sandbox 
\"780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 12 04:49:43.983620 containerd[1510]: time="2026-03-12T04:49:43.983388792Z" level=info msg="CreateContainer within sandbox \"780ce5d02a168b5be15f51f3bbc5480d1e21cf540bc7f0374b631f82c42a9d0a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"503bea1bdb8fc5a864f1e6c1823aec03e1f2c5700416eec3446869a5f726b2e7\"" Mar 12 04:49:43.986622 containerd[1510]: time="2026-03-12T04:49:43.986564094Z" level=info msg="StartContainer for \"503bea1bdb8fc5a864f1e6c1823aec03e1f2c5700416eec3446869a5f726b2e7\"" Mar 12 04:49:44.201447 systemd[1]: Started cri-containerd-503bea1bdb8fc5a864f1e6c1823aec03e1f2c5700416eec3446869a5f726b2e7.scope - libcontainer container 503bea1bdb8fc5a864f1e6c1823aec03e1f2c5700416eec3446869a5f726b2e7. Mar 12 04:49:44.294997 containerd[1510]: time="2026-03-12T04:49:44.294561209Z" level=info msg="StartContainer for \"503bea1bdb8fc5a864f1e6c1823aec03e1f2c5700416eec3446869a5f726b2e7\" returns successfully" Mar 12 04:49:44.769111 kubelet[2694]: I0312 04:49:44.749933 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-6fcb6485c6-4gx67" podStartSLOduration=38.446400153 podStartE2EDuration="49.74989051s" podCreationTimestamp="2026-03-12 04:48:55 +0000 UTC" firstStartedPulling="2026-03-12 04:49:27.253173549 +0000 UTC m=+53.742111878" lastFinishedPulling="2026-03-12 04:49:38.556663913 +0000 UTC m=+65.045602235" observedRunningTime="2026-03-12 04:49:39.648324224 +0000 UTC m=+66.137262559" watchObservedRunningTime="2026-03-12 04:49:44.74989051 +0000 UTC m=+71.238828835" Mar 12 04:49:44.769111 kubelet[2694]: I0312 04:49:44.767771 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-fn29j" podStartSLOduration=33.630861045 podStartE2EDuration="49.767745821s" podCreationTimestamp="2026-03-12 04:48:55 +0000 UTC" 
firstStartedPulling="2026-03-12 04:49:27.639762411 +0000 UTC m=+54.128700734" lastFinishedPulling="2026-03-12 04:49:43.776647189 +0000 UTC m=+70.265585510" observedRunningTime="2026-03-12 04:49:44.747504605 +0000 UTC m=+71.236442949" watchObservedRunningTime="2026-03-12 04:49:44.767745821 +0000 UTC m=+71.256684156" Mar 12 04:49:46.398616 containerd[1510]: time="2026-03-12T04:49:46.398526102Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:46.405380 containerd[1510]: time="2026-03-12T04:49:46.404636563Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 12 04:49:46.408146 containerd[1510]: time="2026-03-12T04:49:46.407934742Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:46.425531 containerd[1510]: time="2026-03-12T04:49:46.425403278Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:46.430296 containerd[1510]: time="2026-03-12T04:49:46.430095105Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 2.6291964s" Mar 12 04:49:46.430296 containerd[1510]: time="2026-03-12T04:49:46.430152813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference 
\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 12 04:49:46.433127 containerd[1510]: time="2026-03-12T04:49:46.433073592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 12 04:49:46.470442 containerd[1510]: time="2026-03-12T04:49:46.470086992Z" level=info msg="CreateContainer within sandbox \"20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 12 04:49:46.514512 containerd[1510]: time="2026-03-12T04:49:46.514442886Z" level=info msg="CreateContainer within sandbox \"20e2df818e054bcaecd0aeda5295548c1b427f687563f933506144a1b8c9628a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"fc18f4f4868313fdfd112e6b548aa5bc221dbb86ed701e0a376d6d4685a25721\"" Mar 12 04:49:46.516414 containerd[1510]: time="2026-03-12T04:49:46.516104205Z" level=info msg="StartContainer for \"fc18f4f4868313fdfd112e6b548aa5bc221dbb86ed701e0a376d6d4685a25721\"" Mar 12 04:49:46.598877 systemd[1]: Started cri-containerd-fc18f4f4868313fdfd112e6b548aa5bc221dbb86ed701e0a376d6d4685a25721.scope - libcontainer container fc18f4f4868313fdfd112e6b548aa5bc221dbb86ed701e0a376d6d4685a25721. 
Mar 12 04:49:46.854855 containerd[1510]: time="2026-03-12T04:49:46.854726303Z" level=info msg="StartContainer for \"fc18f4f4868313fdfd112e6b548aa5bc221dbb86ed701e0a376d6d4685a25721\" returns successfully" Mar 12 04:49:47.169394 kubelet[2694]: I0312 04:49:47.162270 2694 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 12 04:49:47.172554 kubelet[2694]: I0312 04:49:47.172455 2694 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 12 04:49:48.209081 containerd[1510]: time="2026-03-12T04:49:48.208798209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:48.210698 containerd[1510]: time="2026-03-12T04:49:48.210291650Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 12 04:49:48.213073 containerd[1510]: time="2026-03-12T04:49:48.211318567Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:48.215212 containerd[1510]: time="2026-03-12T04:49:48.215174537Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:48.216633 containerd[1510]: time="2026-03-12T04:49:48.216587834Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.783395706s" Mar 12 04:49:48.216735 containerd[1510]: time="2026-03-12T04:49:48.216639229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 12 04:49:48.226191 containerd[1510]: time="2026-03-12T04:49:48.226108405Z" level=info msg="CreateContainer within sandbox \"3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 12 04:49:48.239965 containerd[1510]: time="2026-03-12T04:49:48.239756383Z" level=info msg="CreateContainer within sandbox \"3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d3f255054f6aa18be5e7b48c5a02fc33d57e85534481d5d3752782c59c4e33a7\"" Mar 12 04:49:48.243984 containerd[1510]: time="2026-03-12T04:49:48.242283901Z" level=info msg="StartContainer for \"d3f255054f6aa18be5e7b48c5a02fc33d57e85534481d5d3752782c59c4e33a7\"" Mar 12 04:49:48.308353 systemd[1]: Started cri-containerd-d3f255054f6aa18be5e7b48c5a02fc33d57e85534481d5d3752782c59c4e33a7.scope - libcontainer container d3f255054f6aa18be5e7b48c5a02fc33d57e85534481d5d3752782c59c4e33a7. Mar 12 04:49:48.383905 containerd[1510]: time="2026-03-12T04:49:48.383645078Z" level=info msg="StartContainer for \"d3f255054f6aa18be5e7b48c5a02fc33d57e85534481d5d3752782c59c4e33a7\" returns successfully" Mar 12 04:49:48.391136 containerd[1510]: time="2026-03-12T04:49:48.390953607Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 12 04:49:50.717839 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount101630092.mount: Deactivated successfully. 
Mar 12 04:49:50.740064 containerd[1510]: time="2026-03-12T04:49:50.739907648Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:50.742456 containerd[1510]: time="2026-03-12T04:49:50.742326479Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 12 04:49:50.743979 containerd[1510]: time="2026-03-12T04:49:50.743724462Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:50.747470 containerd[1510]: time="2026-03-12T04:49:50.747426962Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:49:50.748826 containerd[1510]: time="2026-03-12T04:49:50.748773976Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.357762376s" Mar 12 04:49:50.748971 containerd[1510]: time="2026-03-12T04:49:50.748941620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 12 04:49:50.766582 containerd[1510]: time="2026-03-12T04:49:50.766526091Z" level=info msg="CreateContainer within sandbox \"3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 12 04:49:50.812948 
containerd[1510]: time="2026-03-12T04:49:50.812862857Z" level=info msg="CreateContainer within sandbox \"3207425ae1fe0475a2ce0ccd5354d357d14edd1918113a38077c24a02e5e3691\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"2b8d12ea3f8c3d7f5526f6a1f06f744c0bf1adae1ca074336d62ea7672144e0d\"" Mar 12 04:49:50.814359 containerd[1510]: time="2026-03-12T04:49:50.814121358Z" level=info msg="StartContainer for \"2b8d12ea3f8c3d7f5526f6a1f06f744c0bf1adae1ca074336d62ea7672144e0d\"" Mar 12 04:49:50.915876 systemd[1]: Started cri-containerd-2b8d12ea3f8c3d7f5526f6a1f06f744c0bf1adae1ca074336d62ea7672144e0d.scope - libcontainer container 2b8d12ea3f8c3d7f5526f6a1f06f744c0bf1adae1ca074336d62ea7672144e0d. Mar 12 04:49:51.072184 containerd[1510]: time="2026-03-12T04:49:51.071336766Z" level=info msg="StartContainer for \"2b8d12ea3f8c3d7f5526f6a1f06f744c0bf1adae1ca074336d62ea7672144e0d\" returns successfully" Mar 12 04:49:51.508491 systemd[1]: Started sshd@9-10.230.23.190:22-20.161.92.111:43530.service - OpenSSH per-connection server daemon (20.161.92.111:43530). 
Mar 12 04:49:51.846337 kubelet[2694]: I0312 04:49:51.845885 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-66b7b7bdbb-xpt8w" podStartSLOduration=4.484060216 podStartE2EDuration="26.845848648s" podCreationTimestamp="2026-03-12 04:49:25 +0000 UTC" firstStartedPulling="2026-03-12 04:49:28.388388351 +0000 UTC m=+54.877326673" lastFinishedPulling="2026-03-12 04:49:50.750176778 +0000 UTC m=+77.239115105" observedRunningTime="2026-03-12 04:49:51.835098912 +0000 UTC m=+78.324037265" watchObservedRunningTime="2026-03-12 04:49:51.845848648 +0000 UTC m=+78.334786978" Mar 12 04:49:51.847307 kubelet[2694]: I0312 04:49:51.846358 2694 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xdpsf" podStartSLOduration=31.858913923 podStartE2EDuration="54.846349259s" podCreationTimestamp="2026-03-12 04:48:57 +0000 UTC" firstStartedPulling="2026-03-12 04:49:23.445253547 +0000 UTC m=+49.934191868" lastFinishedPulling="2026-03-12 04:49:46.432688883 +0000 UTC m=+72.921627204" observedRunningTime="2026-03-12 04:49:47.814889517 +0000 UTC m=+74.303827852" watchObservedRunningTime="2026-03-12 04:49:51.846349259 +0000 UTC m=+78.335287590" Mar 12 04:49:52.178341 sshd[5652]: Accepted publickey for core from 20.161.92.111 port 43530 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:49:52.182727 sshd[5652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:49:52.195995 systemd-logind[1487]: New session 12 of user core. Mar 12 04:49:52.204598 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 12 04:49:53.350545 sshd[5652]: pam_unix(sshd:session): session closed for user core Mar 12 04:49:53.357966 systemd[1]: sshd@9-10.230.23.190:22-20.161.92.111:43530.service: Deactivated successfully. Mar 12 04:49:53.361615 systemd[1]: session-12.scope: Deactivated successfully. 
Mar 12 04:49:53.365572 systemd-logind[1487]: Session 12 logged out. Waiting for processes to exit. Mar 12 04:49:53.368569 systemd-logind[1487]: Removed session 12. Mar 12 04:49:58.458453 systemd[1]: Started sshd@10-10.230.23.190:22-20.161.92.111:43536.service - OpenSSH per-connection server daemon (20.161.92.111:43536). Mar 12 04:49:59.154286 sshd[5697]: Accepted publickey for core from 20.161.92.111 port 43536 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:49:59.155263 sshd[5697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:49:59.163007 systemd-logind[1487]: New session 13 of user core. Mar 12 04:49:59.174650 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 12 04:49:59.895910 sshd[5697]: pam_unix(sshd:session): session closed for user core Mar 12 04:49:59.908465 systemd[1]: sshd@10-10.230.23.190:22-20.161.92.111:43536.service: Deactivated successfully. Mar 12 04:49:59.912581 systemd[1]: session-13.scope: Deactivated successfully. Mar 12 04:49:59.916949 systemd-logind[1487]: Session 13 logged out. Waiting for processes to exit. Mar 12 04:49:59.920897 systemd-logind[1487]: Removed session 13. Mar 12 04:50:05.009645 systemd[1]: Started sshd@11-10.230.23.190:22-20.161.92.111:47114.service - OpenSSH per-connection server daemon (20.161.92.111:47114). Mar 12 04:50:05.982872 sshd[5729]: Accepted publickey for core from 20.161.92.111 port 47114 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:05.986710 sshd[5729]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:05.996174 systemd-logind[1487]: New session 14 of user core. Mar 12 04:50:06.005341 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 12 04:50:06.554469 sshd[5729]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:06.560538 systemd-logind[1487]: Session 14 logged out. Waiting for processes to exit. 
Mar 12 04:50:06.561922 systemd[1]: sshd@11-10.230.23.190:22-20.161.92.111:47114.service: Deactivated successfully. Mar 12 04:50:06.564486 systemd[1]: session-14.scope: Deactivated successfully. Mar 12 04:50:06.566346 systemd-logind[1487]: Removed session 14. Mar 12 04:50:07.009831 kubelet[2694]: I0312 04:50:07.009755 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:50:10.577686 kubelet[2694]: I0312 04:50:10.577595 2694 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:50:11.654439 systemd[1]: Started sshd@12-10.230.23.190:22-20.161.92.111:37956.service - OpenSSH per-connection server daemon (20.161.92.111:37956). Mar 12 04:50:12.313877 sshd[5780]: Accepted publickey for core from 20.161.92.111 port 37956 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:12.316874 sshd[5780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:12.327914 systemd-logind[1487]: New session 15 of user core. Mar 12 04:50:12.336434 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 12 04:50:13.012241 sshd[5780]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:13.018096 systemd-logind[1487]: Session 15 logged out. Waiting for processes to exit. Mar 12 04:50:13.020256 systemd[1]: sshd@12-10.230.23.190:22-20.161.92.111:37956.service: Deactivated successfully. Mar 12 04:50:13.023347 systemd[1]: session-15.scope: Deactivated successfully. Mar 12 04:50:13.025411 systemd-logind[1487]: Removed session 15. Mar 12 04:50:13.117546 systemd[1]: Started sshd@13-10.230.23.190:22-20.161.92.111:37964.service - OpenSSH per-connection server daemon (20.161.92.111:37964). 
Mar 12 04:50:13.690968 sshd[5795]: Accepted publickey for core from 20.161.92.111 port 37964 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:13.694829 sshd[5795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:13.706642 systemd-logind[1487]: New session 16 of user core. Mar 12 04:50:13.713760 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 12 04:50:14.296423 sshd[5795]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:14.304583 systemd[1]: sshd@13-10.230.23.190:22-20.161.92.111:37964.service: Deactivated successfully. Mar 12 04:50:14.309959 systemd[1]: session-16.scope: Deactivated successfully. Mar 12 04:50:14.311846 systemd-logind[1487]: Session 16 logged out. Waiting for processes to exit. Mar 12 04:50:14.313720 systemd-logind[1487]: Removed session 16. Mar 12 04:50:14.399643 systemd[1]: Started sshd@14-10.230.23.190:22-20.161.92.111:37968.service - OpenSSH per-connection server daemon (20.161.92.111:37968). Mar 12 04:50:14.985698 sshd[5805]: Accepted publickey for core from 20.161.92.111 port 37968 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:14.991494 sshd[5805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:14.999323 systemd-logind[1487]: New session 17 of user core. Mar 12 04:50:15.004310 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 12 04:50:15.563200 sshd[5805]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:15.570571 systemd[1]: sshd@14-10.230.23.190:22-20.161.92.111:37968.service: Deactivated successfully. Mar 12 04:50:15.574360 systemd[1]: session-17.scope: Deactivated successfully. Mar 12 04:50:15.576152 systemd-logind[1487]: Session 17 logged out. Waiting for processes to exit. Mar 12 04:50:15.577963 systemd-logind[1487]: Removed session 17. 
Mar 12 04:50:20.681732 systemd[1]: Started sshd@15-10.230.23.190:22-20.161.92.111:43554.service - OpenSSH per-connection server daemon (20.161.92.111:43554). Mar 12 04:50:21.299074 sshd[5857]: Accepted publickey for core from 20.161.92.111 port 43554 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:21.301633 sshd[5857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:21.309057 systemd-logind[1487]: New session 18 of user core. Mar 12 04:50:21.314606 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 12 04:50:22.027018 sshd[5857]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:22.033785 systemd[1]: sshd@15-10.230.23.190:22-20.161.92.111:43554.service: Deactivated successfully. Mar 12 04:50:22.038182 systemd[1]: session-18.scope: Deactivated successfully. Mar 12 04:50:22.040002 systemd-logind[1487]: Session 18 logged out. Waiting for processes to exit. Mar 12 04:50:22.041903 systemd-logind[1487]: Removed session 18. Mar 12 04:50:22.140784 systemd[1]: Started sshd@16-10.230.23.190:22-20.161.92.111:43562.service - OpenSSH per-connection server daemon (20.161.92.111:43562). Mar 12 04:50:22.750186 sshd[5869]: Accepted publickey for core from 20.161.92.111 port 43562 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:22.752673 sshd[5869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:22.760783 systemd-logind[1487]: New session 19 of user core. Mar 12 04:50:22.767355 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 12 04:50:23.644934 sshd[5869]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:23.656371 systemd[1]: sshd@16-10.230.23.190:22-20.161.92.111:43562.service: Deactivated successfully. Mar 12 04:50:23.659056 systemd[1]: session-19.scope: Deactivated successfully. Mar 12 04:50:23.660717 systemd-logind[1487]: Session 19 logged out. Waiting for processes to exit. 
Mar 12 04:50:23.662149 systemd-logind[1487]: Removed session 19. Mar 12 04:50:23.755718 systemd[1]: Started sshd@17-10.230.23.190:22-20.161.92.111:43574.service - OpenSSH per-connection server daemon (20.161.92.111:43574). Mar 12 04:50:24.371389 sshd[5880]: Accepted publickey for core from 20.161.92.111 port 43574 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:24.374087 sshd[5880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:24.383445 systemd-logind[1487]: New session 20 of user core. Mar 12 04:50:24.391460 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 12 04:50:25.723181 sshd[5880]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:25.734265 systemd[1]: sshd@17-10.230.23.190:22-20.161.92.111:43574.service: Deactivated successfully. Mar 12 04:50:25.741522 systemd[1]: session-20.scope: Deactivated successfully. Mar 12 04:50:25.749255 systemd-logind[1487]: Session 20 logged out. Waiting for processes to exit. Mar 12 04:50:25.751511 systemd-logind[1487]: Removed session 20. Mar 12 04:50:25.847493 systemd[1]: Started sshd@18-10.230.23.190:22-20.161.92.111:43584.service - OpenSSH per-connection server daemon (20.161.92.111:43584). Mar 12 04:50:26.483506 sshd[5910]: Accepted publickey for core from 20.161.92.111 port 43584 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:26.486085 sshd[5910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:26.494725 systemd-logind[1487]: New session 21 of user core. Mar 12 04:50:26.499302 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 12 04:50:27.601814 sshd[5910]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:27.612013 systemd-logind[1487]: Session 21 logged out. Waiting for processes to exit. Mar 12 04:50:27.613068 systemd[1]: sshd@18-10.230.23.190:22-20.161.92.111:43584.service: Deactivated successfully. 
Mar 12 04:50:27.616397 systemd[1]: session-21.scope: Deactivated successfully. Mar 12 04:50:27.618727 systemd-logind[1487]: Removed session 21. Mar 12 04:50:27.708570 systemd[1]: Started sshd@19-10.230.23.190:22-20.161.92.111:43586.service - OpenSSH per-connection server daemon (20.161.92.111:43586). Mar 12 04:50:28.312334 sshd[5923]: Accepted publickey for core from 20.161.92.111 port 43586 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:28.322382 sshd[5923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:28.332276 systemd-logind[1487]: New session 22 of user core. Mar 12 04:50:28.337460 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 12 04:50:28.956126 sshd[5923]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:28.961668 systemd-logind[1487]: Session 22 logged out. Waiting for processes to exit. Mar 12 04:50:28.962907 systemd[1]: sshd@19-10.230.23.190:22-20.161.92.111:43586.service: Deactivated successfully. Mar 12 04:50:28.966733 systemd[1]: session-22.scope: Deactivated successfully. Mar 12 04:50:28.970724 systemd-logind[1487]: Removed session 22. Mar 12 04:50:31.982386 systemd[1]: run-containerd-runc-k8s.io-290354f6144cd564845ed5d6e9f271dcc2fe6635c6c109465c6cadc19426e2e5-runc.NA7vDS.mount: Deactivated successfully. Mar 12 04:50:34.071751 systemd[1]: Started sshd@20-10.230.23.190:22-20.161.92.111:43150.service - OpenSSH per-connection server daemon (20.161.92.111:43150). Mar 12 04:50:34.712088 sshd[6001]: Accepted publickey for core from 20.161.92.111 port 43150 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:34.714451 sshd[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:34.721844 systemd-logind[1487]: New session 23 of user core. Mar 12 04:50:34.730289 systemd[1]: Started session-23.scope - Session 23 of User core. 
Mar 12 04:50:35.418930 sshd[6001]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:35.427324 systemd-logind[1487]: Session 23 logged out. Waiting for processes to exit. Mar 12 04:50:35.427754 systemd[1]: sshd@20-10.230.23.190:22-20.161.92.111:43150.service: Deactivated successfully. Mar 12 04:50:35.430478 systemd[1]: session-23.scope: Deactivated successfully. Mar 12 04:50:35.432343 systemd-logind[1487]: Removed session 23. Mar 12 04:50:40.525530 systemd[1]: Started sshd@21-10.230.23.190:22-20.161.92.111:36932.service - OpenSSH per-connection server daemon (20.161.92.111:36932). Mar 12 04:50:41.116793 sshd[6016]: Accepted publickey for core from 20.161.92.111 port 36932 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:50:41.118987 sshd[6016]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:50:41.127213 systemd-logind[1487]: New session 24 of user core. Mar 12 04:50:41.136313 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 12 04:50:41.750909 sshd[6016]: pam_unix(sshd:session): session closed for user core Mar 12 04:50:41.756266 systemd-logind[1487]: Session 24 logged out. Waiting for processes to exit. Mar 12 04:50:41.756697 systemd[1]: sshd@21-10.230.23.190:22-20.161.92.111:36932.service: Deactivated successfully. Mar 12 04:50:41.759881 systemd[1]: session-24.scope: Deactivated successfully. Mar 12 04:50:41.761755 systemd-logind[1487]: Removed session 24.