Mar 12 04:18:33.936254 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Mar 11 23:23:33 -00 2026
Mar 12 04:18:33.936290 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=0e4243d51ac00bffbb09a606c7378a821ca08f30dbebc6b82c4452fcc120d7bc
Mar 12 04:18:33.936300 kernel: BIOS-provided physical RAM map:
Mar 12 04:18:33.936311 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 12 04:18:33.936318 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 12 04:18:33.936325 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 12 04:18:33.936334 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Mar 12 04:18:33.936342 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Mar 12 04:18:33.936349 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 12 04:18:33.936356 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 12 04:18:33.936364 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 12 04:18:33.936371 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 12 04:18:33.936382 kernel: NX (Execute Disable) protection: active
Mar 12 04:18:33.936389 kernel: APIC: Static calls initialized
Mar 12 04:18:33.936399 kernel: SMBIOS 2.8 present.
Mar 12 04:18:33.936408 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Mar 12 04:18:33.936416 kernel: Hypervisor detected: KVM
Mar 12 04:18:33.936428 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 12 04:18:33.936436 kernel: kvm-clock: using sched offset of 3852130496 cycles
Mar 12 04:18:33.936445 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 12 04:18:33.936454 kernel: tsc: Detected 2294.576 MHz processor
Mar 12 04:18:33.936463 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 12 04:18:33.936472 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 12 04:18:33.936480 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 12 04:18:33.936489 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 12 04:18:33.936497 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 12 04:18:33.936508 kernel: Using GB pages for direct mapping
Mar 12 04:18:33.936517 kernel: ACPI: Early table checksum verification disabled
Mar 12 04:18:33.936525 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Mar 12 04:18:33.936534 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936542 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936551 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936559 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Mar 12 04:18:33.936568 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936576 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936588 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936596 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 12 04:18:33.936605 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Mar 12 04:18:33.936613 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Mar 12 04:18:33.936622 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Mar 12 04:18:33.936635 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Mar 12 04:18:33.936644 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Mar 12 04:18:33.936655 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Mar 12 04:18:33.936664 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Mar 12 04:18:33.936673 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 12 04:18:33.936682 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 12 04:18:33.936691 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 12 04:18:33.936699 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Mar 12 04:18:33.936708 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 12 04:18:33.936720 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Mar 12 04:18:33.936737 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 12 04:18:33.936746 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Mar 12 04:18:33.936755 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 12 04:18:33.936764 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Mar 12 04:18:33.936773 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 12 04:18:33.936781 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Mar 12 04:18:33.936790 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 12 04:18:33.936799 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Mar 12 04:18:33.936808 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 12 04:18:33.936820 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Mar 12 04:18:33.936829 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 12 04:18:33.936838 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 12 04:18:33.936847 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Mar 12 04:18:33.936856 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Mar 12 04:18:33.936866 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Mar 12 04:18:33.936874 kernel: Zone ranges:
Mar 12 04:18:33.936883 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 12 04:18:33.936892 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Mar 12 04:18:33.936904 kernel: Normal empty
Mar 12 04:18:33.936913 kernel: Movable zone start for each node
Mar 12 04:18:33.936922 kernel: Early memory node ranges
Mar 12 04:18:33.936931 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 12 04:18:33.936940 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Mar 12 04:18:33.936949 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Mar 12 04:18:33.936958 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 12 04:18:33.936967 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 12 04:18:33.936976 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Mar 12 04:18:33.936985 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 12 04:18:33.936996 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 12 04:18:33.937005 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 12 04:18:33.937014 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 12 04:18:33.937023 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 12 04:18:33.937033 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 12 04:18:33.937042 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 12 04:18:33.937051 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 12 04:18:33.937060 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 12 04:18:33.937069 kernel: TSC deadline timer available
Mar 12 04:18:33.937081 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Mar 12 04:18:33.937090 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 12 04:18:33.937099 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 12 04:18:33.937108 kernel: Booting paravirtualized kernel on KVM
Mar 12 04:18:33.937117 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 12 04:18:33.937126 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 12 04:18:33.937135 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u262144
Mar 12 04:18:33.937144 kernel: pcpu-alloc: s196328 r8192 d28952 u262144 alloc=1*2097152
Mar 12 04:18:33.937152 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 12 04:18:33.937187 kernel: kvm-guest: PV spinlocks enabled
Mar 12 04:18:33.937196 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 12 04:18:33.937206 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=0e4243d51ac00bffbb09a606c7378a821ca08f30dbebc6b82c4452fcc120d7bc
Mar 12 04:18:33.937215 kernel: random: crng init done
Mar 12 04:18:33.937224 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 12 04:18:33.937233 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 12 04:18:33.937242 kernel: Fallback order for Node 0: 0
Mar 12 04:18:33.937252 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Mar 12 04:18:33.937264 kernel: Policy zone: DMA32
Mar 12 04:18:33.937273 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 12 04:18:33.937282 kernel: software IO TLB: area num 16.
Mar 12 04:18:33.937292 kernel: Memory: 1901592K/2096616K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 194764K reserved, 0K cma-reserved)
Mar 12 04:18:33.937301 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 12 04:18:33.937310 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 12 04:18:33.937319 kernel: ftrace: allocated 149 pages with 4 groups
Mar 12 04:18:33.937328 kernel: Dynamic Preempt: voluntary
Mar 12 04:18:33.937337 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 12 04:18:33.937350 kernel: rcu: RCU event tracing is enabled.
Mar 12 04:18:33.937359 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 12 04:18:33.937369 kernel: Trampoline variant of Tasks RCU enabled.
Mar 12 04:18:33.937378 kernel: Rude variant of Tasks RCU enabled.
Mar 12 04:18:33.937388 kernel: Tracing variant of Tasks RCU enabled.
Mar 12 04:18:33.937406 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 12 04:18:33.937418 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 12 04:18:33.937428 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Mar 12 04:18:33.937438 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 12 04:18:33.937447 kernel: Console: colour VGA+ 80x25
Mar 12 04:18:33.937457 kernel: printk: console [tty0] enabled
Mar 12 04:18:33.937466 kernel: printk: console [ttyS0] enabled
Mar 12 04:18:33.937478 kernel: ACPI: Core revision 20230628
Mar 12 04:18:33.937488 kernel: APIC: Switch to symmetric I/O mode setup
Mar 12 04:18:33.937498 kernel: x2apic enabled
Mar 12 04:18:33.937507 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 12 04:18:33.937517 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2113312ac93, max_idle_ns: 440795244843 ns
Mar 12 04:18:33.937529 kernel: Calibrating delay loop (skipped) preset value.. 4589.15 BogoMIPS (lpj=2294576)
Mar 12 04:18:33.937539 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 12 04:18:33.937549 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 12 04:18:33.937559 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 12 04:18:33.937568 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 12 04:18:33.937578 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Mar 12 04:18:33.937587 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Mar 12 04:18:33.937597 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 12 04:18:33.937607 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Mar 12 04:18:33.937616 kernel: RETBleed: Mitigation: Enhanced IBRS
Mar 12 04:18:33.937630 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 12 04:18:33.937639 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 12 04:18:33.937649 kernel: TAA: Mitigation: Clear CPU buffers
Mar 12 04:18:33.937658 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 12 04:18:33.937668 kernel: GDS: Unknown: Dependent on hypervisor status
Mar 12 04:18:33.937677 kernel: active return thunk: its_return_thunk
Mar 12 04:18:33.937687 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 12 04:18:33.937696 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 12 04:18:33.937706 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 12 04:18:33.937715 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 12 04:18:33.937725 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 12 04:18:33.937744 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 12 04:18:33.937753 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 12 04:18:33.937763 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 12 04:18:33.937772 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 12 04:18:33.937782 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 12 04:18:33.937791 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 12 04:18:33.937801 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 12 04:18:33.937810 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 12 04:18:33.937820 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 12 04:18:33.937829 kernel: Freeing SMP alternatives memory: 32K
Mar 12 04:18:33.937839 kernel: pid_max: default: 32768 minimum: 301
Mar 12 04:18:33.937852 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 12 04:18:33.937861 kernel: landlock: Up and running.
Mar 12 04:18:33.937871 kernel: SELinux: Initializing.
Mar 12 04:18:33.937880 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 12 04:18:33.937890 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 12 04:18:33.937899 kernel: smpboot: CPU0: Intel Xeon Processor (Cascadelake) (family: 0x6, model: 0x55, stepping: 0x6)
Mar 12 04:18:33.937909 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 12 04:18:33.937919 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 12 04:18:33.937929 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 12 04:18:33.937940 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Mar 12 04:18:33.937952 kernel: signal: max sigframe size: 3632
Mar 12 04:18:33.937962 kernel: rcu: Hierarchical SRCU implementation.
Mar 12 04:18:33.937972 kernel: rcu: Max phase no-delay instances is 400.
Mar 12 04:18:33.937981 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 12 04:18:33.937991 kernel: smp: Bringing up secondary CPUs ...
Mar 12 04:18:33.938001 kernel: smpboot: x86: Booting SMP configuration:
Mar 12 04:18:33.938011 kernel: .... node #0, CPUs: #1
Mar 12 04:18:33.938021 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Mar 12 04:18:33.938030 kernel: smp: Brought up 1 node, 2 CPUs
Mar 12 04:18:33.938043 kernel: smpboot: Max logical packages: 16
Mar 12 04:18:33.938052 kernel: smpboot: Total of 2 processors activated (9178.30 BogoMIPS)
Mar 12 04:18:33.938062 kernel: devtmpfs: initialized
Mar 12 04:18:33.938072 kernel: x86/mm: Memory block size: 128MB
Mar 12 04:18:33.938081 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 12 04:18:33.938091 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 12 04:18:33.938100 kernel: pinctrl core: initialized pinctrl subsystem
Mar 12 04:18:33.938110 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 12 04:18:33.938120 kernel: audit: initializing netlink subsys (disabled)
Mar 12 04:18:33.938129 kernel: audit: type=2000 audit(1773289113.206:1): state=initialized audit_enabled=0 res=1
Mar 12 04:18:33.938142 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 12 04:18:33.938152 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 12 04:18:33.938168 kernel: cpuidle: using governor menu
Mar 12 04:18:33.938186 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 12 04:18:33.938195 kernel: dca service started, version 1.12.1
Mar 12 04:18:33.938205 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 12 04:18:33.938215 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 12 04:18:33.938224 kernel: PCI: Using configuration type 1 for base access
Mar 12 04:18:33.938237 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 12 04:18:33.938247 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 12 04:18:33.938257 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 12 04:18:33.938267 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 12 04:18:33.938276 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 12 04:18:33.938286 kernel: ACPI: Added _OSI(Module Device)
Mar 12 04:18:33.938295 kernel: ACPI: Added _OSI(Processor Device)
Mar 12 04:18:33.938305 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 12 04:18:33.938315 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 12 04:18:33.938327 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 12 04:18:33.938337 kernel: ACPI: Interpreter enabled
Mar 12 04:18:33.938347 kernel: ACPI: PM: (supports S0 S5)
Mar 12 04:18:33.938356 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 12 04:18:33.938366 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 12 04:18:33.938376 kernel: PCI: Using E820 reservations for host bridge windows
Mar 12 04:18:33.938385 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 12 04:18:33.938395 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 12 04:18:33.938577 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 12 04:18:33.938687 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 12 04:18:33.938786 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 12 04:18:33.938799 kernel: PCI host bridge to bus 0000:00
Mar 12 04:18:33.938907 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 12 04:18:33.938993 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 12 04:18:33.939079 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 12 04:18:33.939198 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Mar 12 04:18:33.939283 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 12 04:18:33.939365 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Mar 12 04:18:33.939449 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 12 04:18:33.939574 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 12 04:18:33.939689 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Mar 12 04:18:33.939792 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Mar 12 04:18:33.939894 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Mar 12 04:18:33.939988 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Mar 12 04:18:33.940081 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 12 04:18:33.940204 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.940304 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Mar 12 04:18:33.940409 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.940511 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Mar 12 04:18:33.940614 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.940710 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Mar 12 04:18:33.940821 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.940916 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Mar 12 04:18:33.941018 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.941118 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Mar 12 04:18:33.941246 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.941339 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Mar 12 04:18:33.941439 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.941531 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Mar 12 04:18:33.941637 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 12 04:18:33.941738 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Mar 12 04:18:33.941845 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 12 04:18:33.941941 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 12 04:18:33.942037 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Mar 12 04:18:33.942131 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 12 04:18:33.942242 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Mar 12 04:18:33.942345 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 12 04:18:33.942438 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 12 04:18:33.942538 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Mar 12 04:18:33.942634 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Mar 12 04:18:33.942745 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 12 04:18:33.942840 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 12 04:18:33.942941 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 12 04:18:33.943034 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Mar 12 04:18:33.943132 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Mar 12 04:18:33.943272 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 12 04:18:33.943366 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 12 04:18:33.943473 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Mar 12 04:18:33.943569 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Mar 12 04:18:33.943663 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 12 04:18:33.943774 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 12 04:18:33.943868 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 12 04:18:33.943970 kernel: pci_bus 0000:02: extended config space not accessible
Mar 12 04:18:33.944087 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Mar 12 04:18:33.944208 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Mar 12 04:18:33.944307 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 12 04:18:33.944404 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 12 04:18:33.944515 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 12 04:18:33.944613 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Mar 12 04:18:33.944711 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 12 04:18:33.944816 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 12 04:18:33.944910 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 12 04:18:33.945016 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 12 04:18:33.945115 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 12 04:18:33.945269 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 12 04:18:33.945361 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 12 04:18:33.945451 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 12 04:18:33.945543 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 12 04:18:33.945634 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 12 04:18:33.945724 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 12 04:18:33.945825 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 12 04:18:33.945917 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 12 04:18:33.946016 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 12 04:18:33.946109 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 12 04:18:33.946217 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 12 04:18:33.946311 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 12 04:18:33.946404 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 12 04:18:33.946495 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 12 04:18:33.946587 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 12 04:18:33.946684 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 12 04:18:33.946790 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 12 04:18:33.946883 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 12 04:18:33.946896 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 12 04:18:33.946907 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 12 04:18:33.946917 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 12 04:18:33.946927 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 12 04:18:33.946937 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 12 04:18:33.946947 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 12 04:18:33.946957 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 12 04:18:33.946971 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 12 04:18:33.946981 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 12 04:18:33.946991 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 12 04:18:33.947001 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 12 04:18:33.947011 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 12 04:18:33.947020 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 12 04:18:33.947030 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 12 04:18:33.947040 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 12 04:18:33.947049 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 12 04:18:33.947063 kernel: iommu: Default domain type: Translated
Mar 12 04:18:33.947072 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 12 04:18:33.947082 kernel: PCI: Using ACPI for IRQ routing
Mar 12 04:18:33.947092 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 12 04:18:33.947102 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 12 04:18:33.947111 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Mar 12 04:18:33.947257 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 12 04:18:33.947351 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 12 04:18:33.947447 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 12 04:18:33.947461 kernel: vgaarb: loaded
Mar 12 04:18:33.947471 kernel: clocksource: Switched to clocksource kvm-clock
Mar 12 04:18:33.947481 kernel: VFS: Disk quotas dquot_6.6.0
Mar 12 04:18:33.947491 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 12 04:18:33.947501 kernel: pnp: PnP ACPI init
Mar 12 04:18:33.947602 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 12 04:18:33.947616 kernel: pnp: PnP ACPI: found 5 devices
Mar 12 04:18:33.947631 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 12 04:18:33.947641 kernel: NET: Registered PF_INET protocol family
Mar 12 04:18:33.947650 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 12 04:18:33.947660 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 12 04:18:33.947670 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 12 04:18:33.947680 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 12 04:18:33.947690 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 12 04:18:33.947700 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 12 04:18:33.947710 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 12 04:18:33.947724 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 12 04:18:33.947739 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 12 04:18:33.947749 kernel: NET: Registered PF_XDP protocol family
Mar 12 04:18:33.947844 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Mar 12 04:18:33.947938 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 12 04:18:33.948032 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 12 04:18:33.948126 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 12 04:18:33.948243 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 12 04:18:33.948336 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 12 04:18:33.948429 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 12 04:18:33.948527 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 12 04:18:33.948647 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Mar 12 04:18:33.948783 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Mar 12 04:18:33.948884 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Mar 12 04:18:33.948979 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Mar 12 04:18:33.949071 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Mar 12 04:18:33.949222 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Mar 12 04:18:33.949323 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Mar 12 04:18:33.949416 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Mar 12 04:18:33.949513 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 12 04:18:33.949607 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 12 04:18:33.949705 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 12 04:18:33.949805 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 12 04:18:33.949899 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 12 04:18:33.949996 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 12 04:18:33.950090 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 12 04:18:33.950270 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 12 04:18:33.950375 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 12 04:18:33.950468 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 12 04:18:33.950561 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 12 04:18:33.950656 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 12 04:18:33.950760 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 12 04:18:33.950853 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 12 04:18:33.950947 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 12 04:18:33.951041 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 12 04:18:33.951138 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 12 04:18:33.951297 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 12 04:18:33.951393 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 12 04:18:33.951488 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 12 04:18:33.951582 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 12 04:18:33.951675 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 12 04:18:33.951779 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 12 04:18:33.951873 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 12 04:18:33.951967 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 12 04:18:33.952060 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 12 04:18:33.952165 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 12 04:18:33.952269 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 12 04:18:33.952362 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 12 04:18:33.952453 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 12 04:18:33.952546 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 12 04:18:33.952644 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 12 04:18:33.952744 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 12 04:18:33.952838 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 12 04:18:33.952932 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 12 04:18:33.953016 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 12 04:18:33.953100 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 12 04:18:33.953233 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Mar 12 04:18:33.953316 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 12 04:18:33.953398 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Mar 12 04:18:33.953499 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 12 04:18:33.953589 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Mar 12 04:18:33.953675 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 12 04:18:33.953782 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 12 04:18:33.953878 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Mar 12 04:18:33.953966 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 12 04:18:33.954057 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 12 04:18:33.954152 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Mar 12 04:18:33.956331 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 12 04:18:33.956427 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 12 04:18:33.956535 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Mar 12 04:18:33.956625 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 12 04:18:33.956713 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 12 04:18:33.956834 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Mar 12 04:18:33.956922 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 12 04:18:33.957009 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 12 04:18:33.957105 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Mar 12 04:18:33.958252 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Mar 12 04:18:33.958358 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 12 04:18:33.958458 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Mar 12 04:18:33.958553 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Mar 12 04:18:33.958640 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 12 04:18:33.958744 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Mar 12 04:18:33.958833 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Mar 12 04:18:33.958919 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 12 04:18:33.958939 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 12 04:18:33.958951 kernel: PCI: CLS 0 bytes, default 64
Mar 12 04:18:33.958966 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 12 04:18:33.958977 kernel: software IO TLB: mapped [mem 
0x0000000079800000-0x000000007d800000] (64MB) Mar 12 04:18:33.958987 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 12 04:18:33.958998 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2113312ac93, max_idle_ns: 440795244843 ns Mar 12 04:18:33.959010 kernel: Initialise system trusted keyrings Mar 12 04:18:33.959020 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 12 04:18:33.959031 kernel: Key type asymmetric registered Mar 12 04:18:33.959041 kernel: Asymmetric key parser 'x509' registered Mar 12 04:18:33.959055 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 12 04:18:33.959066 kernel: io scheduler mq-deadline registered Mar 12 04:18:33.959077 kernel: io scheduler kyber registered Mar 12 04:18:33.959087 kernel: io scheduler bfq registered Mar 12 04:18:33.960288 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 12 04:18:33.960405 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 12 04:18:33.960504 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.960605 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 12 04:18:33.960756 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 12 04:18:33.960929 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.961037 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 12 04:18:33.961132 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 12 04:18:33.961245 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.961349 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 12 04:18:33.961451 kernel: pcieport 0000:00:02.3: AER: enabled 
with IRQ 27 Mar 12 04:18:33.961544 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.961640 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 12 04:18:33.961741 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 12 04:18:33.961836 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.961933 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 12 04:18:33.962031 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 12 04:18:33.962124 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.964277 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 12 04:18:33.964386 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 12 04:18:33.964483 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.964583 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 12 04:18:33.964684 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 12 04:18:33.964788 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 12 04:18:33.964803 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 12 04:18:33.964815 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 12 04:18:33.964827 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 12 04:18:33.964838 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 12 04:18:33.964849 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 12 04:18:33.964864 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 
0x60,0x64 irq 1,12 Mar 12 04:18:33.964875 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 12 04:18:33.964886 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 12 04:18:33.964897 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 12 04:18:33.964999 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 12 04:18:33.965088 kernel: rtc_cmos 00:03: registered as rtc0 Mar 12 04:18:33.965238 kernel: rtc_cmos 00:03: setting system clock to 2026-03-12T04:18:33 UTC (1773289113) Mar 12 04:18:33.965327 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 12 04:18:33.965347 kernel: intel_pstate: CPU model not supported Mar 12 04:18:33.965358 kernel: NET: Registered PF_INET6 protocol family Mar 12 04:18:33.965368 kernel: Segment Routing with IPv6 Mar 12 04:18:33.965379 kernel: In-situ OAM (IOAM) with IPv6 Mar 12 04:18:33.965390 kernel: NET: Registered PF_PACKET protocol family Mar 12 04:18:33.965400 kernel: Key type dns_resolver registered Mar 12 04:18:33.965411 kernel: IPI shorthand broadcast: enabled Mar 12 04:18:33.965422 kernel: sched_clock: Marking stable (945002592, 126712994)->(1291657400, -219941814) Mar 12 04:18:33.965432 kernel: registered taskstats version 1 Mar 12 04:18:33.965447 kernel: Loading compiled-in X.509 certificates Mar 12 04:18:33.965457 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 67287262975845098ef9f337a0e8baa9afd38510' Mar 12 04:18:33.965467 kernel: Key type .fscrypt registered Mar 12 04:18:33.965478 kernel: Key type fscrypt-provisioning registered Mar 12 04:18:33.965488 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 12 04:18:33.965499 kernel: ima: Allocated hash algorithm: sha1
Mar 12 04:18:33.965510 kernel: ima: No architecture policies found
Mar 12 04:18:33.965520 kernel: clk: Disabling unused clocks
Mar 12 04:18:33.965530 kernel: Freeing unused kernel image (initmem) memory: 42892K
Mar 12 04:18:33.965545 kernel: Write protecting the kernel read-only data: 36864k
Mar 12 04:18:33.965555 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K
Mar 12 04:18:33.965566 kernel: Run /init as init process
Mar 12 04:18:33.965576 kernel: with arguments:
Mar 12 04:18:33.965587 kernel: /init
Mar 12 04:18:33.965598 kernel: with environment:
Mar 12 04:18:33.965608 kernel: HOME=/
Mar 12 04:18:33.965618 kernel: TERM=linux
Mar 12 04:18:33.965632 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 12 04:18:33.965650 systemd[1]: Detected virtualization kvm.
Mar 12 04:18:33.965662 systemd[1]: Detected architecture x86-64.
Mar 12 04:18:33.965672 systemd[1]: Running in initrd.
Mar 12 04:18:33.965683 systemd[1]: No hostname configured, using default hostname.
Mar 12 04:18:33.965694 systemd[1]: Hostname set to .
Mar 12 04:18:33.965705 systemd[1]: Initializing machine ID from VM UUID.
Mar 12 04:18:33.965716 systemd[1]: Queued start job for default target initrd.target.
Mar 12 04:18:33.965738 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 12 04:18:33.965753 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 12 04:18:33.965765 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 12 04:18:33.965777 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 12 04:18:33.965788 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 12 04:18:33.965799 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 12 04:18:33.965812 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 12 04:18:33.965826 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 12 04:18:33.965837 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 12 04:18:33.965849 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 04:18:33.965860 systemd[1]: Reached target paths.target - Path Units.
Mar 12 04:18:33.965872 systemd[1]: Reached target slices.target - Slice Units.
Mar 12 04:18:33.965882 systemd[1]: Reached target swap.target - Swaps.
Mar 12 04:18:33.965893 systemd[1]: Reached target timers.target - Timer Units.
Mar 12 04:18:33.965905 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 12 04:18:33.965915 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 12 04:18:33.965931 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 12 04:18:33.965942 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Mar 12 04:18:33.965953 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 12 04:18:33.965964 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 12 04:18:33.965975 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 12 04:18:33.965986 systemd[1]: Reached target sockets.target - Socket Units.
Mar 12 04:18:33.965997 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 12 04:18:33.966009 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 04:18:33.966023 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 12 04:18:33.966034 systemd[1]: Starting systemd-fsck-usr.service...
Mar 12 04:18:33.966045 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 04:18:33.966056 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 04:18:33.966068 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 04:18:33.966108 systemd-journald[202]: Collecting audit messages is disabled.
Mar 12 04:18:33.966140 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 12 04:18:33.966152 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 04:18:33.966172 systemd[1]: Finished systemd-fsck-usr.service.
Mar 12 04:18:33.966187 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 12 04:18:33.966208 systemd-journald[202]: Journal started
Mar 12 04:18:33.966233 systemd-journald[202]: Runtime Journal (/run/log/journal/3728667685a24618a746dc180dcab054) is 4.7M, max 38.0M, 33.2M free.
Mar 12 04:18:33.965997 systemd-modules-load[204]: Inserted module 'overlay'
Mar 12 04:18:33.970763 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 04:18:33.996177 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 12 04:18:33.998172 kernel: Bridge firewalling registered
Mar 12 04:18:33.998233 systemd-modules-load[204]: Inserted module 'br_netfilter'
Mar 12 04:18:34.001401 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 04:18:34.038636 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 04:18:34.040614 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 04:18:34.043150 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 12 04:18:34.052469 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 12 04:18:34.058362 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 04:18:34.061315 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 04:18:34.064495 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 04:18:34.073362 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 04:18:34.076872 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 04:18:34.083673 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 04:18:34.093361 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 04:18:34.106323 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 12 04:18:34.119131 systemd-resolved[233]: Positive Trust Anchors:
Mar 12 04:18:34.119821 systemd-resolved[233]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 04:18:34.119864 systemd-resolved[233]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 04:18:34.124696 dracut-cmdline[240]: dracut-dracut-053
Mar 12 04:18:34.126460 dracut-cmdline[240]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=0e4243d51ac00bffbb09a606c7378a821ca08f30dbebc6b82c4452fcc120d7bc
Mar 12 04:18:34.128133 systemd-resolved[233]: Defaulting to hostname 'linux'.
Mar 12 04:18:34.129851 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 04:18:34.130884 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 04:18:34.223235 kernel: SCSI subsystem initialized
Mar 12 04:18:34.234197 kernel: Loading iSCSI transport class v2.0-870.
Mar 12 04:18:34.246902 kernel: iscsi: registered transport (tcp)
Mar 12 04:18:34.271318 kernel: iscsi: registered transport (qla4xxx)
Mar 12 04:18:34.271498 kernel: QLogic iSCSI HBA Driver
Mar 12 04:18:34.355787 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 12 04:18:34.364348 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 12 04:18:34.399960 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 12 04:18:34.401436 kernel: device-mapper: uevent: version 1.0.3
Mar 12 04:18:34.401455 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 12 04:18:34.452234 kernel: raid6: avx512x4 gen() 17384 MB/s
Mar 12 04:18:34.469238 kernel: raid6: avx512x2 gen() 17466 MB/s
Mar 12 04:18:34.486255 kernel: raid6: avx512x1 gen() 17567 MB/s
Mar 12 04:18:34.503244 kernel: raid6: avx2x4 gen() 17052 MB/s
Mar 12 04:18:34.520258 kernel: raid6: avx2x2 gen() 20529 MB/s
Mar 12 04:18:34.537284 kernel: raid6: avx2x1 gen() 17753 MB/s
Mar 12 04:18:34.537411 kernel: raid6: using algorithm avx2x2 gen() 20529 MB/s
Mar 12 04:18:34.555324 kernel: raid6: .... xor() 14844 MB/s, rmw enabled
Mar 12 04:18:34.555457 kernel: raid6: using avx512x2 recovery algorithm
Mar 12 04:18:34.579255 kernel: xor: automatically using best checksumming function avx
Mar 12 04:18:34.752383 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 12 04:18:34.775073 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 12 04:18:34.783402 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 04:18:34.825225 systemd-udevd[422]: Using default interface naming scheme 'v255'.
Mar 12 04:18:34.830769 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 04:18:34.841371 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 12 04:18:34.869676 dracut-pre-trigger[431]: rd.md=0: removing MD RAID activation
Mar 12 04:18:34.904741 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 12 04:18:34.910320 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 04:18:34.991551 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 04:18:34.999721 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 12 04:18:35.031283 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 12 04:18:35.033325 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 12 04:18:35.035016 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 12 04:18:35.035848 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 12 04:18:35.041323 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 12 04:18:35.062126 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 12 04:18:35.093263 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Mar 12 04:18:35.093471 kernel: cryptd: max_cpu_qlen set to 1000
Mar 12 04:18:35.102865 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Mar 12 04:18:35.112327 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 12 04:18:35.112452 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 04:18:35.113035 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 12 04:18:35.113453 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 12 04:18:35.113571 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 04:18:35.113985 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 04:18:35.126200 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 12 04:18:35.126329 kernel: GPT:17805311 != 125829119
Mar 12 04:18:35.126369 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 12 04:18:35.126392 kernel: GPT:17805311 != 125829119
Mar 12 04:18:35.126414 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 12 04:18:35.126436 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 04:18:35.129068 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 04:18:35.133429 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 12 04:18:35.133468 kernel: AES CTR mode by8 optimization enabled
Mar 12 04:18:35.150131 kernel: ACPI: bus type USB registered
Mar 12 04:18:35.150201 kernel: usbcore: registered new interface driver usbfs
Mar 12 04:18:35.150228 kernel: usbcore: registered new interface driver hub
Mar 12 04:18:35.153180 kernel: usbcore: registered new device driver usb
Mar 12 04:18:35.191194 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Mar 12 04:18:35.191499 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1
Mar 12 04:18:35.191684 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 12 04:18:35.200185 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Mar 12 04:18:35.200470 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2
Mar 12 04:18:35.200610 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed
Mar 12 04:18:35.205923 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (472)
Mar 12 04:18:35.205996 kernel: hub 1-0:1.0: USB hub found
Mar 12 04:18:35.207428 kernel: hub 1-0:1.0: 4 ports detected
Mar 12 04:18:35.208174 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 12 04:18:35.212181 kernel: hub 2-0:1.0: USB hub found
Mar 12 04:18:35.212429 kernel: hub 2-0:1.0: 4 ports detected
Mar 12 04:18:35.217949 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 12 04:18:35.285887 kernel: BTRFS: device fsid 94537345-7f6b-4b2a-965f-248bd6f0b7eb devid 1 transid 33 /dev/vda3 scanned by (udev-worker) (468)
Mar 12 04:18:35.285932 kernel: libata version 3.00 loaded.
Mar 12 04:18:35.285946 kernel: ahci 0000:00:1f.2: version 3.0
Mar 12 04:18:35.286241 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 12 04:18:35.286260 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Mar 12 04:18:35.286395 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 12 04:18:35.286538 kernel: scsi host0: ahci
Mar 12 04:18:35.286703 kernel: scsi host1: ahci
Mar 12 04:18:35.286836 kernel: scsi host2: ahci
Mar 12 04:18:35.286966 kernel: scsi host3: ahci
Mar 12 04:18:35.287101 kernel: scsi host4: ahci
Mar 12 04:18:35.288218 kernel: scsi host5: ahci
Mar 12 04:18:35.288377 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41
Mar 12 04:18:35.288393 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41
Mar 12 04:18:35.288408 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41
Mar 12 04:18:35.288422 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41
Mar 12 04:18:35.288436 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41
Mar 12 04:18:35.288450 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41
Mar 12 04:18:35.293536 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 04:18:35.300533 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 12 04:18:35.301036 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 12 04:18:35.306788 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 12 04:18:35.311474 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 12 04:18:35.318284 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 12 04:18:35.320307 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 12 04:18:35.327661 disk-uuid[568]: Primary Header is updated.
Mar 12 04:18:35.327661 disk-uuid[568]: Secondary Entries is updated.
Mar 12 04:18:35.327661 disk-uuid[568]: Secondary Header is updated.
Mar 12 04:18:35.332668 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 04:18:35.339179 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 04:18:35.347993 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 12 04:18:35.453235 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 12 04:18:35.560706 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 12 04:18:35.560900 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Mar 12 04:18:35.564887 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 12 04:18:35.568665 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 12 04:18:35.568773 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 12 04:18:35.573888 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 12 04:18:35.593201 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 12 04:18:35.598190 kernel: usbcore: registered new interface driver usbhid
Mar 12 04:18:35.598250 kernel: usbhid: USB HID core driver
Mar 12 04:18:35.604377 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2
Mar 12 04:18:35.604456 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0
Mar 12 04:18:36.358756 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 12 04:18:36.362792 disk-uuid[570]: The operation has completed successfully.
Mar 12 04:18:36.410477 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 12 04:18:36.410596 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 12 04:18:36.424340 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 12 04:18:36.452952 sh[589]: Success
Mar 12 04:18:36.480204 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Mar 12 04:18:36.531820 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 12 04:18:36.546524 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 12 04:18:36.548606 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 12 04:18:36.590270 kernel: BTRFS info (device dm-0): first mount of filesystem 94537345-7f6b-4b2a-965f-248bd6f0b7eb
Mar 12 04:18:36.590404 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 12 04:18:36.590443 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 12 04:18:36.594830 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 12 04:18:36.594913 kernel: BTRFS info (device dm-0): using free space tree
Mar 12 04:18:36.603278 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 12 04:18:36.604533 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 12 04:18:36.610423 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 12 04:18:36.615525 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 12 04:18:36.624849 kernel: BTRFS info (device vda6): first mount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203
Mar 12 04:18:36.624906 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 12 04:18:36.624922 kernel: BTRFS info (device vda6): using free space tree
Mar 12 04:18:36.632259 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 12 04:18:36.644973 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 12 04:18:36.645589 kernel: BTRFS info (device vda6): last unmount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203
Mar 12 04:18:36.652980 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 12 04:18:36.659314 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 12 04:18:36.767505 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 12 04:18:36.769911 ignition[675]: Ignition 2.19.0
Mar 12 04:18:36.770599 ignition[675]: Stage: fetch-offline
Mar 12 04:18:36.771013 ignition[675]: no configs at "/usr/lib/ignition/base.d"
Mar 12 04:18:36.771025 ignition[675]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 12 04:18:36.771154 ignition[675]: parsed url from cmdline: ""
Mar 12 04:18:36.771172 ignition[675]: no config URL provided
Mar 12 04:18:36.771178 ignition[675]: reading system config file "/usr/lib/ignition/user.ign"
Mar 12 04:18:36.771187 ignition[675]: no config at "/usr/lib/ignition/user.ign"
Mar 12 04:18:36.771192 ignition[675]: failed to fetch config: resource requires networking
Mar 12 04:18:36.771386 ignition[675]: Ignition finished successfully
Mar 12 04:18:36.776361 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 04:18:36.777555 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 12 04:18:36.799772 systemd-networkd[774]: lo: Link UP
Mar 12 04:18:36.799783 systemd-networkd[774]: lo: Gained carrier
Mar 12 04:18:36.801194 systemd-networkd[774]: Enumeration completed
Mar 12 04:18:36.801588 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 04:18:36.801592 systemd-networkd[774]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 04:18:36.802493 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 04:18:36.802892 systemd-networkd[774]: eth0: Link UP Mar 12 04:18:36.802897 systemd-networkd[774]: eth0: Gained carrier Mar 12 04:18:36.802904 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 12 04:18:36.803220 systemd[1]: Reached target network.target - Network. Mar 12 04:18:36.807316 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 12 04:18:36.815073 systemd-networkd[774]: eth0: DHCPv4 address 10.244.101.2/30, gateway 10.244.101.1 acquired from 10.244.101.1 Mar 12 04:18:36.828451 ignition[777]: Ignition 2.19.0 Mar 12 04:18:36.830050 ignition[777]: Stage: fetch Mar 12 04:18:36.830297 ignition[777]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:18:36.830310 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:18:36.830416 ignition[777]: parsed url from cmdline: "" Mar 12 04:18:36.830420 ignition[777]: no config URL provided Mar 12 04:18:36.830425 ignition[777]: reading system config file "/usr/lib/ignition/user.ign" Mar 12 04:18:36.830439 ignition[777]: no config at "/usr/lib/ignition/user.ign" Mar 12 04:18:36.830624 ignition[777]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 12 04:18:36.831180 ignition[777]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 12 04:18:36.831203 ignition[777]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Mar 12 04:18:36.846747 ignition[777]: GET result: OK Mar 12 04:18:36.847502 ignition[777]: parsing config with SHA512: 89a6dd4b3cbf7372ed578cbfeac798c6b0cf66831522a470374b7b3501e2eec12d1833ab4347b1fbea16b5e0b383192d0ec95f3779483280cb4c865785ca189a Mar 12 04:18:36.854978 unknown[777]: fetched base config from "system" Mar 12 04:18:36.854998 unknown[777]: fetched base config from "system" Mar 12 04:18:36.855008 unknown[777]: fetched user config from "openstack" Mar 12 04:18:36.856085 ignition[777]: fetch: fetch complete Mar 12 04:18:36.856095 ignition[777]: fetch: fetch passed Mar 12 04:18:36.856195 ignition[777]: Ignition finished successfully Mar 12 04:18:36.858255 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 12 04:18:36.865345 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 12 04:18:36.893079 ignition[784]: Ignition 2.19.0 Mar 12 04:18:36.893092 ignition[784]: Stage: kargs Mar 12 04:18:36.893283 ignition[784]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:18:36.893294 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:18:36.894229 ignition[784]: kargs: kargs passed Mar 12 04:18:36.894276 ignition[784]: Ignition finished successfully Mar 12 04:18:36.896800 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 12 04:18:36.904378 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 12 04:18:36.920980 ignition[790]: Ignition 2.19.0 Mar 12 04:18:36.921011 ignition[790]: Stage: disks Mar 12 04:18:36.921303 ignition[790]: no configs at "/usr/lib/ignition/base.d" Mar 12 04:18:36.921317 ignition[790]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:18:36.924839 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 12 04:18:36.922800 ignition[790]: disks: disks passed Mar 12 04:18:36.925951 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
Mar 12 04:18:36.922864 ignition[790]: Ignition finished successfully Mar 12 04:18:36.926485 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 12 04:18:36.927388 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 12 04:18:36.928085 systemd[1]: Reached target sysinit.target - System Initialization. Mar 12 04:18:36.928951 systemd[1]: Reached target basic.target - Basic System. Mar 12 04:18:36.942717 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 12 04:18:36.959602 systemd-fsck[798]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 12 04:18:36.964529 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 12 04:18:36.976407 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 12 04:18:37.088195 kernel: EXT4-fs (vda9): mounted filesystem f90926b1-4cc2-4a2d-8c45-4ec584c98779 r/w with ordered data mode. Quota mode: none. Mar 12 04:18:37.089824 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 12 04:18:37.092247 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 12 04:18:37.108467 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 12 04:18:37.112022 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 12 04:18:37.113401 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 12 04:18:37.119328 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 12 04:18:37.120438 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
Mar 12 04:18:37.129221 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (806) Mar 12 04:18:37.129250 kernel: BTRFS info (device vda6): first mount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:18:37.129271 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 12 04:18:37.129284 kernel: BTRFS info (device vda6): using free space tree Mar 12 04:18:37.120474 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 04:18:37.128934 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 12 04:18:37.132181 kernel: BTRFS info (device vda6): auto enabling async discard Mar 12 04:18:37.135306 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 12 04:18:37.138142 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 12 04:18:37.200904 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory Mar 12 04:18:37.207182 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory Mar 12 04:18:37.215897 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Mar 12 04:18:37.221057 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Mar 12 04:18:37.338915 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 12 04:18:37.343343 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 12 04:18:37.346318 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 12 04:18:37.357207 kernel: BTRFS info (device vda6): last unmount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:18:37.377094 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 12 04:18:37.394386 ignition[923]: INFO : Ignition 2.19.0 Mar 12 04:18:37.394386 ignition[923]: INFO : Stage: mount Mar 12 04:18:37.395518 ignition[923]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 04:18:37.395518 ignition[923]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:18:37.395518 ignition[923]: INFO : mount: mount passed Mar 12 04:18:37.398950 ignition[923]: INFO : Ignition finished successfully Mar 12 04:18:37.397588 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 12 04:18:37.585534 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 12 04:18:37.981997 systemd-networkd[774]: eth0: Gained IPv6LL Mar 12 04:18:40.582511 systemd-networkd[774]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:1940:24:19ff:fef4:6502/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:1940:24:19ff:fef4:6502/64 assigned by NDisc. Mar 12 04:18:40.582539 systemd-networkd[774]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 12 04:18:44.276571 coreos-metadata[808]: Mar 12 04:18:44.276 WARN failed to locate config-drive, using the metadata service API instead Mar 12 04:18:44.294668 coreos-metadata[808]: Mar 12 04:18:44.294 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 12 04:18:44.315953 coreos-metadata[808]: Mar 12 04:18:44.315 INFO Fetch successful Mar 12 04:18:44.317346 coreos-metadata[808]: Mar 12 04:18:44.317 INFO wrote hostname srv-tymtb.gb1.brightbox.com to /sysroot/etc/hostname Mar 12 04:18:44.322318 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 12 04:18:44.322522 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Mar 12 04:18:44.332281 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 12 04:18:44.345344 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Mar 12 04:18:44.354252 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (939) Mar 12 04:18:44.356861 kernel: BTRFS info (device vda6): first mount of filesystem 0ebf6eb2-dc55-4706-86d1-78d37843d203 Mar 12 04:18:44.356916 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 12 04:18:44.356944 kernel: BTRFS info (device vda6): using free space tree Mar 12 04:18:44.362244 kernel: BTRFS info (device vda6): auto enabling async discard Mar 12 04:18:44.362580 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 12 04:18:44.387184 ignition[957]: INFO : Ignition 2.19.0 Mar 12 04:18:44.387184 ignition[957]: INFO : Stage: files Mar 12 04:18:44.387184 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 04:18:44.387184 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:18:44.390339 ignition[957]: DEBUG : files: compiled without relabeling support, skipping Mar 12 04:18:44.391842 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 12 04:18:44.391842 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 12 04:18:44.395083 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 12 04:18:44.395738 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 12 04:18:44.396480 unknown[957]: wrote ssh authorized keys file for user: core Mar 12 04:18:44.397259 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 12 04:18:44.399335 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 12 04:18:44.399335 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Mar 12 
04:18:44.719724 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 12 04:18:45.104671 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 12 04:18:45.104671 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 12 04:18:45.108950 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1 Mar 12 04:18:45.498814 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 12 04:18:47.769881 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 12 04:18:47.769881 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 12 04:18:47.778105 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 04:18:47.778105 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 12 04:18:47.778105 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 12 04:18:47.778105 ignition[957]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 12 04:18:47.778105 ignition[957]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 12 04:18:47.778105 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 12 04:18:47.778105 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [finished] 
writing file "/sysroot/etc/.ignition-result.json" Mar 12 04:18:47.778105 ignition[957]: INFO : files: files passed Mar 12 04:18:47.778105 ignition[957]: INFO : Ignition finished successfully Mar 12 04:18:47.777791 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 12 04:18:47.787410 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 12 04:18:47.789109 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 12 04:18:47.807261 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 12 04:18:47.807387 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 12 04:18:47.815970 initrd-setup-root-after-ignition[985]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 04:18:47.815970 initrd-setup-root-after-ignition[985]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 12 04:18:47.817258 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 12 04:18:47.819349 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 04:18:47.819998 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 12 04:18:47.824320 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 12 04:18:47.871394 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 12 04:18:47.871556 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 12 04:18:47.875748 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 12 04:18:47.876883 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 12 04:18:47.877950 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. 
Mar 12 04:18:47.885613 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 12 04:18:47.907648 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 04:18:47.920447 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 12 04:18:47.935693 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 12 04:18:47.936640 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 04:18:47.937878 systemd[1]: Stopped target timers.target - Timer Units. Mar 12 04:18:47.938860 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 12 04:18:47.939043 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 12 04:18:47.940122 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 12 04:18:47.940840 systemd[1]: Stopped target basic.target - Basic System. Mar 12 04:18:47.941734 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 12 04:18:47.942496 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 12 04:18:47.943282 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 12 04:18:47.944138 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 12 04:18:47.944978 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 12 04:18:47.945833 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 12 04:18:47.946612 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 12 04:18:47.947429 systemd[1]: Stopped target swap.target - Swaps. Mar 12 04:18:47.948147 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 12 04:18:47.948275 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 12 04:18:47.949147 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. 
Mar 12 04:18:47.949673 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 04:18:47.950402 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 12 04:18:47.950614 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 04:18:47.951277 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 12 04:18:47.951385 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 12 04:18:47.952445 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 12 04:18:47.952553 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 12 04:18:47.953467 systemd[1]: ignition-files.service: Deactivated successfully. Mar 12 04:18:47.953577 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 12 04:18:47.960324 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 12 04:18:47.962329 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 12 04:18:47.964228 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 12 04:18:47.968629 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 12 04:18:47.976896 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 12 04:18:47.977034 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 12 04:18:47.984082 ignition[1010]: INFO : Ignition 2.19.0 Mar 12 04:18:47.984082 ignition[1010]: INFO : Stage: umount Mar 12 04:18:47.984082 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 12 04:18:47.984082 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 12 04:18:47.984082 ignition[1010]: INFO : umount: umount passed Mar 12 04:18:47.984082 ignition[1010]: INFO : Ignition finished successfully Mar 12 04:18:47.986328 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Mar 12 04:18:47.986439 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 12 04:18:47.989631 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 12 04:18:47.990210 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 12 04:18:47.991710 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 12 04:18:47.991829 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 12 04:18:47.992735 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 12 04:18:47.992787 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 12 04:18:47.994306 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 12 04:18:47.994351 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 12 04:18:47.995049 systemd[1]: Stopped target network.target - Network. Mar 12 04:18:47.995823 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 12 04:18:47.995874 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 12 04:18:47.996614 systemd[1]: Stopped target paths.target - Path Units. Mar 12 04:18:47.998495 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 12 04:18:48.002203 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 04:18:48.003077 systemd[1]: Stopped target slices.target - Slice Units. Mar 12 04:18:48.004364 systemd[1]: Stopped target sockets.target - Socket Units. Mar 12 04:18:48.005930 systemd[1]: iscsid.socket: Deactivated successfully. Mar 12 04:18:48.005974 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 12 04:18:48.006652 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 12 04:18:48.006687 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 12 04:18:48.007796 systemd[1]: ignition-setup.service: Deactivated successfully. 
Mar 12 04:18:48.007844 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 12 04:18:48.008538 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 12 04:18:48.008584 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 12 04:18:48.009448 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 12 04:18:48.009931 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 12 04:18:48.012870 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 12 04:18:48.015282 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 12 04:18:48.015373 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 12 04:18:48.016232 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 12 04:18:48.016337 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 12 04:18:48.016640 systemd-networkd[774]: eth0: DHCPv6 lease lost Mar 12 04:18:48.018984 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 12 04:18:48.019332 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 12 04:18:48.020445 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 12 04:18:48.020534 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 12 04:18:48.023556 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 12 04:18:48.023605 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 12 04:18:48.036569 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 12 04:18:48.036973 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 12 04:18:48.037021 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 12 04:18:48.037490 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 12 04:18:48.037526 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. 
Mar 12 04:18:48.037960 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 12 04:18:48.037996 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 12 04:18:48.038763 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 12 04:18:48.038802 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 12 04:18:48.039693 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 12 04:18:48.050033 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 12 04:18:48.050153 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 12 04:18:48.051247 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 12 04:18:48.051384 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 12 04:18:48.052844 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 12 04:18:48.052922 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 12 04:18:48.053781 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 12 04:18:48.053813 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 04:18:48.054628 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 12 04:18:48.054671 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 12 04:18:48.055859 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 12 04:18:48.055904 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 12 04:18:48.056597 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 12 04:18:48.056637 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 12 04:18:48.065348 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Mar 12 04:18:48.065837 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 12 04:18:48.065886 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 12 04:18:48.070445 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 12 04:18:48.070564 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 12 04:18:48.073307 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 12 04:18:48.073379 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 12 04:18:48.074222 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 12 04:18:48.074288 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 12 04:18:48.075709 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 12 04:18:48.075869 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 12 04:18:48.077848 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 12 04:18:48.086569 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 12 04:18:48.095661 systemd[1]: Switching root. Mar 12 04:18:48.131098 systemd-journald[202]: Journal stopped Mar 12 04:18:49.163035 systemd-journald[202]: Received SIGTERM from PID 1 (systemd). 
Mar 12 04:18:49.163123 kernel: SELinux: policy capability network_peer_controls=1 Mar 12 04:18:49.163150 kernel: SELinux: policy capability open_perms=1 Mar 12 04:18:49.165202 kernel: SELinux: policy capability extended_socket_class=1 Mar 12 04:18:49.165226 kernel: SELinux: policy capability always_check_network=0 Mar 12 04:18:49.165240 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 12 04:18:49.165254 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 12 04:18:49.165267 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 12 04:18:49.165286 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 12 04:18:49.165304 systemd[1]: Successfully loaded SELinux policy in 41.575ms. Mar 12 04:18:49.165338 kernel: audit: type=1403 audit(1773289128.260:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 12 04:18:49.165365 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 14.830ms. Mar 12 04:18:49.165382 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 12 04:18:49.165396 systemd[1]: Detected virtualization kvm. Mar 12 04:18:49.165411 systemd[1]: Detected architecture x86-64. Mar 12 04:18:49.165424 systemd[1]: Detected first boot. Mar 12 04:18:49.165438 systemd[1]: Hostname set to . Mar 12 04:18:49.165453 systemd[1]: Initializing machine ID from VM UUID. Mar 12 04:18:49.165468 zram_generator::config[1053]: No configuration found. Mar 12 04:18:49.165489 systemd[1]: Populated /etc with preset unit settings. Mar 12 04:18:49.165506 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 12 04:18:49.165520 systemd[1]: Stopped initrd-switch-root.service - Switch Root. 
Mar 12 04:18:49.165534 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 12 04:18:49.165550 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 12 04:18:49.165564 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 12 04:18:49.165578 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 12 04:18:49.165593 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 12 04:18:49.165612 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 12 04:18:49.165627 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 12 04:18:49.165653 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 12 04:18:49.165668 systemd[1]: Created slice user.slice - User and Session Slice. Mar 12 04:18:49.165683 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 12 04:18:49.165697 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 12 04:18:49.165710 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 12 04:18:49.165725 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 12 04:18:49.165739 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 12 04:18:49.165757 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 12 04:18:49.165770 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 12 04:18:49.165784 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 12 04:18:49.165802 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. 
Mar 12 04:18:49.165824 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 12 04:18:49.165848 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 12 04:18:49.165869 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 12 04:18:49.165883 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 12 04:18:49.165901 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 12 04:18:49.165916 systemd[1]: Reached target slices.target - Slice Units. Mar 12 04:18:49.165930 systemd[1]: Reached target swap.target - Swaps. Mar 12 04:18:49.165951 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 12 04:18:49.165968 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 12 04:18:49.165987 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 12 04:18:49.166001 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 12 04:18:49.166016 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 12 04:18:49.166030 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 12 04:18:49.166044 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 12 04:18:49.166058 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 12 04:18:49.166072 systemd[1]: Mounting media.mount - External Media Directory... Mar 12 04:18:49.166086 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 12 04:18:49.166103 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 12 04:18:49.166117 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 12 04:18:49.166133 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Mar 12 04:18:49.166148 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 12 04:18:49.171558 systemd[1]: Reached target machines.target - Containers.
Mar 12 04:18:49.171588 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 12 04:18:49.171604 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 04:18:49.171619 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 12 04:18:49.171634 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 12 04:18:49.171665 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 04:18:49.171679 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 04:18:49.171693 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 04:18:49.171708 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 12 04:18:49.171721 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 04:18:49.171735 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 12 04:18:49.171750 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 12 04:18:49.171764 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 12 04:18:49.171786 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 12 04:18:49.171801 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 12 04:18:49.171815 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 12 04:18:49.171829 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 12 04:18:49.171844 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 12 04:18:49.171858 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 12 04:18:49.171872 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 12 04:18:49.171886 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 12 04:18:49.171900 systemd[1]: Stopped verity-setup.service.
Mar 12 04:18:49.171920 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:49.171934 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 12 04:18:49.171948 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 12 04:18:49.171962 systemd[1]: Mounted media.mount - External Media Directory.
Mar 12 04:18:49.171977 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 12 04:18:49.171994 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 12 04:18:49.172008 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 12 04:18:49.172023 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 12 04:18:49.172037 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 12 04:18:49.172051 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 12 04:18:49.172069 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 04:18:49.172084 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 04:18:49.172105 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 04:18:49.172119 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 04:18:49.172134 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 12 04:18:49.172151 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 12 04:18:49.172176 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 12 04:18:49.172195 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 12 04:18:49.172216 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 12 04:18:49.172231 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 12 04:18:49.172245 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 12 04:18:49.172288 systemd-journald[1134]: Collecting audit messages is disabled.
Mar 12 04:18:49.172325 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 12 04:18:49.172339 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 12 04:18:49.172355 systemd-journald[1134]: Journal started
Mar 12 04:18:49.172386 systemd-journald[1134]: Runtime Journal (/run/log/journal/3728667685a24618a746dc180dcab054) is 4.7M, max 38.0M, 33.2M free.
Mar 12 04:18:48.872025 systemd[1]: Queued start job for default target multi-user.target.
Mar 12 04:18:48.893220 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 12 04:18:48.894047 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 12 04:18:49.173231 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 12 04:18:49.178911 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Mar 12 04:18:49.178964 kernel: loop: module loaded
Mar 12 04:18:49.186235 kernel: fuse: init (API version 7.39)
Mar 12 04:18:49.188227 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 12 04:18:49.198204 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 12 04:18:49.215524 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 04:18:49.215598 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 12 04:18:49.215630 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 04:18:49.225433 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 12 04:18:49.238783 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 12 04:18:49.238856 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 12 04:18:49.245656 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 12 04:18:49.245844 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 12 04:18:49.246924 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 04:18:49.249209 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 04:18:49.249924 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 12 04:18:49.267806 systemd-tmpfiles[1149]: ACLs are not supported, ignoring.
Mar 12 04:18:49.267827 systemd-tmpfiles[1149]: ACLs are not supported, ignoring.
Mar 12 04:18:49.286253 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 12 04:18:49.301352 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 12 04:18:49.301878 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 04:18:49.303903 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 12 04:18:49.305754 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 12 04:18:49.306567 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 12 04:18:49.316914 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 12 04:18:49.324350 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Mar 12 04:18:49.326568 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 12 04:18:49.333250 kernel: loop0: detected capacity change from 0 to 140768
Mar 12 04:18:49.354710 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 12 04:18:49.357858 systemd-journald[1134]: Time spent on flushing to /var/log/journal/3728667685a24618a746dc180dcab054 is 110.980ms for 1155 entries.
Mar 12 04:18:49.357858 systemd-journald[1134]: System Journal (/var/log/journal/3728667685a24618a746dc180dcab054) is 8.0M, max 584.8M, 576.8M free.
Mar 12 04:18:49.492671 systemd-journald[1134]: Received client request to flush runtime journal.
Mar 12 04:18:49.492734 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 12 04:18:49.492753 kernel: ACPI: bus type drm_connector registered
Mar 12 04:18:49.492943 kernel: loop1: detected capacity change from 0 to 8
Mar 12 04:18:49.368365 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 12 04:18:49.384622 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 12 04:18:49.393238 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Mar 12 04:18:49.417729 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 04:18:49.419263 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 04:18:49.467630 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 12 04:18:49.478329 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 12 04:18:49.497342 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 12 04:18:49.505204 kernel: loop2: detected capacity change from 0 to 142488
Mar 12 04:18:49.550301 systemd-tmpfiles[1205]: ACLs are not supported, ignoring.
Mar 12 04:18:49.550321 systemd-tmpfiles[1205]: ACLs are not supported, ignoring.
Mar 12 04:18:49.556610 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 12 04:18:49.568586 kernel: loop3: detected capacity change from 0 to 228704
Mar 12 04:18:49.569968 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 12 04:18:49.571885 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 12 04:18:49.612192 udevadm[1213]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 12 04:18:49.634184 kernel: loop4: detected capacity change from 0 to 140768
Mar 12 04:18:49.653491 kernel: loop5: detected capacity change from 0 to 8
Mar 12 04:18:49.656186 kernel: loop6: detected capacity change from 0 to 142488
Mar 12 04:18:49.685798 kernel: loop7: detected capacity change from 0 to 228704
Mar 12 04:18:49.702028 (sd-merge)[1216]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 12 04:18:49.702569 (sd-merge)[1216]: Merged extensions into '/usr'.
Mar 12 04:18:49.711836 systemd[1]: Reloading requested from client PID 1163 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 12 04:18:49.711892 systemd[1]: Reloading...
Mar 12 04:18:49.826520 ldconfig[1158]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 12 04:18:49.860224 zram_generator::config[1242]: No configuration found.
Mar 12 04:18:50.042723 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:18:50.092949 systemd[1]: Reloading finished in 380 ms.
Mar 12 04:18:50.130604 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 12 04:18:50.131763 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 12 04:18:50.143424 systemd[1]: Starting ensure-sysext.service...
Mar 12 04:18:50.147322 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 12 04:18:50.159306 systemd[1]: Reloading requested from client PID 1298 ('systemctl') (unit ensure-sysext.service)...
Mar 12 04:18:50.159321 systemd[1]: Reloading...
Mar 12 04:18:50.188368 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 12 04:18:50.188750 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 12 04:18:50.189691 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 12 04:18:50.189970 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Mar 12 04:18:50.190038 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Mar 12 04:18:50.196032 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 04:18:50.196045 systemd-tmpfiles[1299]: Skipping /boot
Mar 12 04:18:50.208891 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Mar 12 04:18:50.208905 systemd-tmpfiles[1299]: Skipping /boot
Mar 12 04:18:50.245188 zram_generator::config[1326]: No configuration found.
Mar 12 04:18:50.387914 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:18:50.438858 systemd[1]: Reloading finished in 279 ms.
Mar 12 04:18:50.463096 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 12 04:18:50.474700 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 12 04:18:50.485357 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 12 04:18:50.491445 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 12 04:18:50.501400 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 12 04:18:50.507507 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 12 04:18:50.512034 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 12 04:18:50.516636 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 12 04:18:50.522664 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:50.524452 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 04:18:50.528936 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 12 04:18:50.538433 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 12 04:18:50.546680 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 12 04:18:50.547838 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 04:18:50.547985 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:50.549712 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 12 04:18:50.560623 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 12 04:18:50.563414 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:50.563638 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 04:18:50.563809 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 04:18:50.571432 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 12 04:18:50.572254 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:50.579070 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:50.580151 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 12 04:18:50.583991 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 12 04:18:50.584884 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 12 04:18:50.585037 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 12 04:18:50.586282 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 12 04:18:50.589105 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 12 04:18:50.589292 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 12 04:18:50.590625 systemd[1]: Finished ensure-sysext.service.
Mar 12 04:18:50.593151 systemd-udevd[1395]: Using default interface naming scheme 'v255'.
Mar 12 04:18:50.606333 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 12 04:18:50.607101 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 12 04:18:50.607304 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 12 04:18:50.609126 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 12 04:18:50.612450 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 12 04:18:50.612946 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 12 04:18:50.614817 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 12 04:18:50.623301 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 12 04:18:50.634752 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 12 04:18:50.635046 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 12 04:18:50.642448 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 12 04:18:50.643597 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 12 04:18:50.650416 augenrules[1421]: No rules
Mar 12 04:18:50.651325 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 12 04:18:50.652964 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 12 04:18:50.661521 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 12 04:18:50.674723 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 12 04:18:50.798364 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 12 04:18:50.799036 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 12 04:18:50.799077 systemd[1]: Reached target time-set.target - System Time Set.
Mar 12 04:18:50.824115 systemd-networkd[1429]: lo: Link UP
Mar 12 04:18:50.825673 systemd-networkd[1429]: lo: Gained carrier
Mar 12 04:18:50.826619 systemd-networkd[1429]: Enumeration completed
Mar 12 04:18:50.827320 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 12 04:18:50.834346 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 12 04:18:50.843583 systemd-resolved[1394]: Positive Trust Anchors:
Mar 12 04:18:50.844282 systemd-resolved[1394]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 12 04:18:50.844402 systemd-resolved[1394]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 12 04:18:50.853793 systemd-resolved[1394]: Using system hostname 'srv-tymtb.gb1.brightbox.com'.
Mar 12 04:18:50.855466 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 12 04:18:50.856020 systemd[1]: Reached target network.target - Network.
Mar 12 04:18:50.856405 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 12 04:18:50.880182 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (1449)
Mar 12 04:18:50.918700 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 12 04:18:50.924188 kernel: ACPI: button: Power Button [PWRF]
Mar 12 04:18:50.945568 systemd-networkd[1429]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 04:18:50.945681 systemd-networkd[1429]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 12 04:18:50.947030 systemd-networkd[1429]: eth0: Link UP
Mar 12 04:18:50.947125 systemd-networkd[1429]: eth0: Gained carrier
Mar 12 04:18:50.947221 systemd-networkd[1429]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 12 04:18:50.954278 systemd-networkd[1429]: eth0: DHCPv4 address 10.244.101.2/30, gateway 10.244.101.1 acquired from 10.244.101.1
Mar 12 04:18:50.955090 systemd-timesyncd[1411]: Network configuration changed, trying to establish connection.
Mar 12 04:18:50.983559 kernel: mousedev: PS/2 mouse device common for all mice
Mar 12 04:18:50.992104 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 12 04:18:50.997355 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 12 04:18:51.011171 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Mar 12 04:18:51.019192 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 12 04:18:51.022697 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 12 04:18:51.022932 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 12 04:18:51.030174 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 12 04:18:51.075009 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 12 04:18:51.202603 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 12 04:18:51.215690 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 12 04:18:51.222462 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 12 04:18:51.252259 lvm[1472]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 12 04:18:51.278252 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 12 04:18:51.280444 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 12 04:18:51.281066 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 12 04:18:51.281728 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 12 04:18:51.282508 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 12 04:18:51.283352 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 12 04:18:51.284037 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 12 04:18:51.284679 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 12 04:18:51.285263 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 12 04:18:51.285304 systemd[1]: Reached target paths.target - Path Units.
Mar 12 04:18:51.285780 systemd[1]: Reached target timers.target - Timer Units.
Mar 12 04:18:51.287037 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 12 04:18:51.289569 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 12 04:18:51.301756 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 12 04:18:51.303585 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 12 04:18:51.304569 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 12 04:18:51.305051 systemd[1]: Reached target sockets.target - Socket Units.
Mar 12 04:18:51.305442 systemd[1]: Reached target basic.target - Basic System.
Mar 12 04:18:51.305870 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 12 04:18:51.305896 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 12 04:18:51.308275 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 12 04:18:51.311596 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 12 04:18:51.317339 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 12 04:18:51.320301 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 12 04:18:51.323239 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 12 04:18:51.324247 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 12 04:18:51.332219 lvm[1476]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 12 04:18:51.334210 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 12 04:18:51.336582 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 12 04:18:51.339361 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 12 04:18:51.342741 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 12 04:18:51.352353 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 12 04:18:51.353724 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 12 04:18:51.354296 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 12 04:18:51.363376 systemd[1]: Starting update-engine.service - Update Engine...
Mar 12 04:18:51.366404 jq[1480]: false
Mar 12 04:18:51.365302 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 12 04:18:51.375509 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 12 04:18:51.375720 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 12 04:18:51.396537 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 12 04:18:51.397232 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 12 04:18:51.411969 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 12 04:18:51.416610 dbus-daemon[1479]: [system] SELinux support is enabled
Mar 12 04:18:51.419469 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 12 04:18:51.421662 dbus-daemon[1479]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1429 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Mar 12 04:18:51.428517 systemd[1]: motdgen.service: Deactivated successfully.
Mar 12 04:18:51.429253 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 12 04:18:51.435552 dbus-daemon[1479]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 12 04:18:51.434659 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 12 04:18:51.434714 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 12 04:18:51.435250 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 12 04:18:51.435266 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 12 04:18:51.447347 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Mar 12 04:18:51.452821 extend-filesystems[1481]: Found loop4
Mar 12 04:18:51.452821 extend-filesystems[1481]: Found loop5
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found loop6
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found loop7
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda1
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda2
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda3
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found usr
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda4
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda6
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda7
Mar 12 04:18:51.457769 extend-filesystems[1481]: Found vda9
Mar 12 04:18:51.457769 extend-filesystems[1481]: Checking size of /dev/vda9
Mar 12 04:18:51.469679 jq[1490]: true
Mar 12 04:18:51.466001 (ntainerd)[1506]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 12 04:18:51.478213 update_engine[1488]: I20260312 04:18:51.476804 1488 main.cc:92] Flatcar Update Engine starting
Mar 12 04:18:51.482516 tar[1503]: linux-amd64/LICENSE
Mar 12 04:18:51.485790 tar[1503]: linux-amd64/helm
Mar 12 04:18:51.487087 systemd[1]: Started update-engine.service - Update Engine.
Mar 12 04:18:51.498557 update_engine[1488]: I20260312 04:18:51.498029 1488 update_check_scheduler.cc:74] Next update check in 5m27s
Mar 12 04:18:51.501374 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 12 04:18:51.530399 jq[1515]: true
Mar 12 04:18:51.532256 systemd-logind[1486]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 12 04:18:51.532281 systemd-logind[1486]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 12 04:18:51.534282 extend-filesystems[1481]: Resized partition /dev/vda9
Mar 12 04:18:51.532752 systemd-logind[1486]: New seat seat0.
Mar 12 04:18:51.548449 extend-filesystems[1520]: resize2fs 1.47.1 (20-May-2024) Mar 12 04:18:51.552603 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Mar 12 04:18:51.640991 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (1436) Mar 12 04:18:51.702677 systemd[1]: Started systemd-logind.service - User Login Management. Mar 12 04:18:51.726187 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Mar 12 04:18:51.752774 extend-filesystems[1520]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 12 04:18:51.752774 extend-filesystems[1520]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 12 04:18:51.752774 extend-filesystems[1520]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Mar 12 04:18:51.757743 extend-filesystems[1481]: Resized filesystem in /dev/vda9 Mar 12 04:18:51.753722 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 12 04:18:51.753963 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 12 04:18:51.773585 dbus-daemon[1479]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 12 04:18:51.773843 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 12 04:18:51.779262 dbus-daemon[1479]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1511 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 12 04:18:51.792529 systemd[1]: Starting polkit.service - Authorization Manager... Mar 12 04:18:51.825571 polkitd[1542]: Started polkitd version 121 Mar 12 04:18:51.834378 bash[1546]: Updated "/home/core/.ssh/authorized_keys" Mar 12 04:18:51.836193 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Mar 12 04:18:51.840981 polkitd[1542]: Loading rules from directory /etc/polkit-1/rules.d Mar 12 04:18:51.841043 polkitd[1542]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 12 04:18:51.842656 polkitd[1542]: Finished loading, compiling and executing 2 rules Mar 12 04:18:51.843301 dbus-daemon[1479]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 12 04:18:51.844237 polkitd[1542]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 12 04:18:51.846421 systemd[1]: Starting sshkeys.service... Mar 12 04:18:51.847498 systemd[1]: Started polkit.service - Authorization Manager. Mar 12 04:18:51.870413 systemd-hostnamed[1511]: Hostname set to (static) Mar 12 04:18:51.877552 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 12 04:18:51.885526 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 12 04:18:51.921055 locksmithd[1516]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 12 04:18:51.933061 containerd[1506]: time="2026-03-12T04:18:51.932957949Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 12 04:18:52.017145 containerd[1506]: time="2026-03-12T04:18:52.016055520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:18:52.020370 sshd_keygen[1508]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 12 04:18:52.023042 containerd[1506]: time="2026-03-12T04:18:52.023001056Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:18:52.023042 containerd[1506]: time="2026-03-12T04:18:52.023038610Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 12 04:18:52.023133 containerd[1506]: time="2026-03-12T04:18:52.023057304Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 12 04:18:52.024041 containerd[1506]: time="2026-03-12T04:18:52.024001445Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 12 04:18:52.024041 containerd[1506]: time="2026-03-12T04:18:52.024040556Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 12 04:18:52.025189 containerd[1506]: time="2026-03-12T04:18:52.024510975Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:18:52.025189 containerd[1506]: time="2026-03-12T04:18:52.024532669Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:18:52.025189 containerd[1506]: time="2026-03-12T04:18:52.025060441Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:18:52.025189 containerd[1506]: time="2026-03-12T04:18:52.025079157Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
type=io.containerd.snapshotter.v1 Mar 12 04:18:52.025189 containerd[1506]: time="2026-03-12T04:18:52.025102582Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:18:52.025189 containerd[1506]: time="2026-03-12T04:18:52.025114365Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 12 04:18:52.026207 containerd[1506]: time="2026-03-12T04:18:52.026132365Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:18:52.026491 containerd[1506]: time="2026-03-12T04:18:52.026459947Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 12 04:18:52.026621 containerd[1506]: time="2026-03-12T04:18:52.026602895Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 12 04:18:52.026651 containerd[1506]: time="2026-03-12T04:18:52.026622142Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 12 04:18:52.027206 containerd[1506]: time="2026-03-12T04:18:52.027185914Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 12 04:18:52.027278 containerd[1506]: time="2026-03-12T04:18:52.027264835Z" level=info msg="metadata content store policy set" policy=shared Mar 12 04:18:52.029395 containerd[1506]: time="2026-03-12T04:18:52.029367973Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 12 04:18:52.029520 containerd[1506]: time="2026-03-12T04:18:52.029423177Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." 
type=io.containerd.differ.v1 Mar 12 04:18:52.029520 containerd[1506]: time="2026-03-12T04:18:52.029447165Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 12 04:18:52.029520 containerd[1506]: time="2026-03-12T04:18:52.029462594Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 12 04:18:52.029520 containerd[1506]: time="2026-03-12T04:18:52.029477869Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 12 04:18:52.029614 containerd[1506]: time="2026-03-12T04:18:52.029603375Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 12 04:18:52.029938 containerd[1506]: time="2026-03-12T04:18:52.029848817Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 12 04:18:52.029988 containerd[1506]: time="2026-03-12T04:18:52.029967007Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 12 04:18:52.029988 containerd[1506]: time="2026-03-12T04:18:52.029984586Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 12 04:18:52.030040 containerd[1506]: time="2026-03-12T04:18:52.030001913Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 12 04:18:52.030040 containerd[1506]: time="2026-03-12T04:18:52.030016139Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030040 containerd[1506]: time="2026-03-12T04:18:52.030029266Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." 
type=io.containerd.service.v1 Mar 12 04:18:52.030106 containerd[1506]: time="2026-03-12T04:18:52.030042586Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030106 containerd[1506]: time="2026-03-12T04:18:52.030056790Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030106 containerd[1506]: time="2026-03-12T04:18:52.030070931Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030106 containerd[1506]: time="2026-03-12T04:18:52.030083706Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030106 containerd[1506]: time="2026-03-12T04:18:52.030096672Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030108386Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030131834Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030145327Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030180574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030197174Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." 
type=io.containerd.grpc.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030217139Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030232 containerd[1506]: time="2026-03-12T04:18:52.030230606Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030242916Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030258757Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030271235Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030286245Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030297400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030317802Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030332068Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030346500Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030366356Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." 
type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030378200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030398 containerd[1506]: time="2026-03-12T04:18:52.030390540Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030454071Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030471481Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030484031Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030496315Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030506225Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030517340Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030531868Z" level=info msg="NRI interface is disabled by configuration." Mar 12 04:18:52.030650 containerd[1506]: time="2026-03-12T04:18:52.030542674Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 12 04:18:52.033194 containerd[1506]: time="2026-03-12T04:18:52.030876653Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 12 04:18:52.033194 containerd[1506]: time="2026-03-12T04:18:52.030937522Z" level=info msg="Connect containerd service" Mar 12 04:18:52.033194 containerd[1506]: time="2026-03-12T04:18:52.030972571Z" level=info msg="using legacy CRI server" Mar 12 04:18:52.033194 containerd[1506]: time="2026-03-12T04:18:52.030980377Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 12 04:18:52.033194 containerd[1506]: time="2026-03-12T04:18:52.031100345Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 12 04:18:52.035543 containerd[1506]: time="2026-03-12T04:18:52.035517835Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 12 04:18:52.036286 containerd[1506]: time="2026-03-12T04:18:52.035706852Z" level=info msg="Start subscribing containerd event" Mar 12 04:18:52.036286 containerd[1506]: time="2026-03-12T04:18:52.035781842Z" level=info msg="Start recovering state" Mar 12 04:18:52.036286 containerd[1506]: time="2026-03-12T04:18:52.035852564Z" level=info msg="Start event monitor" Mar 12 04:18:52.036286 containerd[1506]: time="2026-03-12T04:18:52.035872098Z" level=info msg="Start 
snapshots syncer" Mar 12 04:18:52.036286 containerd[1506]: time="2026-03-12T04:18:52.035881106Z" level=info msg="Start cni network conf syncer for default" Mar 12 04:18:52.036286 containerd[1506]: time="2026-03-12T04:18:52.035889832Z" level=info msg="Start streaming server" Mar 12 04:18:52.036642 containerd[1506]: time="2026-03-12T04:18:52.036618299Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 12 04:18:52.036691 containerd[1506]: time="2026-03-12T04:18:52.036677167Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 12 04:18:52.037690 containerd[1506]: time="2026-03-12T04:18:52.036755943Z" level=info msg="containerd successfully booted in 0.106695s" Mar 12 04:18:52.036838 systemd[1]: Started containerd.service - containerd container runtime. Mar 12 04:18:52.056907 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 12 04:18:52.066696 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 12 04:18:52.073747 systemd[1]: issuegen.service: Deactivated successfully. Mar 12 04:18:52.074432 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 12 04:18:52.084085 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 12 04:18:52.095000 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 12 04:18:52.104089 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 12 04:18:52.113511 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 12 04:18:52.114195 systemd[1]: Reached target getty.target - Login Prompts. Mar 12 04:18:52.369850 tar[1503]: linux-amd64/README.md Mar 12 04:18:52.385686 systemd-networkd[1429]: eth0: Gained IPv6LL Mar 12 04:18:52.388009 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 12 04:18:52.390471 systemd-timesyncd[1411]: Network configuration changed, trying to establish connection. 
Mar 12 04:18:52.394228 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 12 04:18:52.396003 systemd[1]: Reached target network-online.target - Network is Online. Mar 12 04:18:52.403431 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 12 04:18:52.407307 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 12 04:18:52.452400 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 12 04:18:53.280388 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 12 04:18:53.282139 (kubelet)[1603]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 12 04:18:53.895389 systemd-timesyncd[1411]: Network configuration changed, trying to establish connection. Mar 12 04:18:53.897419 systemd-networkd[1429]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:1940:24:19ff:fef4:6502/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:1940:24:19ff:fef4:6502/64 assigned by NDisc. Mar 12 04:18:53.897427 systemd-networkd[1429]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 12 04:18:53.898797 kubelet[1603]: E0312 04:18:53.898765 1603 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 12 04:18:53.902876 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 12 04:18:53.903085 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 12 04:18:53.903731 systemd[1]: kubelet.service: Consumed 1.212s CPU time. 
Mar 12 04:18:55.518465 systemd-timesyncd[1411]: Network configuration changed, trying to establish connection. Mar 12 04:18:57.169935 login[1581]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Mar 12 04:18:57.170392 login[1580]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 12 04:18:57.183527 systemd-logind[1486]: New session 1 of user core. Mar 12 04:18:57.185900 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 12 04:18:57.192061 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 12 04:18:57.219021 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 12 04:18:57.226634 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 12 04:18:57.255502 (systemd)[1619]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 12 04:18:57.372443 systemd[1619]: Queued start job for default target default.target. Mar 12 04:18:57.384006 systemd[1619]: Created slice app.slice - User Application Slice. Mar 12 04:18:57.384045 systemd[1619]: Reached target paths.target - Paths. Mar 12 04:18:57.384070 systemd[1619]: Reached target timers.target - Timers. Mar 12 04:18:57.385567 systemd[1619]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 12 04:18:57.410335 systemd[1619]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 12 04:18:57.410471 systemd[1619]: Reached target sockets.target - Sockets. Mar 12 04:18:57.410487 systemd[1619]: Reached target basic.target - Basic System. Mar 12 04:18:57.410527 systemd[1619]: Reached target default.target - Main User Target. Mar 12 04:18:57.410569 systemd[1619]: Startup finished in 141ms. Mar 12 04:18:57.411020 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 12 04:18:57.421916 systemd[1]: Started session-1.scope - Session 1 of User core. 
Mar 12 04:18:57.855550 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 12 04:18:57.863783 systemd[1]: Started sshd@0-10.244.101.2:22-20.161.92.111:45580.service - OpenSSH per-connection server daemon (20.161.92.111:45580). Mar 12 04:18:58.176448 login[1581]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 12 04:18:58.190246 systemd-logind[1486]: New session 2 of user core. Mar 12 04:18:58.208357 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 12 04:18:58.457339 sshd[1640]: Accepted publickey for core from 20.161.92.111 port 45580 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:18:58.459589 sshd[1640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:18:58.471225 systemd-logind[1486]: New session 3 of user core. Mar 12 04:18:58.477399 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 12 04:18:58.480529 coreos-metadata[1478]: Mar 12 04:18:58.480 WARN failed to locate config-drive, using the metadata service API instead Mar 12 04:18:58.500293 coreos-metadata[1478]: Mar 12 04:18:58.500 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 12 04:18:58.506267 coreos-metadata[1478]: Mar 12 04:18:58.506 INFO Fetch failed with 404: resource not found Mar 12 04:18:58.506267 coreos-metadata[1478]: Mar 12 04:18:58.506 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 12 04:18:58.506933 coreos-metadata[1478]: Mar 12 04:18:58.506 INFO Fetch successful Mar 12 04:18:58.507018 coreos-metadata[1478]: Mar 12 04:18:58.506 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 12 04:18:58.527249 coreos-metadata[1478]: Mar 12 04:18:58.527 INFO Fetch successful Mar 12 04:18:58.527522 coreos-metadata[1478]: Mar 12 04:18:58.527 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 12 04:18:58.544310 coreos-metadata[1478]: 
Mar 12 04:18:58.544 INFO Fetch successful Mar 12 04:18:58.544558 coreos-metadata[1478]: Mar 12 04:18:58.544 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 12 04:18:58.559825 coreos-metadata[1478]: Mar 12 04:18:58.559 INFO Fetch successful Mar 12 04:18:58.560099 coreos-metadata[1478]: Mar 12 04:18:58.559 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 12 04:18:58.580699 coreos-metadata[1478]: Mar 12 04:18:58.580 INFO Fetch successful Mar 12 04:18:58.615830 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 12 04:18:58.619405 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 12 04:18:58.947606 systemd[1]: Started sshd@1-10.244.101.2:22-20.161.92.111:45588.service - OpenSSH per-connection server daemon (20.161.92.111:45588). Mar 12 04:18:58.994278 coreos-metadata[1560]: Mar 12 04:18:58.994 WARN failed to locate config-drive, using the metadata service API instead Mar 12 04:18:59.010902 coreos-metadata[1560]: Mar 12 04:18:59.010 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 12 04:18:59.030826 coreos-metadata[1560]: Mar 12 04:18:59.030 INFO Fetch successful Mar 12 04:18:59.031055 coreos-metadata[1560]: Mar 12 04:18:59.030 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 12 04:18:59.070989 coreos-metadata[1560]: Mar 12 04:18:59.070 INFO Fetch successful Mar 12 04:18:59.075465 unknown[1560]: wrote ssh authorized keys file for user: core Mar 12 04:18:59.114462 update-ssh-keys[1664]: Updated "/home/core/.ssh/authorized_keys" Mar 12 04:18:59.117123 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 12 04:18:59.120986 systemd[1]: Finished sshkeys.service. Mar 12 04:18:59.122529 systemd[1]: Reached target multi-user.target - Multi-User System. 
Mar 12 04:18:59.122780 systemd[1]: Startup finished in 1.099s (kernel) + 14.545s (initrd) + 10.903s (userspace) = 26.548s. Mar 12 04:18:59.524477 sshd[1660]: Accepted publickey for core from 20.161.92.111 port 45588 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:18:59.528301 sshd[1660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:18:59.538972 systemd-logind[1486]: New session 4 of user core. Mar 12 04:18:59.551425 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 12 04:18:59.933998 sshd[1660]: pam_unix(sshd:session): session closed for user core Mar 12 04:18:59.939456 systemd[1]: sshd@1-10.244.101.2:22-20.161.92.111:45588.service: Deactivated successfully. Mar 12 04:18:59.942242 systemd[1]: session-4.scope: Deactivated successfully. Mar 12 04:18:59.944700 systemd-logind[1486]: Session 4 logged out. Waiting for processes to exit. Mar 12 04:18:59.946087 systemd-logind[1486]: Removed session 4. Mar 12 04:19:00.047801 systemd[1]: Started sshd@2-10.244.101.2:22-20.161.92.111:38206.service - OpenSSH per-connection server daemon (20.161.92.111:38206). Mar 12 04:19:00.613229 sshd[1673]: Accepted publickey for core from 20.161.92.111 port 38206 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:19:00.616885 sshd[1673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:19:00.628863 systemd-logind[1486]: New session 5 of user core. Mar 12 04:19:00.640664 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 12 04:19:01.004672 sshd[1673]: pam_unix(sshd:session): session closed for user core Mar 12 04:19:01.010921 systemd[1]: sshd@2-10.244.101.2:22-20.161.92.111:38206.service: Deactivated successfully. Mar 12 04:19:01.010960 systemd-logind[1486]: Session 5 logged out. Waiting for processes to exit. Mar 12 04:19:01.013493 systemd[1]: session-5.scope: Deactivated successfully. 
Mar 12 04:19:01.015316 systemd-logind[1486]: Removed session 5. Mar 12 04:19:01.117698 systemd[1]: Started sshd@3-10.244.101.2:22-20.161.92.111:38220.service - OpenSSH per-connection server daemon (20.161.92.111:38220). Mar 12 04:19:01.677201 sshd[1680]: Accepted publickey for core from 20.161.92.111 port 38220 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:19:01.681480 sshd[1680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:19:01.691810 systemd-logind[1486]: New session 6 of user core. Mar 12 04:19:01.704632 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 12 04:19:02.080797 sshd[1680]: pam_unix(sshd:session): session closed for user core Mar 12 04:19:02.088792 systemd[1]: sshd@3-10.244.101.2:22-20.161.92.111:38220.service: Deactivated successfully. Mar 12 04:19:02.093912 systemd[1]: session-6.scope: Deactivated successfully. Mar 12 04:19:02.096635 systemd-logind[1486]: Session 6 logged out. Waiting for processes to exit. Mar 12 04:19:02.099598 systemd-logind[1486]: Removed session 6. Mar 12 04:19:02.193783 systemd[1]: Started sshd@4-10.244.101.2:22-20.161.92.111:38222.service - OpenSSH per-connection server daemon (20.161.92.111:38222). Mar 12 04:19:02.750182 sshd[1687]: Accepted publickey for core from 20.161.92.111 port 38222 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:19:02.752783 sshd[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:19:02.758499 systemd-logind[1486]: New session 7 of user core. Mar 12 04:19:02.765769 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 12 04:19:03.068570 sudo[1690]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 12 04:19:03.068890 sudo[1690]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:19:03.082800 sudo[1690]: pam_unix(sudo:session): session closed for user root
Mar 12 04:19:03.173138 sshd[1687]: pam_unix(sshd:session): session closed for user core
Mar 12 04:19:03.179340 systemd-logind[1486]: Session 7 logged out. Waiting for processes to exit.
Mar 12 04:19:03.181927 systemd[1]: sshd@4-10.244.101.2:22-20.161.92.111:38222.service: Deactivated successfully.
Mar 12 04:19:03.184381 systemd[1]: session-7.scope: Deactivated successfully.
Mar 12 04:19:03.186383 systemd-logind[1486]: Removed session 7.
Mar 12 04:19:03.281515 systemd[1]: Started sshd@5-10.244.101.2:22-20.161.92.111:38230.service - OpenSSH per-connection server daemon (20.161.92.111:38230).
Mar 12 04:19:03.853213 sshd[1695]: Accepted publickey for core from 20.161.92.111 port 38230 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:19:03.856414 sshd[1695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:19:03.865430 systemd-logind[1486]: New session 8 of user core.
Mar 12 04:19:03.878418 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 12 04:19:04.112495 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 12 04:19:04.121894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:04.164037 sudo[1702]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 12 04:19:04.164410 sudo[1702]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:19:04.171698 sudo[1702]: pam_unix(sudo:session): session closed for user root
Mar 12 04:19:04.178928 sudo[1701]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 12 04:19:04.179347 sudo[1701]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:19:04.200083 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 12 04:19:04.203359 auditctl[1705]: No rules
Mar 12 04:19:04.204436 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 12 04:19:04.204663 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 12 04:19:04.213582 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 12 04:19:04.244827 augenrules[1723]: No rules
Mar 12 04:19:04.247018 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 12 04:19:04.248022 sudo[1701]: pam_unix(sudo:session): session closed for user root
Mar 12 04:19:04.281940 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:04.287591 (kubelet)[1733]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 04:19:04.338517 sshd[1695]: pam_unix(sshd:session): session closed for user core
Mar 12 04:19:04.345186 kubelet[1733]: E0312 04:19:04.344968 1733 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 04:19:04.345724 systemd[1]: sshd@5-10.244.101.2:22-20.161.92.111:38230.service: Deactivated successfully.
Mar 12 04:19:04.349028 systemd[1]: session-8.scope: Deactivated successfully.
Mar 12 04:19:04.351205 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 04:19:04.351354 systemd-logind[1486]: Session 8 logged out. Waiting for processes to exit.
Mar 12 04:19:04.351923 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 04:19:04.354418 systemd-logind[1486]: Removed session 8.
Mar 12 04:19:04.447497 systemd[1]: Started sshd@6-10.244.101.2:22-20.161.92.111:38236.service - OpenSSH per-connection server daemon (20.161.92.111:38236).
Mar 12 04:19:05.006984 sshd[1743]: Accepted publickey for core from 20.161.92.111 port 38236 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:19:05.008569 sshd[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:19:05.019927 systemd-logind[1486]: New session 9 of user core.
Mar 12 04:19:05.026536 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 12 04:19:05.315503 sudo[1746]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 12 04:19:05.315914 sudo[1746]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 12 04:19:05.757520 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 12 04:19:05.757640 (dockerd)[1762]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 12 04:19:06.167615 dockerd[1762]: time="2026-03-12T04:19:06.167540182Z" level=info msg="Starting up"
Mar 12 04:19:06.297560 dockerd[1762]: time="2026-03-12T04:19:06.297501643Z" level=info msg="Loading containers: start."
Mar 12 04:19:06.422350 kernel: Initializing XFRM netlink socket
Mar 12 04:19:06.456342 systemd-timesyncd[1411]: Network configuration changed, trying to establish connection.
Mar 12 04:19:06.522241 systemd-networkd[1429]: docker0: Link UP
Mar 12 04:19:06.540227 dockerd[1762]: time="2026-03-12T04:19:06.539775208Z" level=info msg="Loading containers: done."
Mar 12 04:19:06.559518 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3164004050-merged.mount: Deactivated successfully.
Mar 12 04:19:06.562222 dockerd[1762]: time="2026-03-12T04:19:06.561579852Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 12 04:19:06.562222 dockerd[1762]: time="2026-03-12T04:19:06.561717773Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 12 04:19:06.562222 dockerd[1762]: time="2026-03-12T04:19:06.561869615Z" level=info msg="Daemon has completed initialization"
Mar 12 04:19:06.592220 dockerd[1762]: time="2026-03-12T04:19:06.591879647Z" level=info msg="API listen on /run/docker.sock"
Mar 12 04:19:06.592249 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 12 04:19:07.222896 containerd[1506]: time="2026-03-12T04:19:07.222819021Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\""
Mar 12 04:19:07.962061 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3247489158.mount: Deactivated successfully.
Mar 12 04:19:09.052473 systemd-resolved[1394]: Clock change detected. Flushing caches.
Mar 12 04:19:09.052819 systemd-timesyncd[1411]: Contacted time server [2a01:7e00::f03c:93ff:fe0e:8e7f]:123 (2.flatcar.pool.ntp.org).
Mar 12 04:19:09.052924 systemd-timesyncd[1411]: Initial clock synchronization to Thu 2026-03-12 04:19:09.050821 UTC.
Mar 12 04:19:10.466203 containerd[1506]: time="2026-03-12T04:19:10.466098825Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:10.468018 containerd[1506]: time="2026-03-12T04:19:10.467928935Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=30116194"
Mar 12 04:19:10.468574 containerd[1506]: time="2026-03-12T04:19:10.468508868Z" level=info msg="ImageCreate event name:\"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:10.471864 containerd[1506]: time="2026-03-12T04:19:10.471701016Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:10.473299 containerd[1506]: time="2026-03-12T04:19:10.473105848Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"30112785\" in 2.363514189s"
Mar 12 04:19:10.473299 containerd[1506]: time="2026-03-12T04:19:10.473155154Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\""
Mar 12 04:19:10.475213 containerd[1506]: time="2026-03-12T04:19:10.475180100Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\""
Mar 12 04:19:12.452942 containerd[1506]: time="2026-03-12T04:19:12.452744721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:12.454263 containerd[1506]: time="2026-03-12T04:19:12.454219136Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=26021818"
Mar 12 04:19:12.454934 containerd[1506]: time="2026-03-12T04:19:12.454905810Z" level=info msg="ImageCreate event name:\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:12.459792 containerd[1506]: time="2026-03-12T04:19:12.459745695Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:12.463362 containerd[1506]: time="2026-03-12T04:19:12.463313826Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"27678758\" in 1.988092785s"
Mar 12 04:19:12.464351 containerd[1506]: time="2026-03-12T04:19:12.463476801Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\""
Mar 12 04:19:12.465126 containerd[1506]: time="2026-03-12T04:19:12.465067731Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 12 04:19:14.281177 containerd[1506]: time="2026-03-12T04:19:14.281039915Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:14.282872 containerd[1506]: time="2026-03-12T04:19:14.282656770Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=20162754"
Mar 12 04:19:14.283457 containerd[1506]: time="2026-03-12T04:19:14.283412033Z" level=info msg="ImageCreate event name:\"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:14.287883 containerd[1506]: time="2026-03-12T04:19:14.287163644Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:14.288978 containerd[1506]: time="2026-03-12T04:19:14.288314069Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"21819712\" in 1.823085069s"
Mar 12 04:19:14.288978 containerd[1506]: time="2026-03-12T04:19:14.288374732Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\""
Mar 12 04:19:14.290131 containerd[1506]: time="2026-03-12T04:19:14.290105514Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 12 04:19:15.248806 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 12 04:19:15.262087 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:15.408427 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:15.419284 (kubelet)[1983]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 04:19:15.478111 kubelet[1983]: E0312 04:19:15.478045 1983 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 04:19:15.482590 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 04:19:15.482745 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 04:19:15.750004 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3500821486.mount: Deactivated successfully.
Mar 12 04:19:16.211302 containerd[1506]: time="2026-03-12T04:19:16.210743857Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:16.212137 containerd[1506]: time="2026-03-12T04:19:16.211974616Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=31828655"
Mar 12 04:19:16.212564 containerd[1506]: time="2026-03-12T04:19:16.212484161Z" level=info msg="ImageCreate event name:\"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:16.215048 containerd[1506]: time="2026-03-12T04:19:16.214970365Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:16.217056 containerd[1506]: time="2026-03-12T04:19:16.215670429Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"31827666\" in 1.925530844s"
Mar 12 04:19:16.217056 containerd[1506]: time="2026-03-12T04:19:16.215705258Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\""
Mar 12 04:19:16.217259 containerd[1506]: time="2026-03-12T04:19:16.217142386Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 12 04:19:16.782197 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1151579245.mount: Deactivated successfully.
Mar 12 04:19:18.595699 containerd[1506]: time="2026-03-12T04:19:18.594905798Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:18.596867 containerd[1506]: time="2026-03-12T04:19:18.596219756Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942246"
Mar 12 04:19:18.597731 containerd[1506]: time="2026-03-12T04:19:18.597692074Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:18.601354 containerd[1506]: time="2026-03-12T04:19:18.601249723Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:18.603729 containerd[1506]: time="2026-03-12T04:19:18.603046943Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.385867184s"
Mar 12 04:19:18.603729 containerd[1506]: time="2026-03-12T04:19:18.603135001Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
Mar 12 04:19:18.604686 containerd[1506]: time="2026-03-12T04:19:18.604456948Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 12 04:19:19.458098 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2694862305.mount: Deactivated successfully.
Mar 12 04:19:19.461523 containerd[1506]: time="2026-03-12T04:19:19.461478159Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:19.462406 containerd[1506]: time="2026-03-12T04:19:19.462323282Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:19.462406 containerd[1506]: time="2026-03-12T04:19:19.462366661Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Mar 12 04:19:19.464295 containerd[1506]: time="2026-03-12T04:19:19.464253724Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:19.465599 containerd[1506]: time="2026-03-12T04:19:19.465094020Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 860.605317ms"
Mar 12 04:19:19.465599 containerd[1506]: time="2026-03-12T04:19:19.465125592Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 12 04:19:19.465803 containerd[1506]: time="2026-03-12T04:19:19.465633484Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 12 04:19:20.045903 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4102114189.mount: Deactivated successfully.
Mar 12 04:19:23.517947 containerd[1506]: time="2026-03-12T04:19:23.517639691Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:23.520476 containerd[1506]: time="2026-03-12T04:19:23.519898605Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23718848"
Mar 12 04:19:23.530060 containerd[1506]: time="2026-03-12T04:19:23.524057235Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:23.534262 containerd[1506]: time="2026-03-12T04:19:23.534147905Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:23.536654 containerd[1506]: time="2026-03-12T04:19:23.536399227Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 4.070734295s"
Mar 12 04:19:23.536654 containerd[1506]: time="2026-03-12T04:19:23.536447662Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\""
Mar 12 04:19:24.828898 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Mar 12 04:19:25.499609 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 12 04:19:25.512954 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:25.654423 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:25.669383 (kubelet)[2147]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 12 04:19:25.728470 kubelet[2147]: E0312 04:19:25.728403 2147 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 12 04:19:25.732551 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 12 04:19:25.732877 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 12 04:19:27.760533 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:27.769487 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:27.806618 systemd[1]: Reloading requested from client PID 2161 ('systemctl') (unit session-9.scope)...
Mar 12 04:19:27.806804 systemd[1]: Reloading...
Mar 12 04:19:27.934906 zram_generator::config[2200]: No configuration found.
Mar 12 04:19:28.095861 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:19:28.174086 systemd[1]: Reloading finished in 366 ms.
Mar 12 04:19:28.232356 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 12 04:19:28.232440 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 12 04:19:28.232903 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:28.237188 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:28.383172 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:28.396420 (kubelet)[2268]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 04:19:28.435609 kubelet[2268]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 04:19:28.435609 kubelet[2268]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 04:19:28.435609 kubelet[2268]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 04:19:28.436544 kubelet[2268]: I0312 04:19:28.436070 2268 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 04:19:29.042768 kubelet[2268]: I0312 04:19:29.042681 2268 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 12 04:19:29.042768 kubelet[2268]: I0312 04:19:29.042733 2268 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 04:19:29.043287 kubelet[2268]: I0312 04:19:29.043068 2268 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 04:19:29.083874 kubelet[2268]: I0312 04:19:29.083439 2268 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 04:19:29.084864 kubelet[2268]: E0312 04:19:29.084552 2268 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.244.101.2:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 12 04:19:29.093085 kubelet[2268]: E0312 04:19:29.093047 2268 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 12 04:19:29.093085 kubelet[2268]: I0312 04:19:29.093078 2268 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Mar 12 04:19:29.097782 kubelet[2268]: I0312 04:19:29.097750 2268 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 12 04:19:29.099054 kubelet[2268]: I0312 04:19:29.098997 2268 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 04:19:29.100603 kubelet[2268]: I0312 04:19:29.099038 2268 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-tymtb.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 04:19:29.100603 kubelet[2268]: I0312 04:19:29.100600 2268 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 04:19:29.100603 kubelet[2268]: I0312 04:19:29.100614 2268 container_manager_linux.go:303] "Creating device plugin manager"
Mar 12 04:19:29.101041 kubelet[2268]: I0312 04:19:29.100783 2268 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 04:19:29.105553 kubelet[2268]: I0312 04:19:29.105527 2268 kubelet.go:480] "Attempting to sync node with API server"
Mar 12 04:19:29.105642 kubelet[2268]: I0312 04:19:29.105565 2268 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 04:19:29.105642 kubelet[2268]: I0312 04:19:29.105604 2268 kubelet.go:386] "Adding apiserver pod source"
Mar 12 04:19:29.105642 kubelet[2268]: I0312 04:19:29.105638 2268 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 04:19:29.116613 kubelet[2268]: I0312 04:19:29.115880 2268 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 12 04:19:29.116613 kubelet[2268]: E0312 04:19:29.116370 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.244.101.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-tymtb.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 04:19:29.116613 kubelet[2268]: I0312 04:19:29.116485 2268 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 04:19:29.116613 kubelet[2268]: E0312 04:19:29.116500 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.244.101.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 04:19:29.117963 kubelet[2268]: W0312 04:19:29.117339 2268 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 12 04:19:29.124801 kubelet[2268]: I0312 04:19:29.123427 2268 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 12 04:19:29.124801 kubelet[2268]: I0312 04:19:29.123476 2268 server.go:1289] "Started kubelet"
Mar 12 04:19:29.124976 kubelet[2268]: I0312 04:19:29.124964 2268 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 04:19:29.132325 kubelet[2268]: E0312 04:19:29.128027 2268 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.101.2:6443/api/v1/namespaces/default/events\": dial tcp 10.244.101.2:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-tymtb.gb1.brightbox.com.189bfd1aa1a682cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-tymtb.gb1.brightbox.com,UID:srv-tymtb.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-tymtb.gb1.brightbox.com,},FirstTimestamp:2026-03-12 04:19:29.123443403 +0000 UTC m=+0.722409993,LastTimestamp:2026-03-12 04:19:29.123443403 +0000 UTC m=+0.722409993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-tymtb.gb1.brightbox.com,}"
Mar 12 04:19:29.132325 kubelet[2268]: I0312 04:19:29.131287 2268 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 04:19:29.134137 kubelet[2268]: I0312 04:19:29.134095 2268 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 04:19:29.139332 kubelet[2268]: I0312 04:19:29.139300 2268 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 04:19:29.140480 kubelet[2268]: E0312 04:19:29.139905 2268 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-tymtb.gb1.brightbox.com\" not found"
Mar 12 04:19:29.146909 kubelet[2268]: I0312 04:19:29.146829 2268 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 04:19:29.147128 kubelet[2268]: I0312 04:19:29.147108 2268 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 04:19:29.148882 kubelet[2268]: I0312 04:19:29.147757 2268 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 04:19:29.148882 kubelet[2268]: I0312 04:19:29.148072 2268 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 04:19:29.148882 kubelet[2268]: I0312 04:19:29.148242 2268 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 04:19:29.154975 kubelet[2268]: E0312 04:19:29.154949 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.244.101.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 04:19:29.155071 kubelet[2268]: E0312 04:19:29.155049 2268 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.101.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-tymtb.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.101.2:6443: connect: connection refused" interval="200ms"
Mar 12 04:19:29.158441 kubelet[2268]: I0312 04:19:29.158422 2268 factory.go:223] Registration of the containerd container factory successfully
Mar 12 04:19:29.158660 kubelet[2268]: I0312 04:19:29.158649 2268 factory.go:223] Registration of the systemd container factory successfully
Mar 12 04:19:29.158828 kubelet[2268]: I0312 04:19:29.158812 2268 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 04:19:29.170779 kubelet[2268]: I0312 04:19:29.170545 2268 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 12 04:19:29.171855 kubelet[2268]: I0312 04:19:29.171799 2268 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 12 04:19:29.171855 kubelet[2268]: I0312 04:19:29.171828 2268 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 12 04:19:29.171967 kubelet[2268]: I0312 04:19:29.171875 2268 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 04:19:29.171967 kubelet[2268]: I0312 04:19:29.171886 2268 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 12 04:19:29.171967 kubelet[2268]: E0312 04:19:29.171941 2268 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 04:19:29.180898 kubelet[2268]: E0312 04:19:29.180875 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.244.101.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 04:19:29.193200 kubelet[2268]: I0312 04:19:29.193161 2268 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 04:19:29.193200 kubelet[2268]: I0312 04:19:29.193193 2268 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 04:19:29.193341 kubelet[2268]: I0312 04:19:29.193232 2268 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 04:19:29.194524 kubelet[2268]: I0312 04:19:29.194497 2268 policy_none.go:49] "None policy: Start"
Mar 12
04:19:29.194589 kubelet[2268]: I0312 04:19:29.194549 2268 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 12 04:19:29.194589 kubelet[2268]: I0312 04:19:29.194571 2268 state_mem.go:35] "Initializing new in-memory state store" Mar 12 04:19:29.201073 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 12 04:19:29.215526 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 12 04:19:29.218727 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 12 04:19:29.227675 kubelet[2268]: E0312 04:19:29.226594 2268 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 12 04:19:29.227675 kubelet[2268]: I0312 04:19:29.226786 2268 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 04:19:29.227675 kubelet[2268]: I0312 04:19:29.226802 2268 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 04:19:29.227675 kubelet[2268]: I0312 04:19:29.227563 2268 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 04:19:29.229201 kubelet[2268]: E0312 04:19:29.229184 2268 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 12 04:19:29.229611 kubelet[2268]: E0312 04:19:29.229596 2268 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-tymtb.gb1.brightbox.com\" not found" Mar 12 04:19:29.290423 systemd[1]: Created slice kubepods-burstable-pod445c2011d5ea8da3a1f95cb62f2c2c5d.slice - libcontainer container kubepods-burstable-pod445c2011d5ea8da3a1f95cb62f2c2c5d.slice. 
Mar 12 04:19:29.302519 kubelet[2268]: E0312 04:19:29.301775 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.306773 systemd[1]: Created slice kubepods-burstable-podaba5c5b17422c20f24a3d470e1d12e8a.slice - libcontainer container kubepods-burstable-podaba5c5b17422c20f24a3d470e1d12e8a.slice.
Mar 12 04:19:29.308878 kubelet[2268]: E0312 04:19:29.308857 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.311286 systemd[1]: Created slice kubepods-burstable-pod2c9913156d6dc33340480de136335dcb.slice - libcontainer container kubepods-burstable-pod2c9913156d6dc33340480de136335dcb.slice.
Mar 12 04:19:29.313273 kubelet[2268]: E0312 04:19:29.313103 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.330602 kubelet[2268]: I0312 04:19:29.330033 2268 kubelet_node_status.go:75] "Attempting to register node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.331101 kubelet[2268]: E0312 04:19:29.331050 2268 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.101.2:6443/api/v1/nodes\": dial tcp 10.244.101.2:6443: connect: connection refused" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349423 kubelet[2268]: I0312 04:19:29.349368 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/445c2011d5ea8da3a1f95cb62f2c2c5d-ca-certs\") pod \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" (UID: \"445c2011d5ea8da3a1f95cb62f2c2c5d\") " pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349423 kubelet[2268]: I0312 04:19:29.349419 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/445c2011d5ea8da3a1f95cb62f2c2c5d-k8s-certs\") pod \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" (UID: \"445c2011d5ea8da3a1f95cb62f2c2c5d\") " pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349639 kubelet[2268]: I0312 04:19:29.349458 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/445c2011d5ea8da3a1f95cb62f2c2c5d-usr-share-ca-certificates\") pod \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" (UID: \"445c2011d5ea8da3a1f95cb62f2c2c5d\") " pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349639 kubelet[2268]: I0312 04:19:29.349485 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-ca-certs\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349639 kubelet[2268]: I0312 04:19:29.349510 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-flexvolume-dir\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349639 kubelet[2268]: I0312 04:19:29.349537 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-k8s-certs\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349639 kubelet[2268]: I0312 04:19:29.349559 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-kubeconfig\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349883 kubelet[2268]: I0312 04:19:29.349580 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2c9913156d6dc33340480de136335dcb-kubeconfig\") pod \"kube-scheduler-srv-tymtb.gb1.brightbox.com\" (UID: \"2c9913156d6dc33340480de136335dcb\") " pod="kube-system/kube-scheduler-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.349883 kubelet[2268]: I0312 04:19:29.349601 2268 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.356130 kubelet[2268]: E0312 04:19:29.356073 2268 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.101.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-tymtb.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.101.2:6443: connect: connection refused" interval="400ms"
Mar 12 04:19:29.537371 kubelet[2268]: I0312 04:19:29.537295 2268 kubelet_node_status.go:75] "Attempting to register node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.538464 kubelet[2268]: E0312 04:19:29.538292 2268 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.101.2:6443/api/v1/nodes\": dial tcp 10.244.101.2:6443: connect: connection refused" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.604978 containerd[1506]: time="2026-03-12T04:19:29.604726535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-tymtb.gb1.brightbox.com,Uid:445c2011d5ea8da3a1f95cb62f2c2c5d,Namespace:kube-system,Attempt:0,}"
Mar 12 04:19:29.617058 containerd[1506]: time="2026-03-12T04:19:29.616762512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-tymtb.gb1.brightbox.com,Uid:aba5c5b17422c20f24a3d470e1d12e8a,Namespace:kube-system,Attempt:0,}"
Mar 12 04:19:29.617058 containerd[1506]: time="2026-03-12T04:19:29.616949338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-tymtb.gb1.brightbox.com,Uid:2c9913156d6dc33340480de136335dcb,Namespace:kube-system,Attempt:0,}"
Mar 12 04:19:29.757438 kubelet[2268]: E0312 04:19:29.757360 2268 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.101.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-tymtb.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.101.2:6443: connect: connection refused" interval="800ms"
Mar 12 04:19:29.943133 kubelet[2268]: I0312 04:19:29.942902 2268 kubelet_node_status.go:75] "Attempting to register node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.943767 kubelet[2268]: E0312 04:19:29.943394 2268 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.101.2:6443/api/v1/nodes\": dial tcp 10.244.101.2:6443: connect: connection refused" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:29.944327 kubelet[2268]: E0312 04:19:29.944290 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.244.101.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 12 04:19:29.975887 kubelet[2268]: E0312 04:19:29.975759 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.244.101.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 04:19:30.009041 kubelet[2268]: E0312 04:19:30.008950 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.244.101.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 12 04:19:30.153730 kubelet[2268]: E0312 04:19:30.153615 2268 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.244.101.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-tymtb.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.101.2:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 12 04:19:30.155782 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount202686307.mount: Deactivated successfully.
Mar 12 04:19:30.160515 containerd[1506]: time="2026-03-12T04:19:30.160417224Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 04:19:30.161329 containerd[1506]: time="2026-03-12T04:19:30.161208243Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 12 04:19:30.161568 containerd[1506]: time="2026-03-12T04:19:30.161530580Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 04:19:30.162270 containerd[1506]: time="2026-03-12T04:19:30.162190652Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Mar 12 04:19:30.162816 containerd[1506]: time="2026-03-12T04:19:30.162772648Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 04:19:30.163777 containerd[1506]: time="2026-03-12T04:19:30.163702789Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 12 04:19:30.165804 containerd[1506]: time="2026-03-12T04:19:30.165757984Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 04:19:30.168861 containerd[1506]: time="2026-03-12T04:19:30.168069945Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 563.105212ms"
Mar 12 04:19:30.168861 containerd[1506]: time="2026-03-12T04:19:30.168802807Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 12 04:19:30.171487 containerd[1506]: time="2026-03-12T04:19:30.171443726Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 554.412911ms"
Mar 12 04:19:30.172283 containerd[1506]: time="2026-03-12T04:19:30.172260919Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 555.4169ms"
Mar 12 04:19:30.342107 containerd[1506]: time="2026-03-12T04:19:30.341922605Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:30.342374 containerd[1506]: time="2026-03-12T04:19:30.342155106Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:30.342374 containerd[1506]: time="2026-03-12T04:19:30.342183818Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:30.342374 containerd[1506]: time="2026-03-12T04:19:30.342271576Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:30.345129 containerd[1506]: time="2026-03-12T04:19:30.344863446Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:30.345129 containerd[1506]: time="2026-03-12T04:19:30.344922070Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:30.345129 containerd[1506]: time="2026-03-12T04:19:30.344939210Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:30.345129 containerd[1506]: time="2026-03-12T04:19:30.345013134Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:30.350892 containerd[1506]: time="2026-03-12T04:19:30.350562200Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:30.350892 containerd[1506]: time="2026-03-12T04:19:30.350605879Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:30.350892 containerd[1506]: time="2026-03-12T04:19:30.350623077Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:30.350892 containerd[1506]: time="2026-03-12T04:19:30.350688513Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:30.383059 systemd[1]: Started cri-containerd-095aa2b4f1a2ff3848d12062bac840f428eb6b8c292c6503d6b0c93b48cfc6e8.scope - libcontainer container 095aa2b4f1a2ff3848d12062bac840f428eb6b8c292c6503d6b0c93b48cfc6e8.
Mar 12 04:19:30.395018 systemd[1]: Started cri-containerd-83680a600174805c9d4377e578da88fa3430a8cf1bf39ea22ec33051bcf2d7b8.scope - libcontainer container 83680a600174805c9d4377e578da88fa3430a8cf1bf39ea22ec33051bcf2d7b8.
Mar 12 04:19:30.396883 systemd[1]: Started cri-containerd-8621b9a456b72491e2c10f93125a4404a702ab16a75b6e587e43087c0fc5ea23.scope - libcontainer container 8621b9a456b72491e2c10f93125a4404a702ab16a75b6e587e43087c0fc5ea23.
Mar 12 04:19:30.488164 containerd[1506]: time="2026-03-12T04:19:30.488092226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-tymtb.gb1.brightbox.com,Uid:2c9913156d6dc33340480de136335dcb,Namespace:kube-system,Attempt:0,} returns sandbox id \"8621b9a456b72491e2c10f93125a4404a702ab16a75b6e587e43087c0fc5ea23\""
Mar 12 04:19:30.491876 containerd[1506]: time="2026-03-12T04:19:30.490548914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-tymtb.gb1.brightbox.com,Uid:aba5c5b17422c20f24a3d470e1d12e8a,Namespace:kube-system,Attempt:0,} returns sandbox id \"83680a600174805c9d4377e578da88fa3430a8cf1bf39ea22ec33051bcf2d7b8\""
Mar 12 04:19:30.495977 containerd[1506]: time="2026-03-12T04:19:30.495941440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-tymtb.gb1.brightbox.com,Uid:445c2011d5ea8da3a1f95cb62f2c2c5d,Namespace:kube-system,Attempt:0,} returns sandbox id \"095aa2b4f1a2ff3848d12062bac840f428eb6b8c292c6503d6b0c93b48cfc6e8\""
Mar 12 04:19:30.498367 containerd[1506]: time="2026-03-12T04:19:30.498327813Z" level=info msg="CreateContainer within sandbox \"83680a600174805c9d4377e578da88fa3430a8cf1bf39ea22ec33051bcf2d7b8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 12 04:19:30.499108 containerd[1506]: time="2026-03-12T04:19:30.498934224Z" level=info msg="CreateContainer within sandbox \"8621b9a456b72491e2c10f93125a4404a702ab16a75b6e587e43087c0fc5ea23\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 12 04:19:30.518413 containerd[1506]: time="2026-03-12T04:19:30.518369828Z" level=info msg="CreateContainer within sandbox \"095aa2b4f1a2ff3848d12062bac840f428eb6b8c292c6503d6b0c93b48cfc6e8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 12 04:19:30.523822 containerd[1506]: time="2026-03-12T04:19:30.523515578Z" level=info msg="CreateContainer within sandbox \"83680a600174805c9d4377e578da88fa3430a8cf1bf39ea22ec33051bcf2d7b8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"391c9a2dccab907870a93230634df5a91dd8281999c4b6b619bbc5bb2d225b73\""
Mar 12 04:19:30.524929 containerd[1506]: time="2026-03-12T04:19:30.524601662Z" level=info msg="StartContainer for \"391c9a2dccab907870a93230634df5a91dd8281999c4b6b619bbc5bb2d225b73\""
Mar 12 04:19:30.536534 containerd[1506]: time="2026-03-12T04:19:30.536299276Z" level=info msg="CreateContainer within sandbox \"095aa2b4f1a2ff3848d12062bac840f428eb6b8c292c6503d6b0c93b48cfc6e8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6157972d9b3b84e760d9ad1ff69f0c5fe756920632a3651b20ce244197cd9e25\""
Mar 12 04:19:30.537288 containerd[1506]: time="2026-03-12T04:19:30.536885916Z" level=info msg="CreateContainer within sandbox \"8621b9a456b72491e2c10f93125a4404a702ab16a75b6e587e43087c0fc5ea23\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"a6d98bfcade139d2f5c928a765878fe3c3ccdc67e887dc492493ed9bec4ac43f\""
Mar 12 04:19:30.537471 containerd[1506]: time="2026-03-12T04:19:30.537450006Z" level=info msg="StartContainer for \"a6d98bfcade139d2f5c928a765878fe3c3ccdc67e887dc492493ed9bec4ac43f\""
Mar 12 04:19:30.539961 containerd[1506]: time="2026-03-12T04:19:30.537810238Z" level=info msg="StartContainer for \"6157972d9b3b84e760d9ad1ff69f0c5fe756920632a3651b20ce244197cd9e25\""
Mar 12 04:19:30.558489 kubelet[2268]: E0312 04:19:30.558451 2268 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.101.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-tymtb.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.101.2:6443: connect: connection refused" interval="1.6s"
Mar 12 04:19:30.577044 systemd[1]: Started cri-containerd-391c9a2dccab907870a93230634df5a91dd8281999c4b6b619bbc5bb2d225b73.scope - libcontainer container 391c9a2dccab907870a93230634df5a91dd8281999c4b6b619bbc5bb2d225b73.
Mar 12 04:19:30.603193 systemd[1]: Started cri-containerd-6157972d9b3b84e760d9ad1ff69f0c5fe756920632a3651b20ce244197cd9e25.scope - libcontainer container 6157972d9b3b84e760d9ad1ff69f0c5fe756920632a3651b20ce244197cd9e25.
Mar 12 04:19:30.606149 systemd[1]: Started cri-containerd-a6d98bfcade139d2f5c928a765878fe3c3ccdc67e887dc492493ed9bec4ac43f.scope - libcontainer container a6d98bfcade139d2f5c928a765878fe3c3ccdc67e887dc492493ed9bec4ac43f.
Mar 12 04:19:30.690884 containerd[1506]: time="2026-03-12T04:19:30.687066249Z" level=info msg="StartContainer for \"a6d98bfcade139d2f5c928a765878fe3c3ccdc67e887dc492493ed9bec4ac43f\" returns successfully"
Mar 12 04:19:30.690884 containerd[1506]: time="2026-03-12T04:19:30.687239571Z" level=info msg="StartContainer for \"391c9a2dccab907870a93230634df5a91dd8281999c4b6b619bbc5bb2d225b73\" returns successfully"
Mar 12 04:19:30.696983 containerd[1506]: time="2026-03-12T04:19:30.696107076Z" level=info msg="StartContainer for \"6157972d9b3b84e760d9ad1ff69f0c5fe756920632a3651b20ce244197cd9e25\" returns successfully"
Mar 12 04:19:30.748869 kubelet[2268]: I0312 04:19:30.747852 2268 kubelet_node_status.go:75] "Attempting to register node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:30.748869 kubelet[2268]: E0312 04:19:30.748206 2268 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.101.2:6443/api/v1/nodes\": dial tcp 10.244.101.2:6443: connect: connection refused" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:31.195647 kubelet[2268]: E0312 04:19:31.195606 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:31.201541 kubelet[2268]: E0312 04:19:31.201347 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:31.202746 kubelet[2268]: E0312 04:19:31.202722 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:32.206005 kubelet[2268]: E0312 04:19:32.205974 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:32.207398 kubelet[2268]: E0312 04:19:32.207208 2268 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:32.350929 kubelet[2268]: I0312 04:19:32.350902 2268 kubelet_node_status.go:75] "Attempting to register node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.045769 kubelet[2268]: E0312 04:19:33.045681 2268 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-tymtb.gb1.brightbox.com\" not found" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.113330 kubelet[2268]: I0312 04:19:33.112793 2268 apiserver.go:52] "Watching apiserver"
Mar 12 04:19:33.130808 kubelet[2268]: E0312 04:19:33.130493 2268 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{srv-tymtb.gb1.brightbox.com.189bfd1aa1a682cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-tymtb.gb1.brightbox.com,UID:srv-tymtb.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-tymtb.gb1.brightbox.com,},FirstTimestamp:2026-03-12 04:19:29.123443403 +0000 UTC m=+0.722409993,LastTimestamp:2026-03-12 04:19:29.123443403 +0000 UTC m=+0.722409993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-tymtb.gb1.brightbox.com,}"
Mar 12 04:19:33.149898 kubelet[2268]: I0312 04:19:33.149016 2268 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 12 04:19:33.189813 kubelet[2268]: I0312 04:19:33.189756 2268 kubelet_node_status.go:78] "Successfully registered node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.189813 kubelet[2268]: E0312 04:19:33.189794 2268 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"srv-tymtb.gb1.brightbox.com\": node \"srv-tymtb.gb1.brightbox.com\" not found"
Mar 12 04:19:33.247685 kubelet[2268]: I0312 04:19:33.247625 2268 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.263664 kubelet[2268]: E0312 04:19:33.263594 2268 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.263664 kubelet[2268]: I0312 04:19:33.263642 2268 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.269085 kubelet[2268]: E0312 04:19:33.269035 2268 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.269085 kubelet[2268]: I0312 04:19:33.269061 2268 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:33.273559 kubelet[2268]: E0312 04:19:33.273528 2268 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-tymtb.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:35.367169 systemd[1]: Reloading requested from client PID 2555 ('systemctl') (unit session-9.scope)...
Mar 12 04:19:35.367195 systemd[1]: Reloading...
Mar 12 04:19:35.468898 zram_generator::config[2594]: No configuration found.
Mar 12 04:19:35.629274 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 12 04:19:35.728108 systemd[1]: Reloading finished in 360 ms.
Mar 12 04:19:35.774242 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:35.797929 systemd[1]: kubelet.service: Deactivated successfully.
Mar 12 04:19:35.798897 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:35.799118 systemd[1]: kubelet.service: Consumed 1.246s CPU time, 127.3M memory peak, 0B memory swap peak.
Mar 12 04:19:35.814337 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 12 04:19:36.002195 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 12 04:19:36.013313 (kubelet)[2658]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 12 04:19:36.080260 kubelet[2658]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 04:19:36.080260 kubelet[2658]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 04:19:36.080260 kubelet[2658]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 04:19:36.080710 kubelet[2658]: I0312 04:19:36.080313 2658 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 04:19:36.089352 kubelet[2658]: I0312 04:19:36.088809 2658 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 12 04:19:36.089352 kubelet[2658]: I0312 04:19:36.088835 2658 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 04:19:36.089352 kubelet[2658]: I0312 04:19:36.089130 2658 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 12 04:19:36.091330 kubelet[2658]: I0312 04:19:36.091309 2658 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 12 04:19:36.099338 kubelet[2658]: I0312 04:19:36.098300 2658 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 12 04:19:36.107483 kubelet[2658]: E0312 04:19:36.107427 2658 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 12 04:19:36.107753 kubelet[2658]: I0312 04:19:36.107735 2658 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Mar 12 04:19:36.113762 kubelet[2658]: I0312 04:19:36.113737 2658 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 12 04:19:36.114072 kubelet[2658]: I0312 04:19:36.114046 2658 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 04:19:36.114236 kubelet[2658]: I0312 04:19:36.114074 2658 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-tymtb.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 04:19:36.114351 kubelet[2658]: I0312 04:19:36.114255 2658 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 04:19:36.114351 kubelet[2658]: I0312 04:19:36.114265 2658 container_manager_linux.go:303] "Creating device plugin manager"
Mar 12 04:19:36.114351 kubelet[2658]: I0312 04:19:36.114343 2658 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 04:19:36.114586 kubelet[2658]: I0312 04:19:36.114574 2658 kubelet.go:480] "Attempting to sync node with API server"
Mar 12 04:19:36.114618 kubelet[2658]: I0312 04:19:36.114594 2658 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 04:19:36.114654 kubelet[2658]: I0312 04:19:36.114631 2658 kubelet.go:386] "Adding apiserver pod source"
Mar 12 04:19:36.114684 kubelet[2658]: I0312 04:19:36.114654 2658 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 04:19:36.117351 kubelet[2658]: I0312 04:19:36.117329 2658 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 12 04:19:36.117874 kubelet[2658]: I0312 04:19:36.117833 2658 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 12 04:19:36.121197 kubelet[2658]: I0312 04:19:36.121178 2658 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 12 04:19:36.121282 kubelet[2658]: I0312 04:19:36.121233 2658 server.go:1289] "Started kubelet"
Mar 12 04:19:36.122550 kubelet[2658]: I0312 04:19:36.122516 2658 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 12 04:19:36.123027 kubelet[2658]: I0312 04:19:36.122980 2658 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 12 04:19:36.123444 kubelet[2658]: I0312 04:19:36.123430 2658 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 12 04:19:36.124154 kubelet[2658]: I0312 04:19:36.124140 2658 server.go:317] "Adding debug handlers to kubelet server"
Mar 12 04:19:36.127689 kubelet[2658]: I0312 04:19:36.126734 2658 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 04:19:36.141057 kubelet[2658]: I0312 04:19:36.140002 2658 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 12 04:19:36.143218 kubelet[2658]: I0312 04:19:36.143202 2658 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 04:19:36.144978 kubelet[2658]: E0312 04:19:36.144956 2658 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-tymtb.gb1.brightbox.com\" not found"
Mar 12 04:19:36.147015 kubelet[2658]: I0312 04:19:36.145987 2658 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 12 04:19:36.147339 kubelet[2658]: I0312 04:19:36.147326 2658 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 04:19:36.147508 kubelet[2658]: I0312 04:19:36.147499 2658 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 04:19:36.157196 kubelet[2658]: I0312 04:19:36.157164 2658 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 12 04:19:36.157196 kubelet[2658]: I0312 04:19:36.157196 2658 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 12 04:19:36.157342 kubelet[2658]: I0312 04:19:36.157222 2658 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 04:19:36.157342 kubelet[2658]: I0312 04:19:36.157236 2658 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 12 04:19:36.157342 kubelet[2658]: E0312 04:19:36.157277 2658 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 04:19:36.159062 kubelet[2658]: I0312 04:19:36.159043 2658 factory.go:223] Registration of the containerd container factory successfully
Mar 12 04:19:36.159165 kubelet[2658]: I0312 04:19:36.159158 2658 factory.go:223] Registration of the systemd container factory successfully
Mar 12 04:19:36.159290 kubelet[2658]: I0312 04:19:36.159275 2658 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 12 04:19:36.211760 kubelet[2658]: I0312 04:19:36.211728 2658 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 12 04:19:36.211760 kubelet[2658]: I0312 04:19:36.211748 2658 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 12 04:19:36.211760 kubelet[2658]: I0312 04:19:36.211774 2658 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 04:19:36.212086 kubelet[2658]: I0312 04:19:36.212072 2658 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 12 04:19:36.212121 kubelet[2658]: I0312 04:19:36.212086 2658 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 12 04:19:36.212121 kubelet[2658]: I0312 04:19:36.212115 2658 policy_none.go:49] "None policy: Start"
Mar 12 04:19:36.212188 kubelet[2658]: I0312 04:19:36.212134 2658 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 12 04:19:36.212188 kubelet[2658]: I0312 04:19:36.212145 2658 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 04:19:36.212245 kubelet[2658]: I0312 04:19:36.212240 2658 state_mem.go:75] "Updated machine memory state"
Mar 12 04:19:36.218187 kubelet[2658]: E0312 04:19:36.218148 2658 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 04:19:36.218393 kubelet[2658]: I0312 04:19:36.218335 2658 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 04:19:36.218393 kubelet[2658]: I0312 04:19:36.218352 2658 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 04:19:36.219521 kubelet[2658]: I0312 04:19:36.219344 2658 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 04:19:36.223722 kubelet[2658]: E0312 04:19:36.222416 2658 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 12 04:19:36.260083 kubelet[2658]: I0312 04:19:36.259081 2658 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.260083 kubelet[2658]: I0312 04:19:36.259545 2658 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.263728 kubelet[2658]: I0312 04:19:36.260544 2658 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.268736 kubelet[2658]: I0312 04:19:36.268693 2658 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 04:19:36.271103 kubelet[2658]: I0312 04:19:36.270921 2658 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 04:19:36.271954 kubelet[2658]: I0312 04:19:36.271921 2658 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 04:19:36.332227 kubelet[2658]: I0312 04:19:36.331787 2658 kubelet_node_status.go:75] "Attempting to register node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349090 kubelet[2658]: I0312 04:19:36.349036 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-k8s-certs\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349090 kubelet[2658]: I0312 04:19:36.349085 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-kubeconfig\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349265 kubelet[2658]: I0312 04:19:36.349104 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/445c2011d5ea8da3a1f95cb62f2c2c5d-ca-certs\") pod \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" (UID: \"445c2011d5ea8da3a1f95cb62f2c2c5d\") " pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349265 kubelet[2658]: I0312 04:19:36.349130 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/445c2011d5ea8da3a1f95cb62f2c2c5d-usr-share-ca-certificates\") pod \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" (UID: \"445c2011d5ea8da3a1f95cb62f2c2c5d\") " pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349265 kubelet[2658]: I0312 04:19:36.349151 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-ca-certs\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349265 kubelet[2658]: I0312 04:19:36.349168 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-flexvolume-dir\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349265 kubelet[2658]: I0312 04:19:36.349184 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aba5c5b17422c20f24a3d470e1d12e8a-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-tymtb.gb1.brightbox.com\" (UID: \"aba5c5b17422c20f24a3d470e1d12e8a\") " pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349428 kubelet[2658]: I0312 04:19:36.349211 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2c9913156d6dc33340480de136335dcb-kubeconfig\") pod \"kube-scheduler-srv-tymtb.gb1.brightbox.com\" (UID: \"2c9913156d6dc33340480de136335dcb\") " pod="kube-system/kube-scheduler-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.349428 kubelet[2658]: I0312 04:19:36.349227 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/445c2011d5ea8da3a1f95cb62f2c2c5d-k8s-certs\") pod \"kube-apiserver-srv-tymtb.gb1.brightbox.com\" (UID: \"445c2011d5ea8da3a1f95cb62f2c2c5d\") " pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.361046 kubelet[2658]: I0312 04:19:36.359801 2658 kubelet_node_status.go:124] "Node was previously registered" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:36.361046 kubelet[2658]: I0312 04:19:36.359983 2658 kubelet_node_status.go:78] "Successfully registered node" node="srv-tymtb.gb1.brightbox.com"
Mar 12 04:19:37.116681 kubelet[2658]: I0312 04:19:37.116316 2658 apiserver.go:52] "Watching apiserver"
Mar 12 04:19:37.147797 kubelet[2658]: I0312 04:19:37.147722 2658 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 12 04:19:37.216986 kubelet[2658]: I0312 04:19:37.216903 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-tymtb.gb1.brightbox.com" podStartSLOduration=1.216206407 podStartE2EDuration="1.216206407s" podCreationTimestamp="2026-03-12 04:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:19:37.213577646 +0000 UTC m=+1.193422480" watchObservedRunningTime="2026-03-12 04:19:37.216206407 +0000 UTC m=+1.196051241"
Mar 12 04:19:37.226833 kubelet[2658]: I0312 04:19:37.226612 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-tymtb.gb1.brightbox.com" podStartSLOduration=1.226597726 podStartE2EDuration="1.226597726s" podCreationTimestamp="2026-03-12 04:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:19:37.224949105 +0000 UTC m=+1.204793942" watchObservedRunningTime="2026-03-12 04:19:37.226597726 +0000 UTC m=+1.206442554"
Mar 12 04:19:37.251563 kubelet[2658]: I0312 04:19:37.251171 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-tymtb.gb1.brightbox.com" podStartSLOduration=1.2511539489999999 podStartE2EDuration="1.251153949s" podCreationTimestamp="2026-03-12 04:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:19:37.238639229 +0000 UTC m=+1.218484060" watchObservedRunningTime="2026-03-12 04:19:37.251153949 +0000 UTC m=+1.230998781"
Mar 12 04:19:37.462865 update_engine[1488]: I20260312 04:19:37.462507 1488 update_attempter.cc:509] Updating boot flags...
Mar 12 04:19:37.507908 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (2713)
Mar 12 04:19:37.564172 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (2711)
Mar 12 04:19:37.632171 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 33 scanned by (udev-worker) (2711)
Mar 12 04:19:40.407046 kubelet[2658]: I0312 04:19:40.406960 2658 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 12 04:19:40.409460 containerd[1506]: time="2026-03-12T04:19:40.409117915Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 12 04:19:40.410291 kubelet[2658]: I0312 04:19:40.409473 2658 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 12 04:19:40.936615 systemd[1]: Created slice kubepods-besteffort-pod3e94bd2b_0dbb_4b9c_a785_7c0d5ecc01a4.slice - libcontainer container kubepods-besteffort-pod3e94bd2b_0dbb_4b9c_a785_7c0d5ecc01a4.slice.
Mar 12 04:19:40.985038 kubelet[2658]: I0312 04:19:40.984994 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4-lib-modules\") pod \"kube-proxy-kj6f5\" (UID: \"3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4\") " pod="kube-system/kube-proxy-kj6f5"
Mar 12 04:19:40.985038 kubelet[2658]: I0312 04:19:40.985041 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwx7\" (UniqueName: \"kubernetes.io/projected/3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4-kube-api-access-bjwx7\") pod \"kube-proxy-kj6f5\" (UID: \"3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4\") " pod="kube-system/kube-proxy-kj6f5"
Mar 12 04:19:40.985284 kubelet[2658]: I0312 04:19:40.985071 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4-kube-proxy\") pod \"kube-proxy-kj6f5\" (UID: \"3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4\") " pod="kube-system/kube-proxy-kj6f5"
Mar 12 04:19:40.985284 kubelet[2658]: I0312 04:19:40.985096 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4-xtables-lock\") pod \"kube-proxy-kj6f5\" (UID: \"3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4\") " pod="kube-system/kube-proxy-kj6f5"
Mar 12 04:19:41.249441 containerd[1506]: time="2026-03-12T04:19:41.248687682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kj6f5,Uid:3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4,Namespace:kube-system,Attempt:0,}"
Mar 12 04:19:41.290679 containerd[1506]: time="2026-03-12T04:19:41.290561609Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:41.290679 containerd[1506]: time="2026-03-12T04:19:41.290625676Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:41.290679 containerd[1506]: time="2026-03-12T04:19:41.290637239Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:41.291211 containerd[1506]: time="2026-03-12T04:19:41.291035634Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:41.326314 systemd[1]: Started cri-containerd-141351c001a1b42af5d1915e90c58b9e930091c1bf09993c8dedf5f15905ea83.scope - libcontainer container 141351c001a1b42af5d1915e90c58b9e930091c1bf09993c8dedf5f15905ea83.
Mar 12 04:19:41.374899 containerd[1506]: time="2026-03-12T04:19:41.374816416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-kj6f5,Uid:3e94bd2b-0dbb-4b9c-a785-7c0d5ecc01a4,Namespace:kube-system,Attempt:0,} returns sandbox id \"141351c001a1b42af5d1915e90c58b9e930091c1bf09993c8dedf5f15905ea83\""
Mar 12 04:19:41.382012 containerd[1506]: time="2026-03-12T04:19:41.381838159Z" level=info msg="CreateContainer within sandbox \"141351c001a1b42af5d1915e90c58b9e930091c1bf09993c8dedf5f15905ea83\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 12 04:19:41.396019 containerd[1506]: time="2026-03-12T04:19:41.395821696Z" level=info msg="CreateContainer within sandbox \"141351c001a1b42af5d1915e90c58b9e930091c1bf09993c8dedf5f15905ea83\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"38e87bc222cc8747a4c0c9db178a683ab6c0fbf38636e547f5f40fcd946c5a1e\""
Mar 12 04:19:41.398381 containerd[1506]: time="2026-03-12T04:19:41.398120165Z" level=info msg="StartContainer for \"38e87bc222cc8747a4c0c9db178a683ab6c0fbf38636e547f5f40fcd946c5a1e\""
Mar 12 04:19:41.455165 systemd[1]: Started cri-containerd-38e87bc222cc8747a4c0c9db178a683ab6c0fbf38636e547f5f40fcd946c5a1e.scope - libcontainer container 38e87bc222cc8747a4c0c9db178a683ab6c0fbf38636e547f5f40fcd946c5a1e.
Mar 12 04:19:41.507514 containerd[1506]: time="2026-03-12T04:19:41.506623855Z" level=info msg="StartContainer for \"38e87bc222cc8747a4c0c9db178a683ab6c0fbf38636e547f5f40fcd946c5a1e\" returns successfully"
Mar 12 04:19:41.617522 systemd[1]: Created slice kubepods-besteffort-podc5775117_ea46_477d_b1d6_6f03d4d9d2d0.slice - libcontainer container kubepods-besteffort-podc5775117_ea46_477d_b1d6_6f03d4d9d2d0.slice.
Mar 12 04:19:41.689532 kubelet[2658]: I0312 04:19:41.689486 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c5775117-ea46-477d-b1d6-6f03d4d9d2d0-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-8tc2w\" (UID: \"c5775117-ea46-477d-b1d6-6f03d4d9d2d0\") " pod="tigera-operator/tigera-operator-6bf85f8dd-8tc2w"
Mar 12 04:19:41.689532 kubelet[2658]: I0312 04:19:41.689530 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmvj\" (UniqueName: \"kubernetes.io/projected/c5775117-ea46-477d-b1d6-6f03d4d9d2d0-kube-api-access-gkmvj\") pod \"tigera-operator-6bf85f8dd-8tc2w\" (UID: \"c5775117-ea46-477d-b1d6-6f03d4d9d2d0\") " pod="tigera-operator/tigera-operator-6bf85f8dd-8tc2w"
Mar 12 04:19:41.762420 systemd[1]: Started sshd@7-10.244.101.2:22-80.94.95.116:48400.service - OpenSSH per-connection server daemon (80.94.95.116:48400).
Mar 12 04:19:41.923412 containerd[1506]: time="2026-03-12T04:19:41.923354855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-8tc2w,Uid:c5775117-ea46-477d-b1d6-6f03d4d9d2d0,Namespace:tigera-operator,Attempt:0,}"
Mar 12 04:19:41.963689 containerd[1506]: time="2026-03-12T04:19:41.963415478Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:41.963689 containerd[1506]: time="2026-03-12T04:19:41.963483521Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:41.963689 containerd[1506]: time="2026-03-12T04:19:41.963499981Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:41.963689 containerd[1506]: time="2026-03-12T04:19:41.963600270Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:41.985727 systemd[1]: Started cri-containerd-4e49425c01f9b7c32341ddc017c00695776752257bfd0cfa6009c39cf5f4cb74.scope - libcontainer container 4e49425c01f9b7c32341ddc017c00695776752257bfd0cfa6009c39cf5f4cb74.
Mar 12 04:19:42.048962 containerd[1506]: time="2026-03-12T04:19:42.048716830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-8tc2w,Uid:c5775117-ea46-477d-b1d6-6f03d4d9d2d0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4e49425c01f9b7c32341ddc017c00695776752257bfd0cfa6009c39cf5f4cb74\""
Mar 12 04:19:42.052410 containerd[1506]: time="2026-03-12T04:19:42.052333782Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 12 04:19:42.727064 sshd[2840]: Invalid user admin from 80.94.95.116 port 48400
Mar 12 04:19:42.802956 sshd[2840]: Connection closed by invalid user admin 80.94.95.116 port 48400 [preauth]
Mar 12 04:19:42.804168 systemd[1]: sshd@7-10.244.101.2:22-80.94.95.116:48400.service: Deactivated successfully.
Mar 12 04:19:44.226414 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount621411771.mount: Deactivated successfully.
Mar 12 04:19:45.209872 containerd[1506]: time="2026-03-12T04:19:45.209349372Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:45.211338 containerd[1506]: time="2026-03-12T04:19:45.210868787Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156"
Mar 12 04:19:45.212116 containerd[1506]: time="2026-03-12T04:19:45.211838496Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:45.217761 containerd[1506]: time="2026-03-12T04:19:45.217733625Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:45.218728 containerd[1506]: time="2026-03-12T04:19:45.218352590Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 3.165979262s"
Mar 12 04:19:45.218728 containerd[1506]: time="2026-03-12T04:19:45.218394000Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\""
Mar 12 04:19:45.225262 containerd[1506]: time="2026-03-12T04:19:45.225220104Z" level=info msg="CreateContainer within sandbox \"4e49425c01f9b7c32341ddc017c00695776752257bfd0cfa6009c39cf5f4cb74\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 12 04:19:45.236657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4102017682.mount: Deactivated successfully.
Mar 12 04:19:45.238972 containerd[1506]: time="2026-03-12T04:19:45.238736192Z" level=info msg="CreateContainer within sandbox \"4e49425c01f9b7c32341ddc017c00695776752257bfd0cfa6009c39cf5f4cb74\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2485be792e78cadbaf2cd14ac6258a8f157df564e3e076cd46b688af21eae24a\""
Mar 12 04:19:45.240141 containerd[1506]: time="2026-03-12T04:19:45.239532784Z" level=info msg="StartContainer for \"2485be792e78cadbaf2cd14ac6258a8f157df564e3e076cd46b688af21eae24a\""
Mar 12 04:19:45.285035 systemd[1]: Started cri-containerd-2485be792e78cadbaf2cd14ac6258a8f157df564e3e076cd46b688af21eae24a.scope - libcontainer container 2485be792e78cadbaf2cd14ac6258a8f157df564e3e076cd46b688af21eae24a.
Mar 12 04:19:45.319233 containerd[1506]: time="2026-03-12T04:19:45.319190922Z" level=info msg="StartContainer for \"2485be792e78cadbaf2cd14ac6258a8f157df564e3e076cd46b688af21eae24a\" returns successfully"
Mar 12 04:19:46.243959 kubelet[2658]: I0312 04:19:46.243233 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-kj6f5" podStartSLOduration=6.243169464 podStartE2EDuration="6.243169464s" podCreationTimestamp="2026-03-12 04:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:19:42.219146007 +0000 UTC m=+6.198990844" watchObservedRunningTime="2026-03-12 04:19:46.243169464 +0000 UTC m=+10.223014277"
Mar 12 04:19:48.570869 kubelet[2658]: I0312 04:19:48.570785 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-8tc2w" podStartSLOduration=4.402997581 podStartE2EDuration="7.570770021s" podCreationTimestamp="2026-03-12 04:19:41 +0000 UTC" firstStartedPulling="2026-03-12 04:19:42.051686881 +0000 UTC m=+6.031531696" lastFinishedPulling="2026-03-12 04:19:45.219459322 +0000 UTC m=+9.199304136" observedRunningTime="2026-03-12 04:19:46.244992034 +0000 UTC m=+10.224836871" watchObservedRunningTime="2026-03-12 04:19:48.570770021 +0000 UTC m=+12.550614858"
Mar 12 04:19:52.108443 sudo[1746]: pam_unix(sudo:session): session closed for user root
Mar 12 04:19:52.199456 sshd[1743]: pam_unix(sshd:session): session closed for user core
Mar 12 04:19:52.207820 systemd[1]: sshd@6-10.244.101.2:22-20.161.92.111:38236.service: Deactivated successfully.
Mar 12 04:19:52.211493 systemd[1]: session-9.scope: Deactivated successfully.
Mar 12 04:19:52.211875 systemd[1]: session-9.scope: Consumed 7.075s CPU time, 146.4M memory peak, 0B memory swap peak.
Mar 12 04:19:52.212420 systemd-logind[1486]: Session 9 logged out. Waiting for processes to exit.
Mar 12 04:19:52.215352 systemd-logind[1486]: Removed session 9.
Mar 12 04:19:55.323941 systemd[1]: Created slice kubepods-besteffort-podd37eab11_db0a_42d7_91e6_c845e3673007.slice - libcontainer container kubepods-besteffort-podd37eab11_db0a_42d7_91e6_c845e3673007.slice.
Mar 12 04:19:55.393058 kubelet[2658]: I0312 04:19:55.392831 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d37eab11-db0a-42d7-91e6-c845e3673007-tigera-ca-bundle\") pod \"calico-typha-5786dbbc98-n2w6b\" (UID: \"d37eab11-db0a-42d7-91e6-c845e3673007\") " pod="calico-system/calico-typha-5786dbbc98-n2w6b"
Mar 12 04:19:55.393058 kubelet[2658]: I0312 04:19:55.392909 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d37eab11-db0a-42d7-91e6-c845e3673007-typha-certs\") pod \"calico-typha-5786dbbc98-n2w6b\" (UID: \"d37eab11-db0a-42d7-91e6-c845e3673007\") " pod="calico-system/calico-typha-5786dbbc98-n2w6b"
Mar 12 04:19:55.393058 kubelet[2658]: I0312 04:19:55.392931 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdjbx\" (UniqueName: \"kubernetes.io/projected/d37eab11-db0a-42d7-91e6-c845e3673007-kube-api-access-bdjbx\") pod \"calico-typha-5786dbbc98-n2w6b\" (UID: \"d37eab11-db0a-42d7-91e6-c845e3673007\") " pod="calico-system/calico-typha-5786dbbc98-n2w6b"
Mar 12 04:19:55.489550 systemd[1]: Created slice kubepods-besteffort-pod18511e5a_88b0_4e2d_902d_3c162d0c3c29.slice - libcontainer container kubepods-besteffort-pod18511e5a_88b0_4e2d_902d_3c162d0c3c29.slice.
Mar 12 04:19:55.596042 kubelet[2658]: I0312 04:19:55.595406 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-cni-net-dir\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596042 kubelet[2658]: I0312 04:19:55.595468 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-var-lib-calico\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596042 kubelet[2658]: I0312 04:19:55.595491 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-var-run-calico\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596042 kubelet[2658]: I0312 04:19:55.595510 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-flexvol-driver-host\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596042 kubelet[2658]: I0312 04:19:55.595532 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-lib-modules\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596500 kubelet[2658]: I0312 04:19:55.595585 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5mmr\" (UniqueName: \"kubernetes.io/projected/18511e5a-88b0-4e2d-902d-3c162d0c3c29-kube-api-access-h5mmr\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596500 kubelet[2658]: I0312 04:19:55.595617 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/18511e5a-88b0-4e2d-902d-3c162d0c3c29-node-certs\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596500 kubelet[2658]: I0312 04:19:55.595636 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-sys-fs\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596500 kubelet[2658]: I0312 04:19:55.595657 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-policysync\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596500 kubelet[2658]: I0312 04:19:55.595674 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-xtables-lock\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596683 kubelet[2658]: I0312 04:19:55.595705 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-cni-bin-dir\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596683 kubelet[2658]: I0312 04:19:55.595728 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-cni-log-dir\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596683 kubelet[2658]: I0312 04:19:55.595746 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-nodeproc\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596683 kubelet[2658]: I0312 04:19:55.595788 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/18511e5a-88b0-4e2d-902d-3c162d0c3c29-bpffs\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.596683 kubelet[2658]: I0312 04:19:55.595818 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18511e5a-88b0-4e2d-902d-3c162d0c3c29-tigera-ca-bundle\") pod \"calico-node-cdxj9\" (UID: \"18511e5a-88b0-4e2d-902d-3c162d0c3c29\") " pod="calico-system/calico-node-cdxj9"
Mar 12 04:19:55.605803 kubelet[2658]: E0312 04:19:55.605604 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17"
Mar 12 04:19:55.637343 containerd[1506]: time="2026-03-12T04:19:55.637188691Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5786dbbc98-n2w6b,Uid:d37eab11-db0a-42d7-91e6-c845e3673007,Namespace:calico-system,Attempt:0,}"
Mar 12 04:19:55.697448 kubelet[2658]: I0312 04:19:55.696160 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2826227-3a11-47af-9911-51d5f0de7b17-kubelet-dir\") pod \"csi-node-driver-vmkhw\" (UID: \"a2826227-3a11-47af-9911-51d5f0de7b17\") " pod="calico-system/csi-node-driver-vmkhw"
Mar 12 04:19:55.697448 kubelet[2658]: I0312 04:19:55.696206 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2826227-3a11-47af-9911-51d5f0de7b17-socket-dir\") pod \"csi-node-driver-vmkhw\" (UID: \"a2826227-3a11-47af-9911-51d5f0de7b17\") " pod="calico-system/csi-node-driver-vmkhw"
Mar 12 04:19:55.697448 kubelet[2658]: I0312 04:19:55.696317 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2826227-3a11-47af-9911-51d5f0de7b17-registration-dir\") pod \"csi-node-driver-vmkhw\" (UID: \"a2826227-3a11-47af-9911-51d5f0de7b17\") " pod="calico-system/csi-node-driver-vmkhw"
Mar 12 04:19:55.697448 kubelet[2658]: I0312 04:19:55.696335 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a2826227-3a11-47af-9911-51d5f0de7b17-varrun\") pod \"csi-node-driver-vmkhw\" (UID: \"a2826227-3a11-47af-9911-51d5f0de7b17\") " pod="calico-system/csi-node-driver-vmkhw"
Mar 12 04:19:55.697448 kubelet[2658]: I0312 04:19:55.696411 2658
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd69g\" (UniqueName: \"kubernetes.io/projected/a2826227-3a11-47af-9911-51d5f0de7b17-kube-api-access-hd69g\") pod \"csi-node-driver-vmkhw\" (UID: \"a2826227-3a11-47af-9911-51d5f0de7b17\") " pod="calico-system/csi-node-driver-vmkhw"
Mar 12 04:19:55.713244 kubelet[2658]: E0312 04:19:55.713179 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 12 04:19:55.714915 kubelet[2658]: W0312 04:19:55.714874 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 12 04:19:55.715033 kubelet[2658]: E0312 04:19:55.714951 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 12 04:19:55.759423 containerd[1506]: time="2026-03-12T04:19:55.759066330Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:55.759423 containerd[1506]: time="2026-03-12T04:19:55.759140095Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:55.759423 containerd[1506]: time="2026-03-12T04:19:55.759156849Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:55.759423 containerd[1506]: time="2026-03-12T04:19:55.759307219Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:55.817729 containerd[1506]: time="2026-03-12T04:19:55.796631719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cdxj9,Uid:18511e5a-88b0-4e2d-902d-3c162d0c3c29,Namespace:calico-system,Attempt:0,}"
Mar 12 04:19:55.840046 systemd[1]: Started cri-containerd-214b85241467844e1364711766e45da7ecfdefef77e59af18f8346b4de785c63.scope - libcontainer container 214b85241467844e1364711766e45da7ecfdefef77e59af18f8346b4de785c63.
Mar 12 04:19:55.869507 containerd[1506]: time="2026-03-12T04:19:55.869155232Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..."
runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 12 04:19:55.869507 containerd[1506]: time="2026-03-12T04:19:55.869306472Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 12 04:19:55.869507 containerd[1506]: time="2026-03-12T04:19:55.869331394Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:55.870974 containerd[1506]: time="2026-03-12T04:19:55.870883001Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 12 04:19:55.903014 systemd[1]: Started cri-containerd-912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9.scope - libcontainer container 912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9.
Mar 12 04:19:55.929259 containerd[1506]: time="2026-03-12T04:19:55.929071919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5786dbbc98-n2w6b,Uid:d37eab11-db0a-42d7-91e6-c845e3673007,Namespace:calico-system,Attempt:0,} returns sandbox id \"214b85241467844e1364711766e45da7ecfdefef77e59af18f8346b4de785c63\""
Mar 12 04:19:55.947241 containerd[1506]: time="2026-03-12T04:19:55.947199574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 12 04:19:55.961540 containerd[1506]: time="2026-03-12T04:19:55.960878650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cdxj9,Uid:18511e5a-88b0-4e2d-902d-3c162d0c3c29,Namespace:calico-system,Attempt:0,} returns sandbox id \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\""
Mar 12 04:19:57.159884 kubelet[2658]: E0312 04:19:57.159124 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17"
Mar 12 04:19:57.626104 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1874267350.mount: Deactivated successfully.
Mar 12 04:19:59.034434 containerd[1506]: time="2026-03-12T04:19:59.032533086Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:59.050915 containerd[1506]: time="2026-03-12T04:19:59.050777443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Mar 12 04:19:59.051747 containerd[1506]: time="2026-03-12T04:19:59.051688636Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:59.054540 containerd[1506]: time="2026-03-12T04:19:59.054510432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 12 04:19:59.056908 containerd[1506]: time="2026-03-12T04:19:59.056498344Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.108913478s"
Mar 12 04:19:59.056908 containerd[1506]: time="2026-03-12T04:19:59.056576151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Mar 12 04:19:59.063805 containerd[1506]: time="2026-03-12T04:19:59.063546765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 12 04:19:59.120748 containerd[1506]: time="2026-03-12T04:19:59.120636681Z" level=info msg="CreateContainer within sandbox \"214b85241467844e1364711766e45da7ecfdefef77e59af18f8346b4de785c63\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 12 04:19:59.134934 containerd[1506]: time="2026-03-12T04:19:59.134315691Z" level=info msg="CreateContainer within sandbox \"214b85241467844e1364711766e45da7ecfdefef77e59af18f8346b4de785c63\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"753057f65cbd6844a5c1ec878c92282a05f7b4b59ab7d719e61357a96840e11e\""
Mar 12 04:19:59.139872 containerd[1506]: time="2026-03-12T04:19:59.138937053Z" level=info msg="StartContainer for \"753057f65cbd6844a5c1ec878c92282a05f7b4b59ab7d719e61357a96840e11e\""
Mar 12 04:19:59.161084 kubelet[2658]: E0312 04:19:59.158767 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17"
Mar 12 04:19:59.190055 systemd[1]: Started cri-containerd-753057f65cbd6844a5c1ec878c92282a05f7b4b59ab7d719e61357a96840e11e.scope - libcontainer container 753057f65cbd6844a5c1ec878c92282a05f7b4b59ab7d719e61357a96840e11e.
Mar 12 04:19:59.242979 containerd[1506]: time="2026-03-12T04:19:59.242941651Z" level=info msg="StartContainer for \"753057f65cbd6844a5c1ec878c92282a05f7b4b59ab7d719e61357a96840e11e\" returns successfully" Mar 12 04:19:59.297757 kubelet[2658]: E0312 04:19:59.297244 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.297757 kubelet[2658]: W0312 04:19:59.297280 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.297757 kubelet[2658]: E0312 04:19:59.297323 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.298944 kubelet[2658]: E0312 04:19:59.298084 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.298944 kubelet[2658]: W0312 04:19:59.298096 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.298944 kubelet[2658]: E0312 04:19:59.298112 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.299442 kubelet[2658]: E0312 04:19:59.299418 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.299442 kubelet[2658]: W0312 04:19:59.299429 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.299522 kubelet[2658]: E0312 04:19:59.299444 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.300177 kubelet[2658]: E0312 04:19:59.300158 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.300177 kubelet[2658]: W0312 04:19:59.300173 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.300368 kubelet[2658]: E0312 04:19:59.300186 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.303139 kubelet[2658]: E0312 04:19:59.303120 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.303139 kubelet[2658]: W0312 04:19:59.303135 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.303313 kubelet[2658]: E0312 04:19:59.303254 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.304098 kubelet[2658]: E0312 04:19:59.304071 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.304174 kubelet[2658]: W0312 04:19:59.304161 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.304217 kubelet[2658]: E0312 04:19:59.304178 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.304472 kubelet[2658]: E0312 04:19:59.304456 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.304472 kubelet[2658]: W0312 04:19:59.304469 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.304568 kubelet[2658]: E0312 04:19:59.304480 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.304892 kubelet[2658]: E0312 04:19:59.304875 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.304954 kubelet[2658]: W0312 04:19:59.304900 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.304954 kubelet[2658]: E0312 04:19:59.304913 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.305430 kubelet[2658]: E0312 04:19:59.305415 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.305430 kubelet[2658]: W0312 04:19:59.305429 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.305776 kubelet[2658]: E0312 04:19:59.305441 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.305776 kubelet[2658]: E0312 04:19:59.305660 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.305776 kubelet[2658]: W0312 04:19:59.305668 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.305776 kubelet[2658]: E0312 04:19:59.305677 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.306143 kubelet[2658]: E0312 04:19:59.305948 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.306143 kubelet[2658]: W0312 04:19:59.305957 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.306143 kubelet[2658]: E0312 04:19:59.305968 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.307018 kubelet[2658]: E0312 04:19:59.306596 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.307018 kubelet[2658]: W0312 04:19:59.306606 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.307018 kubelet[2658]: E0312 04:19:59.306616 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.307143 kubelet[2658]: E0312 04:19:59.307097 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.307143 kubelet[2658]: W0312 04:19:59.307108 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.307143 kubelet[2658]: E0312 04:19:59.307120 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.307976 kubelet[2658]: E0312 04:19:59.307320 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.307976 kubelet[2658]: W0312 04:19:59.307333 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.307976 kubelet[2658]: E0312 04:19:59.307346 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.307976 kubelet[2658]: E0312 04:19:59.307678 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.307976 kubelet[2658]: W0312 04:19:59.307688 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.307976 kubelet[2658]: E0312 04:19:59.307698 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.330247 kubelet[2658]: E0312 04:19:59.330165 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.330247 kubelet[2658]: W0312 04:19:59.330228 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.330436 kubelet[2658]: E0312 04:19:59.330267 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.330771 kubelet[2658]: E0312 04:19:59.330704 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.330771 kubelet[2658]: W0312 04:19:59.330746 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.330771 kubelet[2658]: E0312 04:19:59.330765 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.331183 kubelet[2658]: E0312 04:19:59.331165 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.331183 kubelet[2658]: W0312 04:19:59.331181 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.331268 kubelet[2658]: E0312 04:19:59.331194 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.331595 kubelet[2658]: E0312 04:19:59.331575 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.331595 kubelet[2658]: W0312 04:19:59.331592 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.331683 kubelet[2658]: E0312 04:19:59.331607 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.331965 kubelet[2658]: E0312 04:19:59.331950 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.331965 kubelet[2658]: W0312 04:19:59.331964 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.332055 kubelet[2658]: E0312 04:19:59.331977 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.332292 kubelet[2658]: E0312 04:19:59.332276 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.332292 kubelet[2658]: W0312 04:19:59.332291 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.332368 kubelet[2658]: E0312 04:19:59.332305 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.333068 kubelet[2658]: E0312 04:19:59.333045 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.333068 kubelet[2658]: W0312 04:19:59.333065 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.333182 kubelet[2658]: E0312 04:19:59.333080 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.335101 kubelet[2658]: E0312 04:19:59.335078 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.335101 kubelet[2658]: W0312 04:19:59.335097 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.335217 kubelet[2658]: E0312 04:19:59.335112 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.335449 kubelet[2658]: E0312 04:19:59.335432 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.335449 kubelet[2658]: W0312 04:19:59.335447 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.335540 kubelet[2658]: E0312 04:19:59.335460 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.335706 kubelet[2658]: E0312 04:19:59.335691 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.335706 kubelet[2658]: W0312 04:19:59.335705 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.335767 kubelet[2658]: E0312 04:19:59.335719 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.336025 kubelet[2658]: E0312 04:19:59.336009 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.336025 kubelet[2658]: W0312 04:19:59.336023 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.336113 kubelet[2658]: E0312 04:19:59.336035 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.336395 kubelet[2658]: E0312 04:19:59.336368 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.336472 kubelet[2658]: W0312 04:19:59.336398 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.336472 kubelet[2658]: E0312 04:19:59.336411 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.337376 kubelet[2658]: E0312 04:19:59.337060 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.337376 kubelet[2658]: W0312 04:19:59.337079 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.337376 kubelet[2658]: E0312 04:19:59.337092 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.339537 kubelet[2658]: E0312 04:19:59.338699 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.339537 kubelet[2658]: W0312 04:19:59.338739 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.339537 kubelet[2658]: E0312 04:19:59.338772 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.340043 kubelet[2658]: E0312 04:19:59.339838 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.340043 kubelet[2658]: W0312 04:19:59.339888 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.340043 kubelet[2658]: E0312 04:19:59.339903 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.341514 kubelet[2658]: E0312 04:19:59.340676 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.341514 kubelet[2658]: W0312 04:19:59.340690 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.341514 kubelet[2658]: E0312 04:19:59.340703 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:19:59.341795 kubelet[2658]: E0312 04:19:59.341782 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.341998 kubelet[2658]: W0312 04:19:59.341856 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.341998 kubelet[2658]: E0312 04:19:59.341890 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:19:59.342211 kubelet[2658]: E0312 04:19:59.342200 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:19:59.342301 kubelet[2658]: W0312 04:19:59.342261 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:19:59.342301 kubelet[2658]: E0312 04:19:59.342275 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:20:00.103679 systemd[1]: run-containerd-runc-k8s.io-753057f65cbd6844a5c1ec878c92282a05f7b4b59ab7d719e61357a96840e11e-runc.yla9MT.mount: Deactivated successfully. Mar 12 04:20:00.275333 kubelet[2658]: I0312 04:20:00.275262 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:20:00.319515 kubelet[2658]: E0312 04:20:00.319452 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.319515 kubelet[2658]: W0312 04:20:00.319507 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.319743 kubelet[2658]: E0312 04:20:00.319552 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:20:00.320181 kubelet[2658]: E0312 04:20:00.320142 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.320181 kubelet[2658]: W0312 04:20:00.320177 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.320317 kubelet[2658]: E0312 04:20:00.320202 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:20:00.320700 kubelet[2658]: E0312 04:20:00.320667 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.320744 kubelet[2658]: W0312 04:20:00.320700 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.320744 kubelet[2658]: E0312 04:20:00.320722 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:20:00.321462 kubelet[2658]: E0312 04:20:00.321384 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.321532 kubelet[2658]: W0312 04:20:00.321462 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.321532 kubelet[2658]: E0312 04:20:00.321490 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:20:00.322158 kubelet[2658]: E0312 04:20:00.322105 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.322214 kubelet[2658]: W0312 04:20:00.322164 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.322266 kubelet[2658]: E0312 04:20:00.322205 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:20:00.322754 kubelet[2658]: E0312 04:20:00.322723 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.322815 kubelet[2658]: W0312 04:20:00.322753 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.322815 kubelet[2658]: E0312 04:20:00.322771 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 12 04:20:00.323079 kubelet[2658]: E0312 04:20:00.323056 2658 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 12 04:20:00.323079 kubelet[2658]: W0312 04:20:00.323075 2658 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 12 04:20:00.323191 kubelet[2658]: E0312 04:20:00.323088 2658 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 12 04:20:00.786863 containerd[1506]: time="2026-03-12T04:20:00.786776052Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:00.787976 containerd[1506]: time="2026-03-12T04:20:00.787935172Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 12 04:20:00.788691 containerd[1506]: time="2026-03-12T04:20:00.788667189Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:00.791301 containerd[1506]: time="2026-03-12T04:20:00.791274980Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:00.792399 containerd[1506]: time="2026-03-12T04:20:00.792374509Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.728778019s" Mar 12 04:20:00.792462 containerd[1506]: time="2026-03-12T04:20:00.792423442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 12 04:20:00.797170 containerd[1506]: time="2026-03-12T04:20:00.797051390Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 12 04:20:00.850207 containerd[1506]: time="2026-03-12T04:20:00.850162512Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3\"" Mar 12 04:20:00.853974 containerd[1506]: time="2026-03-12T04:20:00.851092971Z" level=info msg="StartContainer for \"edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3\"" Mar 12 04:20:00.902061 systemd[1]: Started cri-containerd-edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3.scope - libcontainer container edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3. Mar 12 04:20:00.946287 containerd[1506]: time="2026-03-12T04:20:00.945709863Z" level=info msg="StartContainer for \"edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3\" returns successfully" Mar 12 04:20:00.961880 systemd[1]: cri-containerd-edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3.scope: Deactivated successfully. Mar 12 04:20:01.048265 containerd[1506]: time="2026-03-12T04:20:01.036221476Z" level=info msg="shim disconnected" id=edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3 namespace=k8s.io Mar 12 04:20:01.048265 containerd[1506]: time="2026-03-12T04:20:01.048143782Z" level=warning msg="cleaning up after shim disconnected" id=edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3 namespace=k8s.io Mar 12 04:20:01.048265 containerd[1506]: time="2026-03-12T04:20:01.048175644Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 12 04:20:01.098891 systemd[1]: run-containerd-runc-k8s.io-edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3-runc.4enZHa.mount: Deactivated successfully. 
Mar 12 04:20:01.099001 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-edb6a5229fd05764c5c57946974bdd5398bc42b35e8c301ab9b830c5ed8f9ac3-rootfs.mount: Deactivated successfully. Mar 12 04:20:01.159593 kubelet[2658]: E0312 04:20:01.158279 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:01.304172 containerd[1506]: time="2026-03-12T04:20:01.303449586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 12 04:20:01.327190 kubelet[2658]: I0312 04:20:01.326036 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5786dbbc98-n2w6b" podStartSLOduration=3.20372074 podStartE2EDuration="6.32600182s" podCreationTimestamp="2026-03-12 04:19:55 +0000 UTC" firstStartedPulling="2026-03-12 04:19:55.940989229 +0000 UTC m=+19.920834042" lastFinishedPulling="2026-03-12 04:19:59.063270273 +0000 UTC m=+23.043115122" observedRunningTime="2026-03-12 04:19:59.301712129 +0000 UTC m=+23.281556942" watchObservedRunningTime="2026-03-12 04:20:01.32600182 +0000 UTC m=+25.305846765" Mar 12 04:20:03.159883 kubelet[2658]: E0312 04:20:03.159346 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:05.158077 kubelet[2658]: E0312 04:20:05.157987 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not 
initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:07.160555 kubelet[2658]: E0312 04:20:07.160382 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:09.159524 kubelet[2658]: E0312 04:20:09.159444 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:11.157972 kubelet[2658]: E0312 04:20:11.157828 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:11.171513 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount450510226.mount: Deactivated successfully. 
Mar 12 04:20:11.219990 containerd[1506]: time="2026-03-12T04:20:11.219622568Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 12 04:20:11.222384 containerd[1506]: time="2026-03-12T04:20:11.215342121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:11.224908 containerd[1506]: time="2026-03-12T04:20:11.224831372Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:11.226984 containerd[1506]: time="2026-03-12T04:20:11.225606238Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 9.922020762s" Mar 12 04:20:11.226984 containerd[1506]: time="2026-03-12T04:20:11.225643178Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 12 04:20:11.226984 containerd[1506]: time="2026-03-12T04:20:11.226653391Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:11.265452 containerd[1506]: time="2026-03-12T04:20:11.265410663Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 12 04:20:11.324749 containerd[1506]: time="2026-03-12T04:20:11.324686762Z" level=info 
msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8\"" Mar 12 04:20:11.328368 containerd[1506]: time="2026-03-12T04:20:11.328335382Z" level=info msg="StartContainer for \"2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8\"" Mar 12 04:20:11.386213 systemd[1]: Started cri-containerd-2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8.scope - libcontainer container 2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8. Mar 12 04:20:11.428472 containerd[1506]: time="2026-03-12T04:20:11.427724080Z" level=info msg="StartContainer for \"2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8\" returns successfully" Mar 12 04:20:11.541775 systemd[1]: cri-containerd-2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8.scope: Deactivated successfully. Mar 12 04:20:11.579188 containerd[1506]: time="2026-03-12T04:20:11.573512102Z" level=info msg="shim disconnected" id=2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8 namespace=k8s.io Mar 12 04:20:11.579188 containerd[1506]: time="2026-03-12T04:20:11.579185931Z" level=warning msg="cleaning up after shim disconnected" id=2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8 namespace=k8s.io Mar 12 04:20:11.579461 containerd[1506]: time="2026-03-12T04:20:11.579206643Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 12 04:20:12.171994 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2cd91c829fab7483f394b297f1974ce0b28f94b0f6a184f74e3cce50b6213da8-rootfs.mount: Deactivated successfully. 
Mar 12 04:20:12.360217 containerd[1506]: time="2026-03-12T04:20:12.359750477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 12 04:20:13.160481 kubelet[2658]: E0312 04:20:13.159384 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:15.158743 kubelet[2658]: E0312 04:20:15.158651 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:16.516904 containerd[1506]: time="2026-03-12T04:20:16.516577090Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:16.516904 containerd[1506]: time="2026-03-12T04:20:16.516873617Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 12 04:20:16.525594 containerd[1506]: time="2026-03-12T04:20:16.525183378Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:16.527366 containerd[1506]: time="2026-03-12T04:20:16.527335125Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:16.529503 containerd[1506]: time="2026-03-12T04:20:16.529467489Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" 
with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.169630985s" Mar 12 04:20:16.529579 containerd[1506]: time="2026-03-12T04:20:16.529515158Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 12 04:20:16.538427 containerd[1506]: time="2026-03-12T04:20:16.538399360Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 12 04:20:16.592045 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1915384632.mount: Deactivated successfully. Mar 12 04:20:16.594639 containerd[1506]: time="2026-03-12T04:20:16.594605904Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9\"" Mar 12 04:20:16.597176 containerd[1506]: time="2026-03-12T04:20:16.596312443Z" level=info msg="StartContainer for \"9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9\"" Mar 12 04:20:16.640014 systemd[1]: Started cri-containerd-9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9.scope - libcontainer container 9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9. 
Mar 12 04:20:16.678944 containerd[1506]: time="2026-03-12T04:20:16.677695573Z" level=info msg="StartContainer for \"9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9\" returns successfully" Mar 12 04:20:17.158703 kubelet[2658]: E0312 04:20:17.158540 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vmkhw" podUID="a2826227-3a11-47af-9911-51d5f0de7b17" Mar 12 04:20:17.385907 systemd[1]: cri-containerd-9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9.scope: Deactivated successfully. Mar 12 04:20:17.440304 kubelet[2658]: I0312 04:20:17.439296 2658 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 12 04:20:17.446501 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9-rootfs.mount: Deactivated successfully. 
Mar 12 04:20:17.453060 containerd[1506]: time="2026-03-12T04:20:17.452787104Z" level=info msg="shim disconnected" id=9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9 namespace=k8s.io Mar 12 04:20:17.453060 containerd[1506]: time="2026-03-12T04:20:17.452883333Z" level=warning msg="cleaning up after shim disconnected" id=9baf380cf34ef65e5a0da8d8e865574223b869690494802f7fbc4230230a95a9 namespace=k8s.io Mar 12 04:20:17.453060 containerd[1506]: time="2026-03-12T04:20:17.452894303Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 12 04:20:17.590786 kubelet[2658]: I0312 04:20:17.590731 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlv7b\" (UniqueName: \"kubernetes.io/projected/52afdedb-78fc-4262-a140-f3e2656f3c8e-kube-api-access-jlv7b\") pod \"calico-kube-controllers-5dd89df469-vlctc\" (UID: \"52afdedb-78fc-4262-a140-f3e2656f3c8e\") " pod="calico-system/calico-kube-controllers-5dd89df469-vlctc" Mar 12 04:20:17.593104 kubelet[2658]: I0312 04:20:17.592496 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f398c48c-ad6a-4dd1-9a27-01ce627d85a6-config-volume\") pod \"coredns-674b8bbfcf-cjtcm\" (UID: \"f398c48c-ad6a-4dd1-9a27-01ce627d85a6\") " pod="kube-system/coredns-674b8bbfcf-cjtcm" Mar 12 04:20:17.593104 kubelet[2658]: I0312 04:20:17.592528 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnwgz\" (UniqueName: \"kubernetes.io/projected/f398c48c-ad6a-4dd1-9a27-01ce627d85a6-kube-api-access-dnwgz\") pod \"coredns-674b8bbfcf-cjtcm\" (UID: \"f398c48c-ad6a-4dd1-9a27-01ce627d85a6\") " pod="kube-system/coredns-674b8bbfcf-cjtcm" Mar 12 04:20:17.593104 kubelet[2658]: I0312 04:20:17.592555 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/193981ad-ae1d-40b4-8904-cc2e820c5b91-config-volume\") pod \"coredns-674b8bbfcf-dwqf2\" (UID: \"193981ad-ae1d-40b4-8904-cc2e820c5b91\") " pod="kube-system/coredns-674b8bbfcf-dwqf2" Mar 12 04:20:17.593104 kubelet[2658]: I0312 04:20:17.592579 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-backend-key-pair\") pod \"whisker-696fcdffc9-lx2sm\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " pod="calico-system/whisker-696fcdffc9-lx2sm" Mar 12 04:20:17.593104 kubelet[2658]: I0312 04:20:17.592600 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52afdedb-78fc-4262-a140-f3e2656f3c8e-tigera-ca-bundle\") pod \"calico-kube-controllers-5dd89df469-vlctc\" (UID: \"52afdedb-78fc-4262-a140-f3e2656f3c8e\") " pod="calico-system/calico-kube-controllers-5dd89df469-vlctc" Mar 12 04:20:17.593419 kubelet[2658]: I0312 04:20:17.592622 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-nginx-config\") pod \"whisker-696fcdffc9-lx2sm\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " pod="calico-system/whisker-696fcdffc9-lx2sm" Mar 12 04:20:17.593419 kubelet[2658]: I0312 04:20:17.592642 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-ca-bundle\") pod \"whisker-696fcdffc9-lx2sm\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " pod="calico-system/whisker-696fcdffc9-lx2sm" Mar 12 04:20:17.593419 kubelet[2658]: I0312 04:20:17.592659 2658 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2d9n\" (UniqueName: \"kubernetes.io/projected/193981ad-ae1d-40b4-8904-cc2e820c5b91-kube-api-access-k2d9n\") pod \"coredns-674b8bbfcf-dwqf2\" (UID: \"193981ad-ae1d-40b4-8904-cc2e820c5b91\") " pod="kube-system/coredns-674b8bbfcf-dwqf2" Mar 12 04:20:17.593419 kubelet[2658]: I0312 04:20:17.592686 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9gg4\" (UniqueName: \"kubernetes.io/projected/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-kube-api-access-r9gg4\") pod \"whisker-696fcdffc9-lx2sm\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " pod="calico-system/whisker-696fcdffc9-lx2sm" Mar 12 04:20:17.624794 systemd[1]: Created slice kubepods-burstable-pod193981ad_ae1d_40b4_8904_cc2e820c5b91.slice - libcontainer container kubepods-burstable-pod193981ad_ae1d_40b4_8904_cc2e820c5b91.slice. Mar 12 04:20:17.634534 systemd[1]: Created slice kubepods-besteffort-podac37d1e3_3df4_4ad2_8cbb_00ba6d1d5e41.slice - libcontainer container kubepods-besteffort-podac37d1e3_3df4_4ad2_8cbb_00ba6d1d5e41.slice. Mar 12 04:20:17.644319 systemd[1]: Created slice kubepods-burstable-podf398c48c_ad6a_4dd1_9a27_01ce627d85a6.slice - libcontainer container kubepods-burstable-podf398c48c_ad6a_4dd1_9a27_01ce627d85a6.slice. Mar 12 04:20:17.653186 systemd[1]: Created slice kubepods-besteffort-pod52afdedb_78fc_4262_a140_f3e2656f3c8e.slice - libcontainer container kubepods-besteffort-pod52afdedb_78fc_4262_a140_f3e2656f3c8e.slice. Mar 12 04:20:17.662834 systemd[1]: Created slice kubepods-besteffort-pod8dd5d5e8_c785_4698_ab0f_5124958e0b67.slice - libcontainer container kubepods-besteffort-pod8dd5d5e8_c785_4698_ab0f_5124958e0b67.slice. Mar 12 04:20:17.672827 systemd[1]: Created slice kubepods-besteffort-pod78f8c939_b01e_4526_98a1_09241895ac3e.slice - libcontainer container kubepods-besteffort-pod78f8c939_b01e_4526_98a1_09241895ac3e.slice. 
Mar 12 04:20:17.686059 systemd[1]: Created slice kubepods-besteffort-pod6f24fc37_2e2e_4c28_be28_717ae5582f31.slice - libcontainer container kubepods-besteffort-pod6f24fc37_2e2e_4c28_be28_717ae5582f31.slice. Mar 12 04:20:17.694021 kubelet[2658]: I0312 04:20:17.693768 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8dd5d5e8-c785-4698-ab0f-5124958e0b67-goldmane-key-pair\") pod \"goldmane-5b85766d88-nx574\" (UID: \"8dd5d5e8-c785-4698-ab0f-5124958e0b67\") " pod="calico-system/goldmane-5b85766d88-nx574" Mar 12 04:20:17.694021 kubelet[2658]: I0312 04:20:17.693805 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6f24fc37-2e2e-4c28-be28-717ae5582f31-calico-apiserver-certs\") pod \"calico-apiserver-55c7dc45f-pfllz\" (UID: \"6f24fc37-2e2e-4c28-be28-717ae5582f31\") " pod="calico-system/calico-apiserver-55c7dc45f-pfllz" Mar 12 04:20:17.694021 kubelet[2658]: I0312 04:20:17.693823 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9fs\" (UniqueName: \"kubernetes.io/projected/6f24fc37-2e2e-4c28-be28-717ae5582f31-kube-api-access-4c9fs\") pod \"calico-apiserver-55c7dc45f-pfllz\" (UID: \"6f24fc37-2e2e-4c28-be28-717ae5582f31\") " pod="calico-system/calico-apiserver-55c7dc45f-pfllz" Mar 12 04:20:17.695178 kubelet[2658]: I0312 04:20:17.694936 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dd5d5e8-c785-4698-ab0f-5124958e0b67-config\") pod \"goldmane-5b85766d88-nx574\" (UID: \"8dd5d5e8-c785-4698-ab0f-5124958e0b67\") " pod="calico-system/goldmane-5b85766d88-nx574" Mar 12 04:20:17.695178 kubelet[2658]: I0312 04:20:17.694979 2658 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6wdg\" (UniqueName: \"kubernetes.io/projected/8dd5d5e8-c785-4698-ab0f-5124958e0b67-kube-api-access-c6wdg\") pod \"goldmane-5b85766d88-nx574\" (UID: \"8dd5d5e8-c785-4698-ab0f-5124958e0b67\") " pod="calico-system/goldmane-5b85766d88-nx574" Mar 12 04:20:17.695178 kubelet[2658]: I0312 04:20:17.694996 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/78f8c939-b01e-4526-98a1-09241895ac3e-calico-apiserver-certs\") pod \"calico-apiserver-55c7dc45f-llqv4\" (UID: \"78f8c939-b01e-4526-98a1-09241895ac3e\") " pod="calico-system/calico-apiserver-55c7dc45f-llqv4" Mar 12 04:20:17.695178 kubelet[2658]: I0312 04:20:17.695047 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dd5d5e8-c785-4698-ab0f-5124958e0b67-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-nx574\" (UID: \"8dd5d5e8-c785-4698-ab0f-5124958e0b67\") " pod="calico-system/goldmane-5b85766d88-nx574" Mar 12 04:20:17.705547 kubelet[2658]: I0312 04:20:17.705090 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl89j\" (UniqueName: \"kubernetes.io/projected/78f8c939-b01e-4526-98a1-09241895ac3e-kube-api-access-cl89j\") pod \"calico-apiserver-55c7dc45f-llqv4\" (UID: \"78f8c939-b01e-4526-98a1-09241895ac3e\") " pod="calico-system/calico-apiserver-55c7dc45f-llqv4" Mar 12 04:20:17.969995 containerd[1506]: time="2026-03-12T04:20:17.969791192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd89df469-vlctc,Uid:52afdedb-78fc-4262-a140-f3e2656f3c8e,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:17.975354 containerd[1506]: time="2026-03-12T04:20:17.974884282Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-dwqf2,Uid:193981ad-ae1d-40b4-8904-cc2e820c5b91,Namespace:kube-system,Attempt:0,}" Mar 12 04:20:17.992284 containerd[1506]: time="2026-03-12T04:20:17.991715880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-696fcdffc9-lx2sm,Uid:ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:18.006746 containerd[1506]: time="2026-03-12T04:20:18.006190995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-pfllz,Uid:6f24fc37-2e2e-4c28-be28-717ae5582f31,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:18.030870 containerd[1506]: time="2026-03-12T04:20:18.029592704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nx574,Uid:8dd5d5e8-c785-4698-ab0f-5124958e0b67,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:18.030870 containerd[1506]: time="2026-03-12T04:20:18.029864894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-llqv4,Uid:78f8c939-b01e-4526-98a1-09241895ac3e,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:18.036492 containerd[1506]: time="2026-03-12T04:20:18.036360352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cjtcm,Uid:f398c48c-ad6a-4dd1-9a27-01ce627d85a6,Namespace:kube-system,Attempt:0,}" Mar 12 04:20:18.459004 containerd[1506]: time="2026-03-12T04:20:18.458964961Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 12 04:20:18.495973 containerd[1506]: time="2026-03-12T04:20:18.495924991Z" level=error msg="Failed to destroy network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 
12 04:20:18.496745 containerd[1506]: time="2026-03-12T04:20:18.496714065Z" level=error msg="encountered an error cleaning up failed sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.497617 containerd[1506]: time="2026-03-12T04:20:18.497030617Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dwqf2,Uid:193981ad-ae1d-40b4-8904-cc2e820c5b91,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.521153 containerd[1506]: time="2026-03-12T04:20:18.520996554Z" level=error msg="Failed to destroy network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.521781 containerd[1506]: time="2026-03-12T04:20:18.521743969Z" level=error msg="encountered an error cleaning up failed sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.521910 containerd[1506]: time="2026-03-12T04:20:18.521809962Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-llqv4,Uid:78f8c939-b01e-4526-98a1-09241895ac3e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.527388 containerd[1506]: time="2026-03-12T04:20:18.527264640Z" level=error msg="Failed to destroy network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.528070 containerd[1506]: time="2026-03-12T04:20:18.528032039Z" level=info msg="CreateContainer within sandbox \"912af7730ffea5ccff0d4115cfa4dfe53c709d28c572fe78cd01c5c551f05eb9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4ce2849586e6617afd34cb9fd55caba8203158182a0023fe6bc4b55728bd933f\"" Mar 12 04:20:18.528338 containerd[1506]: time="2026-03-12T04:20:18.528261370Z" level=error msg="Failed to destroy network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.528613 containerd[1506]: time="2026-03-12T04:20:18.528583852Z" level=error msg="encountered an error cleaning up failed sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 12 04:20:18.528669 containerd[1506]: time="2026-03-12T04:20:18.528633093Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-696fcdffc9-lx2sm,Uid:ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.529034 containerd[1506]: time="2026-03-12T04:20:18.528943287Z" level=error msg="encountered an error cleaning up failed sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.529034 containerd[1506]: time="2026-03-12T04:20:18.528988972Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-pfllz,Uid:6f24fc37-2e2e-4c28-be28-717ae5582f31,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.533872 containerd[1506]: time="2026-03-12T04:20:18.533068309Z" level=error msg="Failed to destroy network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.533872 containerd[1506]: 
time="2026-03-12T04:20:18.533347102Z" level=error msg="encountered an error cleaning up failed sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.533872 containerd[1506]: time="2026-03-12T04:20:18.533386121Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd89df469-vlctc,Uid:52afdedb-78fc-4262-a140-f3e2656f3c8e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.536078 containerd[1506]: time="2026-03-12T04:20:18.536032440Z" level=error msg="Failed to destroy network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.536548 containerd[1506]: time="2026-03-12T04:20:18.536522698Z" level=error msg="encountered an error cleaning up failed sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.536701 containerd[1506]: time="2026-03-12T04:20:18.536678771Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-5b85766d88-nx574,Uid:8dd5d5e8-c785-4698-ab0f-5124958e0b67,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.537584 kubelet[2658]: E0312 04:20:18.537376 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.538210 kubelet[2658]: E0312 04:20:18.537754 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.538783 kubelet[2658]: E0312 04:20:18.538609 2658 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-55c7dc45f-pfllz" Mar 12 04:20:18.538783 kubelet[2658]: E0312 04:20:18.538709 2658 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-55c7dc45f-pfllz" Mar 12 04:20:18.539948 kubelet[2658]: E0312 04:20:18.538910 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55c7dc45f-pfllz_calico-system(6f24fc37-2e2e-4c28-be28-717ae5582f31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55c7dc45f-pfllz_calico-system(6f24fc37-2e2e-4c28-be28-717ae5582f31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-55c7dc45f-pfllz" podUID="6f24fc37-2e2e-4c28-be28-717ae5582f31" Mar 12 04:20:18.539948 kubelet[2658]: E0312 04:20:18.539228 2658 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-55c7dc45f-llqv4" Mar 12 04:20:18.539948 kubelet[2658]: E0312 04:20:18.539272 2658 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-55c7dc45f-llqv4" Mar 12 04:20:18.540117 kubelet[2658]: E0312 04:20:18.539325 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55c7dc45f-llqv4_calico-system(78f8c939-b01e-4526-98a1-09241895ac3e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55c7dc45f-llqv4_calico-system(78f8c939-b01e-4526-98a1-09241895ac3e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-55c7dc45f-llqv4" podUID="78f8c939-b01e-4526-98a1-09241895ac3e" Mar 12 04:20:18.540117 kubelet[2658]: E0312 04:20:18.539390 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.540117 kubelet[2658]: E0312 04:20:18.539414 2658 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dwqf2" Mar 12 04:20:18.540231 kubelet[2658]: E0312 04:20:18.539429 2658 
kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dwqf2" Mar 12 04:20:18.540231 kubelet[2658]: E0312 04:20:18.539455 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dwqf2_kube-system(193981ad-ae1d-40b4-8904-cc2e820c5b91)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dwqf2_kube-system(193981ad-ae1d-40b4-8904-cc2e820c5b91)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dwqf2" podUID="193981ad-ae1d-40b4-8904-cc2e820c5b91" Mar 12 04:20:18.540231 kubelet[2658]: E0312 04:20:18.539545 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.540352 kubelet[2658]: E0312 04:20:18.539568 2658 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-nx574" Mar 12 04:20:18.540352 kubelet[2658]: E0312 04:20:18.539580 2658 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-nx574" Mar 12 04:20:18.540352 kubelet[2658]: E0312 04:20:18.539604 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-nx574_calico-system(8dd5d5e8-c785-4698-ab0f-5124958e0b67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-nx574_calico-system(8dd5d5e8-c785-4698-ab0f-5124958e0b67)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-nx574" podUID="8dd5d5e8-c785-4698-ab0f-5124958e0b67" Mar 12 04:20:18.540468 kubelet[2658]: E0312 04:20:18.539637 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.540468 kubelet[2658]: E0312 04:20:18.539653 2658 
kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5dd89df469-vlctc" Mar 12 04:20:18.540468 kubelet[2658]: E0312 04:20:18.539664 2658 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5dd89df469-vlctc" Mar 12 04:20:18.540556 kubelet[2658]: E0312 04:20:18.539694 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5dd89df469-vlctc_calico-system(52afdedb-78fc-4262-a140-f3e2656f3c8e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5dd89df469-vlctc_calico-system(52afdedb-78fc-4262-a140-f3e2656f3c8e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5dd89df469-vlctc" podUID="52afdedb-78fc-4262-a140-f3e2656f3c8e" Mar 12 04:20:18.540556 kubelet[2658]: E0312 04:20:18.539730 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.540556 kubelet[2658]: E0312 04:20:18.539745 2658 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-696fcdffc9-lx2sm" Mar 12 04:20:18.540672 kubelet[2658]: E0312 04:20:18.539758 2658 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-696fcdffc9-lx2sm" Mar 12 04:20:18.540672 kubelet[2658]: E0312 04:20:18.539785 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-696fcdffc9-lx2sm_calico-system(ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-696fcdffc9-lx2sm_calico-system(ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-696fcdffc9-lx2sm" 
podUID="ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41" Mar 12 04:20:18.541472 containerd[1506]: time="2026-03-12T04:20:18.541449260Z" level=info msg="StartContainer for \"4ce2849586e6617afd34cb9fd55caba8203158182a0023fe6bc4b55728bd933f\"" Mar 12 04:20:18.543018 containerd[1506]: time="2026-03-12T04:20:18.542906342Z" level=error msg="Failed to destroy network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.543336 containerd[1506]: time="2026-03-12T04:20:18.543306419Z" level=error msg="encountered an error cleaning up failed sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.543469 containerd[1506]: time="2026-03-12T04:20:18.543420139Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cjtcm,Uid:f398c48c-ad6a-4dd1-9a27-01ce627d85a6,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.543704 kubelet[2658]: E0312 04:20:18.543672 2658 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 12 04:20:18.543759 kubelet[2658]: E0312 04:20:18.543718 2658 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-cjtcm" Mar 12 04:20:18.543759 kubelet[2658]: E0312 04:20:18.543743 2658 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-cjtcm" Mar 12 04:20:18.543888 kubelet[2658]: E0312 04:20:18.543790 2658 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-cjtcm_kube-system(f398c48c-ad6a-4dd1-9a27-01ce627d85a6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-cjtcm_kube-system(f398c48c-ad6a-4dd1-9a27-01ce627d85a6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-cjtcm" podUID="f398c48c-ad6a-4dd1-9a27-01ce627d85a6" Mar 12 04:20:18.629031 systemd[1]: Started cri-containerd-4ce2849586e6617afd34cb9fd55caba8203158182a0023fe6bc4b55728bd933f.scope - libcontainer container 
4ce2849586e6617afd34cb9fd55caba8203158182a0023fe6bc4b55728bd933f. Mar 12 04:20:18.667528 containerd[1506]: time="2026-03-12T04:20:18.667457303Z" level=info msg="StartContainer for \"4ce2849586e6617afd34cb9fd55caba8203158182a0023fe6bc4b55728bd933f\" returns successfully" Mar 12 04:20:19.168735 systemd[1]: Created slice kubepods-besteffort-poda2826227_3a11_47af_9911_51d5f0de7b17.slice - libcontainer container kubepods-besteffort-poda2826227_3a11_47af_9911_51d5f0de7b17.slice. Mar 12 04:20:19.174123 containerd[1506]: time="2026-03-12T04:20:19.174091493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vmkhw,Uid:a2826227-3a11-47af-9911-51d5f0de7b17,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:19.396351 systemd-networkd[1429]: cali5d778c85830: Link UP Mar 12 04:20:19.397070 systemd-networkd[1429]: cali5d778c85830: Gained carrier Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.235 [ERROR][3758] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.268 [INFO][3758] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0 csi-node-driver- calico-system a2826227-3a11-47af-9911-51d5f0de7b17 738 0 2026-03-12 04:19:55 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com csi-node-driver-vmkhw eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5d778c85830 [] [] }} 
ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.268 [INFO][3758] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.306 [INFO][3778] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" HandleID="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Workload="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.315 [INFO][3778] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" HandleID="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Workload="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277e80), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"csi-node-driver-vmkhw", "timestamp":"2026-03-12 04:20:19.306513634 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001142c0)} Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.315 [INFO][3778] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM 
lock. Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.316 [INFO][3778] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.316 [INFO][3778] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.320 [INFO][3778] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.331 [INFO][3778] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.339 [INFO][3778] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.343 [INFO][3778] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.345 [INFO][3778] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.345 [INFO][3778] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.348 [INFO][3778] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3 Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.357 [INFO][3778] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 
handle="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.363 [INFO][3778] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.193/26] block=192.168.27.192/26 handle="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.363 [INFO][3778] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.193/26] handle="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.363 [INFO][3778] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:19.421674 containerd[1506]: 2026-03-12 04:20:19.363 [INFO][3778] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.193/26] IPv6=[] ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" HandleID="k8s-pod-network.03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Workload="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.423453 containerd[1506]: 2026-03-12 04:20:19.368 [INFO][3758] cni-plugin/k8s.go 418: Populated endpoint ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a2826227-3a11-47af-9911-51d5f0de7b17", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 55, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-vmkhw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.27.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5d778c85830", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:19.423453 containerd[1506]: 2026-03-12 04:20:19.368 [INFO][3758] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.193/32] ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.423453 containerd[1506]: 2026-03-12 04:20:19.368 [INFO][3758] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d778c85830 ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.423453 containerd[1506]: 2026-03-12 04:20:19.392 [INFO][3758] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.423453 containerd[1506]: 2026-03-12 04:20:19.392 [INFO][3758] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a2826227-3a11-47af-9911-51d5f0de7b17", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3", Pod:"csi-node-driver-vmkhw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.27.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5d778c85830", MAC:"ee:57:4a:6a:89:de", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:19.423453 containerd[1506]: 2026-03-12 04:20:19.412 [INFO][3758] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3" Namespace="calico-system" Pod="csi-node-driver-vmkhw" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-csi--node--driver--vmkhw-eth0" Mar 12 04:20:19.465355 kubelet[2658]: I0312 04:20:19.464862 2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:19.499664 containerd[1506]: time="2026-03-12T04:20:19.499609388Z" level=info msg="StopPodSandbox for \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\"" Mar 12 04:20:19.503643 containerd[1506]: time="2026-03-12T04:20:19.503601105Z" level=info msg="Ensure that sandbox ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e in task-service has been cleanup successfully" Mar 12 04:20:19.509364 kubelet[2658]: I0312 04:20:19.509013 2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:19.526043 containerd[1506]: time="2026-03-12T04:20:19.524987612Z" level=info msg="StopPodSandbox for \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\"" Mar 12 04:20:19.528256 kubelet[2658]: I0312 04:20:19.525469 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cdxj9" podStartSLOduration=3.938856935 podStartE2EDuration="24.509297082s" podCreationTimestamp="2026-03-12 04:19:55 +0000 UTC" firstStartedPulling="2026-03-12 04:19:55.963127621 +0000 UTC m=+19.942972434" lastFinishedPulling="2026-03-12 
04:20:16.533567768 +0000 UTC m=+40.513412581" observedRunningTime="2026-03-12 04:20:19.504340666 +0000 UTC m=+43.484185503" watchObservedRunningTime="2026-03-12 04:20:19.509297082 +0000 UTC m=+43.489141923" Mar 12 04:20:19.528779 containerd[1506]: time="2026-03-12T04:20:19.528745919Z" level=info msg="Ensure that sandbox 844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f in task-service has been cleanup successfully" Mar 12 04:20:19.552196 containerd[1506]: time="2026-03-12T04:20:19.536708126Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:19.552196 containerd[1506]: time="2026-03-12T04:20:19.536763129Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:19.552196 containerd[1506]: time="2026-03-12T04:20:19.536789367Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:19.552196 containerd[1506]: time="2026-03-12T04:20:19.536975320Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:19.557243 kubelet[2658]: I0312 04:20:19.554463 2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:19.563216 containerd[1506]: time="2026-03-12T04:20:19.563182649Z" level=info msg="StopPodSandbox for \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\"" Mar 12 04:20:19.563989 containerd[1506]: time="2026-03-12T04:20:19.563965701Z" level=info msg="Ensure that sandbox 8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9 in task-service has been cleanup successfully" Mar 12 04:20:19.567258 kubelet[2658]: I0312 04:20:19.566516 2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:19.567822 containerd[1506]: time="2026-03-12T04:20:19.567790894Z" level=info msg="StopPodSandbox for \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\"" Mar 12 04:20:19.570552 containerd[1506]: time="2026-03-12T04:20:19.570524712Z" level=info msg="Ensure that sandbox bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb in task-service has been cleanup successfully" Mar 12 04:20:19.571360 kubelet[2658]: I0312 04:20:19.570642 2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:19.576853 containerd[1506]: time="2026-03-12T04:20:19.576224987Z" level=info msg="StopPodSandbox for \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\"" Mar 12 04:20:19.578577 containerd[1506]: time="2026-03-12T04:20:19.578550559Z" level=info msg="Ensure that sandbox a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402 in task-service has been cleanup successfully" Mar 12 04:20:19.599423 kubelet[2658]: I0312 04:20:19.598961 
2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:19.600367 containerd[1506]: time="2026-03-12T04:20:19.600341005Z" level=info msg="StopPodSandbox for \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\"" Mar 12 04:20:19.601303 containerd[1506]: time="2026-03-12T04:20:19.601279791Z" level=info msg="Ensure that sandbox ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5 in task-service has been cleanup successfully" Mar 12 04:20:19.604026 kubelet[2658]: I0312 04:20:19.604004 2658 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:19.604467 containerd[1506]: time="2026-03-12T04:20:19.604447637Z" level=info msg="StopPodSandbox for \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\"" Mar 12 04:20:19.608813 containerd[1506]: time="2026-03-12T04:20:19.608780817Z" level=info msg="Ensure that sandbox fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9 in task-service has been cleanup successfully" Mar 12 04:20:19.626056 systemd[1]: Started cri-containerd-03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3.scope - libcontainer container 03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3. Mar 12 04:20:19.730201 systemd[1]: run-containerd-runc-k8s.io-4ce2849586e6617afd34cb9fd55caba8203158182a0023fe6bc4b55728bd933f-runc.Y4RcEs.mount: Deactivated successfully. 
Mar 12 04:20:19.771056 containerd[1506]: time="2026-03-12T04:20:19.771013533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vmkhw,Uid:a2826227-3a11-47af-9911-51d5f0de7b17,Namespace:calico-system,Attempt:0,} returns sandbox id \"03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3\"" Mar 12 04:20:19.779809 containerd[1506]: time="2026-03-12T04:20:19.779775440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.777 [INFO][3830] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.777 [INFO][3830] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" iface="eth0" netns="/var/run/netns/cni-37c25f27-b4e5-e71c-d9fb-7ceb433ccdac" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.778 [INFO][3830] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" iface="eth0" netns="/var/run/netns/cni-37c25f27-b4e5-e71c-d9fb-7ceb433ccdac" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.786 [INFO][3830] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" iface="eth0" netns="/var/run/netns/cni-37c25f27-b4e5-e71c-d9fb-7ceb433ccdac" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.786 [INFO][3830] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.786 [INFO][3830] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:19.999 [INFO][3954] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:20.001 [INFO][3954] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:20.002 [INFO][3954] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:20.018 [WARNING][3954] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:20.018 [INFO][3954] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:20.028 [INFO][3954] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.046476 containerd[1506]: 2026-03-12 04:20:20.037 [INFO][3830] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:20.049401 containerd[1506]: time="2026-03-12T04:20:20.049367053Z" level=info msg="TearDown network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\" successfully" Mar 12 04:20:20.049488 containerd[1506]: time="2026-03-12T04:20:20.049475259Z" level=info msg="StopPodSandbox for \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\" returns successfully" Mar 12 04:20:20.054334 systemd[1]: run-netns-cni\x2d37c25f27\x2db4e5\x2de71c\x2dd9fb\x2d7ceb433ccdac.mount: Deactivated successfully. Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:19.848 [INFO][3930] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:19.849 [INFO][3930] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" iface="eth0" netns="/var/run/netns/cni-96f0fa88-6b16-53b1-ec58-3a48404c2b22" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:19.849 [INFO][3930] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" iface="eth0" netns="/var/run/netns/cni-96f0fa88-6b16-53b1-ec58-3a48404c2b22" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:19.850 [INFO][3930] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" iface="eth0" netns="/var/run/netns/cni-96f0fa88-6b16-53b1-ec58-3a48404c2b22" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:19.850 [INFO][3930] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:19.851 [INFO][3930] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.036 [INFO][3963] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.036 [INFO][3963] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.036 [INFO][3963] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.051 [WARNING][3963] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.052 [INFO][3963] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.055 [INFO][3963] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.066077 containerd[1506]: 2026-03-12 04:20:20.058 [INFO][3930] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:20.071167 containerd[1506]: time="2026-03-12T04:20:20.071107395Z" level=info msg="TearDown network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\" successfully" Mar 12 04:20:20.071167 containerd[1506]: time="2026-03-12T04:20:20.071158205Z" level=info msg="StopPodSandbox for \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\" returns successfully" Mar 12 04:20:20.073324 systemd[1]: run-netns-cni\x2d96f0fa88\x2d6b16\x2d53b1\x2dec58\x2d3a48404c2b22.mount: Deactivated successfully. 
Mar 12 04:20:20.086137 containerd[1506]: time="2026-03-12T04:20:20.085932196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-llqv4,Uid:78f8c939-b01e-4526-98a1-09241895ac3e,Namespace:calico-system,Attempt:1,}" Mar 12 04:20:20.086891 containerd[1506]: time="2026-03-12T04:20:20.086866751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd89df469-vlctc,Uid:52afdedb-78fc-4262-a140-f3e2656f3c8e,Namespace:calico-system,Attempt:1,}" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:19.907 [INFO][3851] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:19.908 [INFO][3851] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" iface="eth0" netns="/var/run/netns/cni-8f2d6a44-285c-de40-1d31-09bbf3fbbc0b" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:19.909 [INFO][3851] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" iface="eth0" netns="/var/run/netns/cni-8f2d6a44-285c-de40-1d31-09bbf3fbbc0b" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:19.912 [INFO][3851] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" iface="eth0" netns="/var/run/netns/cni-8f2d6a44-285c-de40-1d31-09bbf3fbbc0b" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:19.912 [INFO][3851] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:19.912 [INFO][3851] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.076 [INFO][3973] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.076 [INFO][3973] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.076 [INFO][3973] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.084 [WARNING][3973] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.084 [INFO][3973] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.086 [INFO][3973] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.098488 containerd[1506]: 2026-03-12 04:20:20.090 [INFO][3851] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:20.103457 containerd[1506]: time="2026-03-12T04:20:20.103419869Z" level=info msg="TearDown network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\" successfully" Mar 12 04:20:20.103457 containerd[1506]: time="2026-03-12T04:20:20.103453040Z" level=info msg="StopPodSandbox for \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\" returns successfully" Mar 12 04:20:20.104114 systemd[1]: run-netns-cni\x2d8f2d6a44\x2d285c\x2dde40\x2d1d31\x2d09bbf3fbbc0b.mount: Deactivated successfully. 
Mar 12 04:20:20.105161 containerd[1506]: time="2026-03-12T04:20:20.105127292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nx574,Uid:8dd5d5e8-c785-4698-ab0f-5124958e0b67,Namespace:calico-system,Attempt:1,}" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:19.937 [INFO][3896] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:19.937 [INFO][3896] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" iface="eth0" netns="/var/run/netns/cni-f0756e95-2c6b-62c2-f169-0f77af1b48f0" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:19.938 [INFO][3896] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" iface="eth0" netns="/var/run/netns/cni-f0756e95-2c6b-62c2-f169-0f77af1b48f0" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:19.938 [INFO][3896] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" iface="eth0" netns="/var/run/netns/cni-f0756e95-2c6b-62c2-f169-0f77af1b48f0" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:19.938 [INFO][3896] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:19.938 [INFO][3896] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.123 [INFO][3980] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.125 [INFO][3980] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.125 [INFO][3980] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.139 [WARNING][3980] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.139 [INFO][3980] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.142 [INFO][3980] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.168344 containerd[1506]: 2026-03-12 04:20:20.152 [INFO][3896] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:20.170259 containerd[1506]: time="2026-03-12T04:20:20.169614106Z" level=info msg="TearDown network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\" successfully" Mar 12 04:20:20.170476 containerd[1506]: time="2026-03-12T04:20:20.170451000Z" level=info msg="StopPodSandbox for \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\" returns successfully" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.001 [INFO][3929] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.001 [INFO][3929] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" iface="eth0" netns="/var/run/netns/cni-74c57dc1-203f-8218-dc04-bfb07c6e503d" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.003 [INFO][3929] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" iface="eth0" netns="/var/run/netns/cni-74c57dc1-203f-8218-dc04-bfb07c6e503d" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.005 [INFO][3929] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" iface="eth0" netns="/var/run/netns/cni-74c57dc1-203f-8218-dc04-bfb07c6e503d" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.005 [INFO][3929] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.005 [INFO][3929] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.134 [INFO][3999] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.136 [INFO][3999] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.147 [INFO][3999] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.176 [WARNING][3999] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.176 [INFO][3999] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.180 [INFO][3999] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.196010 containerd[1506]: 2026-03-12 04:20:20.187 [INFO][3929] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:20.199031 containerd[1506]: time="2026-03-12T04:20:20.197820123Z" level=info msg="TearDown network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\" successfully" Mar 12 04:20:20.199031 containerd[1506]: time="2026-03-12T04:20:20.198250192Z" level=info msg="StopPodSandbox for \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\" returns successfully" Mar 12 04:20:20.199905 containerd[1506]: time="2026-03-12T04:20:20.199880869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dwqf2,Uid:193981ad-ae1d-40b4-8904-cc2e820c5b91,Namespace:kube-system,Attempt:1,}" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:19.972 [INFO][3895] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:19.972 [INFO][3895] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" iface="eth0" netns="/var/run/netns/cni-dbf407b2-f0eb-ba10-71b5-b77045d22380" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:19.973 [INFO][3895] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" iface="eth0" netns="/var/run/netns/cni-dbf407b2-f0eb-ba10-71b5-b77045d22380" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:19.974 [INFO][3895] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" iface="eth0" netns="/var/run/netns/cni-dbf407b2-f0eb-ba10-71b5-b77045d22380" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:19.974 [INFO][3895] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:19.974 [INFO][3895] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.148 [INFO][3991] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.148 [INFO][3991] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.189 [INFO][3991] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.200 [WARNING][3991] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.200 [INFO][3991] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.204 [INFO][3991] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.223459 containerd[1506]: 2026-03-12 04:20:20.209 [INFO][3895] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:20.224453 containerd[1506]: time="2026-03-12T04:20:20.223801395Z" level=info msg="TearDown network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\" successfully" Mar 12 04:20:20.224453 containerd[1506]: time="2026-03-12T04:20:20.223941960Z" level=info msg="StopPodSandbox for \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\" returns successfully" Mar 12 04:20:20.228468 containerd[1506]: time="2026-03-12T04:20:20.228130055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-pfllz,Uid:6f24fc37-2e2e-4c28-be28-717ae5582f31,Namespace:calico-system,Attempt:1,}" Mar 12 04:20:20.228764 kubelet[2658]: I0312 04:20:20.228735 2658 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-nginx-config\") pod \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\" (UID: 
\"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " Mar 12 04:20:20.228929 kubelet[2658]: I0312 04:20:20.228909 2658 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-backend-key-pair\") pod \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " Mar 12 04:20:20.229269 kubelet[2658]: I0312 04:20:20.229247 2658 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9gg4\" (UniqueName: \"kubernetes.io/projected/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-kube-api-access-r9gg4\") pod \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " Mar 12 04:20:20.231300 kubelet[2658]: I0312 04:20:20.231269 2658 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-ca-bundle\") pod \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\" (UID: \"ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41\") " Mar 12 04:20:20.242518 kubelet[2658]: I0312 04:20:20.238349 2658 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41" (UID: "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 04:20:20.242518 kubelet[2658]: I0312 04:20:20.242104 2658 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41" (UID: "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 04:20:20.254185 kubelet[2658]: I0312 04:20:20.253982 2658 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-kube-api-access-r9gg4" (OuterVolumeSpecName: "kube-api-access-r9gg4") pod "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41" (UID: "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41"). InnerVolumeSpecName "kube-api-access-r9gg4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:19.973 [INFO][3931] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:19.974 [INFO][3931] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" iface="eth0" netns="/var/run/netns/cni-05f22843-923b-508f-95d4-640bb1e7f623" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:19.974 [INFO][3931] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" iface="eth0" netns="/var/run/netns/cni-05f22843-923b-508f-95d4-640bb1e7f623" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:19.976 [INFO][3931] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" iface="eth0" netns="/var/run/netns/cni-05f22843-923b-508f-95d4-640bb1e7f623" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:19.976 [INFO][3931] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:19.976 [INFO][3931] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.162 [INFO][3990] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.162 [INFO][3990] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.204 [INFO][3990] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.218 [WARNING][3990] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.218 [INFO][3990] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.219 [INFO][3990] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.254438 containerd[1506]: 2026-03-12 04:20:20.239 [INFO][3931] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:20.254916 kubelet[2658]: I0312 04:20:20.254653 2658 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41" (UID: "ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 04:20:20.255130 containerd[1506]: time="2026-03-12T04:20:20.255104017Z" level=info msg="TearDown network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\" successfully" Mar 12 04:20:20.255196 containerd[1506]: time="2026-03-12T04:20:20.255185115Z" level=info msg="StopPodSandbox for \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\" returns successfully" Mar 12 04:20:20.256173 containerd[1506]: time="2026-03-12T04:20:20.256092537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cjtcm,Uid:f398c48c-ad6a-4dd1-9a27-01ce627d85a6,Namespace:kube-system,Attempt:1,}" Mar 12 04:20:20.333831 kubelet[2658]: I0312 04:20:20.332421 2658 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r9gg4\" (UniqueName: \"kubernetes.io/projected/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-kube-api-access-r9gg4\") on node \"srv-tymtb.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:20:20.333831 kubelet[2658]: I0312 04:20:20.332461 2658 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-ca-bundle\") on node \"srv-tymtb.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:20:20.333831 kubelet[2658]: I0312 04:20:20.332474 2658 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-nginx-config\") on node \"srv-tymtb.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:20:20.333831 kubelet[2658]: I0312 04:20:20.332486 2658 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41-whisker-backend-key-pair\") on node \"srv-tymtb.gb1.brightbox.com\" DevicePath \"\"" Mar 12 04:20:20.662140 systemd[1]: Removed slice 
kubepods-besteffort-podac37d1e3_3df4_4ad2_8cbb_00ba6d1d5e41.slice - libcontainer container kubepods-besteffort-podac37d1e3_3df4_4ad2_8cbb_00ba6d1d5e41.slice. Mar 12 04:20:20.677444 systemd-networkd[1429]: calif176bc62eb1: Link UP Mar 12 04:20:20.679436 systemd-networkd[1429]: calif176bc62eb1: Gained carrier Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.203 [ERROR][4023] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.270 [INFO][4023] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0 calico-kube-controllers-5dd89df469- calico-system 52afdedb-78fc-4262-a140-f3e2656f3c8e 930 0 2026-03-12 04:19:55 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5dd89df469 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com calico-kube-controllers-5dd89df469-vlctc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif176bc62eb1 [] [] }} ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.271 [INFO][4023] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" 
WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.485 [INFO][4082] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" HandleID="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.507 [INFO][4082] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" HandleID="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ed570), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"calico-kube-controllers-5dd89df469-vlctc", "timestamp":"2026-03-12 04:20:20.485483267 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00038cc60)} Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.508 [INFO][4082] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.508 [INFO][4082] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.508 [INFO][4082] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.516 [INFO][4082] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.528 [INFO][4082] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.552 [INFO][4082] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.561 [INFO][4082] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.576 [INFO][4082] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.576 [INFO][4082] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.581 [INFO][4082] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4 Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.593 [INFO][4082] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.614 [INFO][4082] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.194/26] block=192.168.27.192/26 handle="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.614 [INFO][4082] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.194/26] handle="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.614 [INFO][4082] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.727995 containerd[1506]: 2026-03-12 04:20:20.614 [INFO][4082] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.194/26] IPv6=[] ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" HandleID="k8s-pod-network.b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.734716 containerd[1506]: 2026-03-12 04:20:20.636 [INFO][4023] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0", GenerateName:"calico-kube-controllers-5dd89df469-", Namespace:"calico-system", SelfLink:"", UID:"52afdedb-78fc-4262-a140-f3e2656f3c8e", ResourceVersion:"930", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5dd89df469", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-5dd89df469-vlctc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.27.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif176bc62eb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:20.734716 containerd[1506]: 2026-03-12 04:20:20.636 [INFO][4023] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.194/32] ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.734716 containerd[1506]: 2026-03-12 04:20:20.636 [INFO][4023] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif176bc62eb1 ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.734716 containerd[1506]: 2026-03-12 04:20:20.680 [INFO][4023] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.734716 containerd[1506]: 2026-03-12 04:20:20.681 [INFO][4023] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0", GenerateName:"calico-kube-controllers-5dd89df469-", Namespace:"calico-system", SelfLink:"", UID:"52afdedb-78fc-4262-a140-f3e2656f3c8e", ResourceVersion:"930", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5dd89df469", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4", Pod:"calico-kube-controllers-5dd89df469-vlctc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.27.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif176bc62eb1", MAC:"26:ed:c0:f0:d1:b9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:20.734716 containerd[1506]: 2026-03-12 04:20:20.718 [INFO][4023] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4" Namespace="calico-system" Pod="calico-kube-controllers-5dd89df469-vlctc" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:20.747351 systemd[1]: run-netns-cni\x2ddbf407b2\x2df0eb\x2dba10\x2d71b5\x2db77045d22380.mount: Deactivated successfully. Mar 12 04:20:20.749681 systemd[1]: run-netns-cni\x2d74c57dc1\x2d203f\x2d8218\x2ddc04\x2dbfb07c6e503d.mount: Deactivated successfully. Mar 12 04:20:20.749751 systemd[1]: run-netns-cni\x2df0756e95\x2d2c6b\x2d62c2\x2df169\x2d0f77af1b48f0.mount: Deactivated successfully. Mar 12 04:20:20.749806 systemd[1]: run-netns-cni\x2d05f22843\x2d923b\x2d508f\x2d95d4\x2d640bb1e7f623.mount: Deactivated successfully. Mar 12 04:20:20.749886 systemd[1]: var-lib-kubelet-pods-ac37d1e3\x2d3df4\x2d4ad2\x2d8cbb\x2d00ba6d1d5e41-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dr9gg4.mount: Deactivated successfully. Mar 12 04:20:20.749957 systemd[1]: var-lib-kubelet-pods-ac37d1e3\x2d3df4\x2d4ad2\x2d8cbb\x2d00ba6d1d5e41-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 12 04:20:20.858475 systemd-networkd[1429]: calib3df51d4be5: Link UP Mar 12 04:20:20.858729 systemd-networkd[1429]: calib3df51d4be5: Gained carrier Mar 12 04:20:20.900537 containerd[1506]: time="2026-03-12T04:20:20.900254352Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:20.900537 containerd[1506]: time="2026-03-12T04:20:20.900322428Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:20.900537 containerd[1506]: time="2026-03-12T04:20:20.900340049Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:20.902863 containerd[1506]: time="2026-03-12T04:20:20.901072383Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:20.929519 systemd[1]: Created slice kubepods-besteffort-podab7c989d_0e78_45d0_9b17_74ab0f400fc5.slice - libcontainer container kubepods-besteffort-podab7c989d_0e78_45d0_9b17_74ab0f400fc5.slice. Mar 12 04:20:20.937466 kubelet[2658]: I0312 04:20:20.936987 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab7c989d-0e78-45d0-9b17-74ab0f400fc5-whisker-ca-bundle\") pod \"whisker-6f6f86d876-6cvcx\" (UID: \"ab7c989d-0e78-45d0-9b17-74ab0f400fc5\") " pod="calico-system/whisker-6f6f86d876-6cvcx" Mar 12 04:20:20.937466 kubelet[2658]: I0312 04:20:20.937052 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ab7c989d-0e78-45d0-9b17-74ab0f400fc5-whisker-backend-key-pair\") pod \"whisker-6f6f86d876-6cvcx\" (UID: \"ab7c989d-0e78-45d0-9b17-74ab0f400fc5\") " pod="calico-system/whisker-6f6f86d876-6cvcx" Mar 12 04:20:20.937466 kubelet[2658]: I0312 04:20:20.937073 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/ab7c989d-0e78-45d0-9b17-74ab0f400fc5-nginx-config\") pod 
\"whisker-6f6f86d876-6cvcx\" (UID: \"ab7c989d-0e78-45d0-9b17-74ab0f400fc5\") " pod="calico-system/whisker-6f6f86d876-6cvcx" Mar 12 04:20:20.937466 kubelet[2658]: I0312 04:20:20.937090 2658 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rq6f\" (UniqueName: \"kubernetes.io/projected/ab7c989d-0e78-45d0-9b17-74ab0f400fc5-kube-api-access-8rq6f\") pod \"whisker-6f6f86d876-6cvcx\" (UID: \"ab7c989d-0e78-45d0-9b17-74ab0f400fc5\") " pod="calico-system/whisker-6f6f86d876-6cvcx" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.268 [ERROR][4036] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.320 [INFO][4036] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0 goldmane-5b85766d88- calico-system 8dd5d5e8-c785-4698-ab0f-5124958e0b67 932 0 2026-03-12 04:19:54 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com goldmane-5b85766d88-nx574 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib3df51d4be5 [] [] }} ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.321 [INFO][4036] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" 
Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.561 [INFO][4111] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" HandleID="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.576 [INFO][4111] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" HandleID="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e4760), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"goldmane-5b85766d88-nx574", "timestamp":"2026-03-12 04:20:20.561595411 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000f4dc0)} Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.576 [INFO][4111] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.623 [INFO][4111] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.623 [INFO][4111] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.629 [INFO][4111] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.644 [INFO][4111] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.720 [INFO][4111] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.745 [INFO][4111] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.766 [INFO][4111] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.766 [INFO][4111] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.772 [INFO][4111] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.801 [INFO][4111] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.832 [INFO][4111] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.195/26] block=192.168.27.192/26 handle="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.832 [INFO][4111] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.195/26] handle="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.832 [INFO][4111] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:20.979728 containerd[1506]: 2026-03-12 04:20:20.832 [INFO][4111] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.195/26] IPv6=[] ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" HandleID="k8s-pod-network.5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.982569 containerd[1506]: 2026-03-12 04:20:20.853 [INFO][4036] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"8dd5d5e8-c785-4698-ab0f-5124958e0b67", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", 
"pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-5b85766d88-nx574", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.27.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3df51d4be5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:20.982569 containerd[1506]: 2026-03-12 04:20:20.854 [INFO][4036] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.195/32] ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.982569 containerd[1506]: 2026-03-12 04:20:20.854 [INFO][4036] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3df51d4be5 ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.982569 containerd[1506]: 2026-03-12 04:20:20.860 [INFO][4036] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:20.982569 containerd[1506]: 
2026-03-12 04:20:20.861 [INFO][4036] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"8dd5d5e8-c785-4698-ab0f-5124958e0b67", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb", Pod:"goldmane-5b85766d88-nx574", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.27.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3df51d4be5", MAC:"ae:c2:cb:c0:33:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:20.982569 containerd[1506]: 2026-03-12 04:20:20.968 [INFO][4036] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb" Namespace="calico-system" Pod="goldmane-5b85766d88-nx574" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:21.000033 systemd[1]: Started cri-containerd-b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4.scope - libcontainer container b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4. Mar 12 04:20:21.083010 systemd-networkd[1429]: cali5d778c85830: Gained IPv6LL Mar 12 04:20:21.095981 containerd[1506]: time="2026-03-12T04:20:21.095883049Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:21.096117 containerd[1506]: time="2026-03-12T04:20:21.095961484Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:21.096117 containerd[1506]: time="2026-03-12T04:20:21.095974210Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.096117 containerd[1506]: time="2026-03-12T04:20:21.096088153Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.134036 systemd[1]: Started cri-containerd-5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb.scope - libcontainer container 5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb. 
Mar 12 04:20:21.136696 systemd-networkd[1429]: cali0de2a8c1690: Link UP Mar 12 04:20:21.147606 systemd-networkd[1429]: cali0de2a8c1690: Gained carrier Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.380 [ERROR][4075] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.428 [INFO][4075] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0 coredns-674b8bbfcf- kube-system f398c48c-ad6a-4dd1-9a27-01ce627d85a6 935 0 2026-03-12 04:19:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com coredns-674b8bbfcf-cjtcm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0de2a8c1690 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.428 [INFO][4075] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.691 [INFO][4157] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" 
HandleID="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.703 [INFO][4157] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" HandleID="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000125a70), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-cjtcm", "timestamp":"2026-03-12 04:20:20.691606745 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004a8580)} Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.703 [INFO][4157] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.834 [INFO][4157] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.834 [INFO][4157] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.847 [INFO][4157] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:20.971 [INFO][4157] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.005 [INFO][4157] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.015 [INFO][4157] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.039 [INFO][4157] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.039 [INFO][4157] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.057 [INFO][4157] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317 Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.074 [INFO][4157] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.111 [INFO][4157] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.196/26] block=192.168.27.192/26 handle="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.112 [INFO][4157] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.196/26] handle="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.113 [INFO][4157] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:21.179931 containerd[1506]: 2026-03-12 04:20:21.113 [INFO][4157] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.196/26] IPv6=[] ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" HandleID="k8s-pod-network.1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.180667 containerd[1506]: 2026-03-12 04:20:21.121 [INFO][4075] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"f398c48c-ad6a-4dd1-9a27-01ce627d85a6", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-cjtcm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0de2a8c1690", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.180667 containerd[1506]: 2026-03-12 04:20:21.121 [INFO][4075] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.196/32] ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.180667 containerd[1506]: 2026-03-12 04:20:21.121 [INFO][4075] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0de2a8c1690 ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.180667 containerd[1506]: 
2026-03-12 04:20:21.156 [INFO][4075] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.180667 containerd[1506]: 2026-03-12 04:20:21.161 [INFO][4075] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"f398c48c-ad6a-4dd1-9a27-01ce627d85a6", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317", Pod:"coredns-674b8bbfcf-cjtcm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali0de2a8c1690", MAC:"8e:f8:8d:af:b4:e8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.180667 containerd[1506]: 2026-03-12 04:20:21.176 [INFO][4075] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317" Namespace="kube-system" Pod="coredns-674b8bbfcf-cjtcm" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:21.238232 systemd-networkd[1429]: cali9c55416e10c: Link UP Mar 12 04:20:21.245714 containerd[1506]: time="2026-03-12T04:20:21.237724227Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:21.245714 containerd[1506]: time="2026-03-12T04:20:21.240128656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:21.245714 containerd[1506]: time="2026-03-12T04:20:21.240145209Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.245714 containerd[1506]: time="2026-03-12T04:20:21.240259227Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.249163 systemd-networkd[1429]: cali9c55416e10c: Gained carrier Mar 12 04:20:21.256414 containerd[1506]: time="2026-03-12T04:20:21.256030950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f6f86d876-6cvcx,Uid:ab7c989d-0e78-45d0-9b17-74ab0f400fc5,Namespace:calico-system,Attempt:0,}" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:20.342 [ERROR][4011] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:20.396 [INFO][4011] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0 calico-apiserver-55c7dc45f- calico-system 78f8c939-b01e-4526-98a1-09241895ac3e 928 0 2026-03-12 04:19:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:55c7dc45f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com calico-apiserver-55c7dc45f-llqv4 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali9c55416e10c [] [] }} ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:20.396 [INFO][4011] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" 
WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:20.599 [INFO][4149] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" HandleID="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:20.707 [INFO][4149] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" HandleID="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b89e0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"calico-apiserver-55c7dc45f-llqv4", "timestamp":"2026-03-12 04:20:20.599763917 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002ed340)} Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:20.709 [INFO][4149] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.113 [INFO][4149] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.113 [INFO][4149] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.127 [INFO][4149] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.159 [INFO][4149] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.168 [INFO][4149] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.172 [INFO][4149] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.179 [INFO][4149] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.179 [INFO][4149] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.183 [INFO][4149] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.206 [INFO][4149] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.222 [INFO][4149] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.197/26] block=192.168.27.192/26 handle="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.222 [INFO][4149] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.197/26] handle="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.222 [INFO][4149] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:21.301253 containerd[1506]: 2026-03-12 04:20:21.222 [INFO][4149] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.197/26] IPv6=[] ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" HandleID="k8s-pod-network.eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.301047 systemd[1]: Started cri-containerd-1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317.scope - libcontainer container 1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317. 
Mar 12 04:20:21.303710 containerd[1506]: 2026-03-12 04:20:21.228 [INFO][4011] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"78f8c939-b01e-4526-98a1-09241895ac3e", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-55c7dc45f-llqv4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali9c55416e10c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.303710 containerd[1506]: 2026-03-12 04:20:21.232 [INFO][4011] cni-plugin/k8s.go 419: Calico CNI using IPs: 
[192.168.27.197/32] ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.303710 containerd[1506]: 2026-03-12 04:20:21.233 [INFO][4011] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c55416e10c ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.303710 containerd[1506]: 2026-03-12 04:20:21.251 [INFO][4011] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.303710 containerd[1506]: 2026-03-12 04:20:21.252 [INFO][4011] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"78f8c939-b01e-4526-98a1-09241895ac3e", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd", Pod:"calico-apiserver-55c7dc45f-llqv4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali9c55416e10c", MAC:"b6:19:07:5e:fe:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.303710 containerd[1506]: 2026-03-12 04:20:21.283 [INFO][4011] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-llqv4" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:21.385495 systemd-networkd[1429]: cali0f69ceb4be9: Link UP Mar 12 04:20:21.386695 systemd-networkd[1429]: cali0f69ceb4be9: Gained carrier Mar 12 04:20:21.389368 containerd[1506]: time="2026-03-12T04:20:21.388711841Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:21.389368 containerd[1506]: time="2026-03-12T04:20:21.388795373Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:21.389368 containerd[1506]: time="2026-03-12T04:20:21.388812734Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.389550 containerd[1506]: time="2026-03-12T04:20:21.389057447Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.438063 systemd[1]: Started cri-containerd-eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd.scope - libcontainer container eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd. Mar 12 04:20:21.446709 containerd[1506]: time="2026-03-12T04:20:21.446337994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-cjtcm,Uid:f398c48c-ad6a-4dd1-9a27-01ce627d85a6,Namespace:kube-system,Attempt:1,} returns sandbox id \"1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317\"" Mar 12 04:20:21.471260 containerd[1506]: time="2026-03-12T04:20:21.471221294Z" level=info msg="CreateContainer within sandbox \"1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:20.369 [ERROR][4065] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:20.394 [INFO][4065] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0 calico-apiserver-55c7dc45f- calico-system 6f24fc37-2e2e-4c28-be28-717ae5582f31 936 0 2026-03-12 04:19:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:55c7dc45f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com calico-apiserver-55c7dc45f-pfllz eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali0f69ceb4be9 [] [] }} ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:20.394 [INFO][4065] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:20.651 [INFO][4138] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" HandleID="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:20.765 [INFO][4138] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" HandleID="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00036c8c0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"calico-apiserver-55c7dc45f-pfllz", "timestamp":"2026-03-12 04:20:20.651213326 +0000 
UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004ec160)} Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:20.765 [INFO][4138] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.223 [INFO][4138] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.223 [INFO][4138] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.229 [INFO][4138] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.265 [INFO][4138] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.286 [INFO][4138] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.297 [INFO][4138] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.312 [INFO][4138] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.312 [INFO][4138] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" 
host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.317 [INFO][4138] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.337 [INFO][4138] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.354 [INFO][4138] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.198/26] block=192.168.27.192/26 handle="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.355 [INFO][4138] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.198/26] handle="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.356 [INFO][4138] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 12 04:20:21.516321 containerd[1506]: 2026-03-12 04:20:21.356 [INFO][4138] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.198/26] IPv6=[] ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" HandleID="k8s-pod-network.d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.518008 containerd[1506]: 2026-03-12 04:20:21.380 [INFO][4065] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"6f24fc37-2e2e-4c28-be28-717ae5582f31", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-55c7dc45f-pfllz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.27.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0f69ceb4be9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.518008 containerd[1506]: 2026-03-12 04:20:21.380 [INFO][4065] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.198/32] ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.518008 containerd[1506]: 2026-03-12 04:20:21.380 [INFO][4065] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f69ceb4be9 ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.518008 containerd[1506]: 2026-03-12 04:20:21.387 [INFO][4065] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.518008 containerd[1506]: 2026-03-12 04:20:21.413 [INFO][4065] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"6f24fc37-2e2e-4c28-be28-717ae5582f31", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb", Pod:"calico-apiserver-55c7dc45f-pfllz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0f69ceb4be9", MAC:"4e:0f:94:50:67:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.518008 containerd[1506]: 2026-03-12 04:20:21.463 [INFO][4065] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb" Namespace="calico-system" Pod="calico-apiserver-55c7dc45f-pfllz" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:21.535597 systemd-networkd[1429]: cali44fab4f9d33: Link UP Mar 12 04:20:21.535874 
systemd-networkd[1429]: cali44fab4f9d33: Gained carrier Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:20.453 [ERROR][4051] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:20.494 [INFO][4051] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0 coredns-674b8bbfcf- kube-system 193981ad-ae1d-40b4-8904-cc2e820c5b91 937 0 2026-03-12 04:19:41 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com coredns-674b8bbfcf-dwqf2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali44fab4f9d33 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:20.495 [INFO][4051] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:20.908 [INFO][4175] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" HandleID="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" 
Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:20.972 [INFO][4175] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" HandleID="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e630), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-dwqf2", "timestamp":"2026-03-12 04:20:20.908542378 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000300580)} Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:20.972 [INFO][4175] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.357 [INFO][4175] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.357 [INFO][4175] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.375 [INFO][4175] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.425 [INFO][4175] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.472 [INFO][4175] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.478 [INFO][4175] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.488 [INFO][4175] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.488 [INFO][4175] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.491 [INFO][4175] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.501 [INFO][4175] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.525 [INFO][4175] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.199/26] block=192.168.27.192/26 handle="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.525 [INFO][4175] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.199/26] handle="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.525 [INFO][4175] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:21.558325 containerd[1506]: 2026-03-12 04:20:21.525 [INFO][4175] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.199/26] IPv6=[] ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" HandleID="k8s-pod-network.f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.559116 containerd[1506]: 2026-03-12 04:20:21.529 [INFO][4051] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"193981ad-ae1d-40b4-8904-cc2e820c5b91", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-dwqf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44fab4f9d33", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.559116 containerd[1506]: 2026-03-12 04:20:21.529 [INFO][4051] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.199/32] ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.559116 containerd[1506]: 2026-03-12 04:20:21.529 [INFO][4051] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44fab4f9d33 ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.559116 containerd[1506]: 
2026-03-12 04:20:21.534 [INFO][4051] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.559116 containerd[1506]: 2026-03-12 04:20:21.535 [INFO][4051] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"193981ad-ae1d-40b4-8904-cc2e820c5b91", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe", Pod:"coredns-674b8bbfcf-dwqf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali44fab4f9d33", MAC:"ae:ba:68:83:44:85", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.559116 containerd[1506]: 2026-03-12 04:20:21.549 [INFO][4051] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dwqf2" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:21.579190 containerd[1506]: time="2026-03-12T04:20:21.579140055Z" level=info msg="CreateContainer within sandbox \"1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b3ff0caf51533e408d0c7a233f33dbc9a539e6e797836a68ce5a84b1974c6bd4\"" Mar 12 04:20:21.581632 containerd[1506]: time="2026-03-12T04:20:21.581091539Z" level=info msg="StartContainer for \"b3ff0caf51533e408d0c7a233f33dbc9a539e6e797836a68ce5a84b1974c6bd4\"" Mar 12 04:20:21.700535 containerd[1506]: time="2026-03-12T04:20:21.700210732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5dd89df469-vlctc,Uid:52afdedb-78fc-4262-a140-f3e2656f3c8e,Namespace:calico-system,Attempt:1,} returns sandbox id \"b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4\"" Mar 12 04:20:21.731040 systemd[1]: Started cri-containerd-b3ff0caf51533e408d0c7a233f33dbc9a539e6e797836a68ce5a84b1974c6bd4.scope - libcontainer container 
b3ff0caf51533e408d0c7a233f33dbc9a539e6e797836a68ce5a84b1974c6bd4. Mar 12 04:20:21.777389 containerd[1506]: time="2026-03-12T04:20:21.775717442Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:21.777389 containerd[1506]: time="2026-03-12T04:20:21.775816310Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:21.777389 containerd[1506]: time="2026-03-12T04:20:21.775834500Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.777389 containerd[1506]: time="2026-03-12T04:20:21.775960174Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.808635 containerd[1506]: time="2026-03-12T04:20:21.808216566Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:21.810353 containerd[1506]: time="2026-03-12T04:20:21.808284125Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:21.810353 containerd[1506]: time="2026-03-12T04:20:21.809565436Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.810353 containerd[1506]: time="2026-03-12T04:20:21.809689846Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:21.850438 containerd[1506]: time="2026-03-12T04:20:21.849565125Z" level=info msg="StartContainer for \"b3ff0caf51533e408d0c7a233f33dbc9a539e6e797836a68ce5a84b1974c6bd4\" returns successfully" Mar 12 04:20:21.885567 systemd-networkd[1429]: cali3eda39f980a: Link UP Mar 12 04:20:21.887134 systemd-networkd[1429]: cali3eda39f980a: Gained carrier Mar 12 04:20:21.933974 systemd[1]: run-containerd-runc-k8s.io-d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb-runc.vQzrkX.mount: Deactivated successfully. Mar 12 04:20:21.947032 systemd[1]: Started cri-containerd-d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb.scope - libcontainer container d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb. Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.361 [ERROR][4342] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.464 [INFO][4342] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0 whisker-6f6f86d876- calico-system ab7c989d-0e78-45d0-9b17-74ab0f400fc5 958 0 2026-03-12 04:20:20 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6f6f86d876 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-tymtb.gb1.brightbox.com whisker-6f6f86d876-6cvcx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3eda39f980a [] [] }} ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" 
WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.464 [INFO][4342] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.662 [INFO][4419] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" HandleID="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.703 [INFO][4419] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" HandleID="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000380150), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-tymtb.gb1.brightbox.com", "pod":"whisker-6f6f86d876-6cvcx", "timestamp":"2026-03-12 04:20:21.662642449 +0000 UTC"}, Hostname:"srv-tymtb.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002e4000)} Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.703 [INFO][4419] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.704 [INFO][4419] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.706 [INFO][4419] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-tymtb.gb1.brightbox.com' Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.741 [INFO][4419] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.777 [INFO][4419] ipam/ipam.go 409: Looking up existing affinities for host host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.801 [INFO][4419] ipam/ipam.go 526: Trying affinity for 192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.806 [INFO][4419] ipam/ipam.go 160: Attempting to load block cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.811 [INFO][4419] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.811 [INFO][4419] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.819 [INFO][4419] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.833 [INFO][4419] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.27.192/26 
handle="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.863 [INFO][4419] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.27.200/26] block=192.168.27.192/26 handle="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.863 [INFO][4419] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.27.200/26] handle="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" host="srv-tymtb.gb1.brightbox.com" Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.863 [INFO][4419] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:21.948203 containerd[1506]: 2026-03-12 04:20:21.863 [INFO][4419] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.27.200/26] IPv6=[] ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" HandleID="k8s-pod-network.0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.953393 containerd[1506]: 2026-03-12 04:20:21.877 [INFO][4342] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0", GenerateName:"whisker-6f6f86d876-", Namespace:"calico-system", SelfLink:"", UID:"ab7c989d-0e78-45d0-9b17-74ab0f400fc5", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 
4, 20, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f6f86d876", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"", Pod:"whisker-6f6f86d876-6cvcx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.27.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3eda39f980a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.953393 containerd[1506]: 2026-03-12 04:20:21.878 [INFO][4342] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.27.200/32] ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.953393 containerd[1506]: 2026-03-12 04:20:21.878 [INFO][4342] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3eda39f980a ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.953393 containerd[1506]: 2026-03-12 04:20:21.888 [INFO][4342] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" 
Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.953393 containerd[1506]: 2026-03-12 04:20:21.889 [INFO][4342] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0", GenerateName:"whisker-6f6f86d876-", Namespace:"calico-system", SelfLink:"", UID:"ab7c989d-0e78-45d0-9b17-74ab0f400fc5", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 20, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6f6f86d876", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d", Pod:"whisker-6f6f86d876-6cvcx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.27.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3eda39f980a", MAC:"3e:ba:5d:56:b3:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), 
QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:21.953393 containerd[1506]: 2026-03-12 04:20:21.920 [INFO][4342] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d" Namespace="calico-system" Pod="whisker-6f6f86d876-6cvcx" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--6f6f86d876--6cvcx-eth0" Mar 12 04:20:21.986202 systemd[1]: Started cri-containerd-f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe.scope - libcontainer container f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe. Mar 12 04:20:22.007872 containerd[1506]: time="2026-03-12T04:20:22.006979418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-nx574,Uid:8dd5d5e8-c785-4698-ab0f-5124958e0b67,Namespace:calico-system,Attempt:1,} returns sandbox id \"5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb\"" Mar 12 04:20:22.055478 containerd[1506]: time="2026-03-12T04:20:22.055123685Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 12 04:20:22.055478 containerd[1506]: time="2026-03-12T04:20:22.055186248Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 12 04:20:22.057733 containerd[1506]: time="2026-03-12T04:20:22.055197825Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:22.057733 containerd[1506]: time="2026-03-12T04:20:22.055716933Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 12 04:20:22.100054 kubelet[2658]: I0312 04:20:22.099815 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:20:22.101474 systemd[1]: Started cri-containerd-0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d.scope - libcontainer container 0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d. Mar 12 04:20:22.166205 kubelet[2658]: I0312 04:20:22.164728 2658 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41" path="/var/lib/kubelet/pods/ac37d1e3-3df4-4ad2-8cbb-00ba6d1d5e41/volumes" Mar 12 04:20:22.172140 containerd[1506]: time="2026-03-12T04:20:22.172091174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dwqf2,Uid:193981ad-ae1d-40b4-8904-cc2e820c5b91,Namespace:kube-system,Attempt:1,} returns sandbox id \"f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe\"" Mar 12 04:20:22.179038 containerd[1506]: time="2026-03-12T04:20:22.178997869Z" level=info msg="CreateContainer within sandbox \"f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 12 04:20:22.193741 containerd[1506]: time="2026-03-12T04:20:22.193703482Z" level=info msg="CreateContainer within sandbox \"f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0128129d3013ccea13976c613d7d3fe5e7b2701cd76b293f6d98eb2f00c71950\"" Mar 12 04:20:22.194778 containerd[1506]: time="2026-03-12T04:20:22.194747309Z" level=info msg="StartContainer for \"0128129d3013ccea13976c613d7d3fe5e7b2701cd76b293f6d98eb2f00c71950\"" Mar 12 04:20:22.320328 systemd[1]: Started cri-containerd-0128129d3013ccea13976c613d7d3fe5e7b2701cd76b293f6d98eb2f00c71950.scope - libcontainer container 0128129d3013ccea13976c613d7d3fe5e7b2701cd76b293f6d98eb2f00c71950. 
Mar 12 04:20:22.357574 containerd[1506]: time="2026-03-12T04:20:22.357071197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-llqv4,Uid:78f8c939-b01e-4526-98a1-09241895ac3e,Namespace:calico-system,Attempt:1,} returns sandbox id \"eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd\"" Mar 12 04:20:22.379056 containerd[1506]: time="2026-03-12T04:20:22.378378064Z" level=info msg="StartContainer for \"0128129d3013ccea13976c613d7d3fe5e7b2701cd76b293f6d98eb2f00c71950\" returns successfully" Mar 12 04:20:22.426443 containerd[1506]: time="2026-03-12T04:20:22.426396878Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:22.427443 containerd[1506]: time="2026-03-12T04:20:22.427399165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 12 04:20:22.428360 containerd[1506]: time="2026-03-12T04:20:22.428303952Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:22.432350 containerd[1506]: time="2026-03-12T04:20:22.432303469Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:22.433617 containerd[1506]: time="2026-03-12T04:20:22.433433737Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 2.653410688s" Mar 12 04:20:22.433617 containerd[1506]: time="2026-03-12T04:20:22.433465562Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 12 04:20:22.436284 containerd[1506]: time="2026-03-12T04:20:22.436028180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 12 04:20:22.438763 containerd[1506]: time="2026-03-12T04:20:22.438732116Z" level=info msg="CreateContainer within sandbox \"03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 12 04:20:22.471047 containerd[1506]: time="2026-03-12T04:20:22.470927501Z" level=info msg="CreateContainer within sandbox \"03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"55693222ba803278c5c96420abebe43af7ddc0a6205b4fdf5388df1aa9422f48\"" Mar 12 04:20:22.472104 containerd[1506]: time="2026-03-12T04:20:22.472072104Z" level=info msg="StartContainer for \"55693222ba803278c5c96420abebe43af7ddc0a6205b4fdf5388df1aa9422f48\"" Mar 12 04:20:22.484149 systemd-networkd[1429]: calib3df51d4be5: Gained IPv6LL Mar 12 04:20:22.560324 containerd[1506]: time="2026-03-12T04:20:22.560286426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55c7dc45f-pfllz,Uid:6f24fc37-2e2e-4c28-be28-717ae5582f31,Namespace:calico-system,Attempt:1,} returns sandbox id \"d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb\"" Mar 12 04:20:22.579548 systemd[1]: Started cri-containerd-55693222ba803278c5c96420abebe43af7ddc0a6205b4fdf5388df1aa9422f48.scope - libcontainer container 55693222ba803278c5c96420abebe43af7ddc0a6205b4fdf5388df1aa9422f48. 
Mar 12 04:20:22.589587 containerd[1506]: time="2026-03-12T04:20:22.589298914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f6f86d876-6cvcx,Uid:ab7c989d-0e78-45d0-9b17-74ab0f400fc5,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d\"" Mar 12 04:20:22.635557 containerd[1506]: time="2026-03-12T04:20:22.635515397Z" level=info msg="StartContainer for \"55693222ba803278c5c96420abebe43af7ddc0a6205b4fdf5388df1aa9422f48\" returns successfully" Mar 12 04:20:22.676753 systemd-networkd[1429]: calif176bc62eb1: Gained IPv6LL Mar 12 04:20:22.757409 kubelet[2658]: I0312 04:20:22.756400 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dwqf2" podStartSLOduration=41.756377271 podStartE2EDuration="41.756377271s" podCreationTimestamp="2026-03-12 04:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:20:22.741267576 +0000 UTC m=+46.721112403" watchObservedRunningTime="2026-03-12 04:20:22.756377271 +0000 UTC m=+46.736222098" Mar 12 04:20:22.868074 systemd-networkd[1429]: cali44fab4f9d33: Gained IPv6LL Mar 12 04:20:22.933179 systemd-networkd[1429]: cali0de2a8c1690: Gained IPv6LL Mar 12 04:20:23.060257 systemd-networkd[1429]: cali9c55416e10c: Gained IPv6LL Mar 12 04:20:23.098945 kernel: calico-node[4121]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 12 04:20:23.125073 systemd-networkd[1429]: cali0f69ceb4be9: Gained IPv6LL Mar 12 04:20:23.380077 systemd-networkd[1429]: cali3eda39f980a: Gained IPv6LL Mar 12 04:20:24.106258 systemd-networkd[1429]: vxlan.calico: Link UP Mar 12 04:20:24.106267 systemd-networkd[1429]: vxlan.calico: Gained carrier Mar 12 04:20:25.622262 systemd-networkd[1429]: vxlan.calico: Gained IPv6LL Mar 12 04:20:27.226269 containerd[1506]: time="2026-03-12T04:20:27.226143353Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:27.229230 containerd[1506]: time="2026-03-12T04:20:27.229128273Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 12 04:20:27.231145 containerd[1506]: time="2026-03-12T04:20:27.231088689Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:27.234570 containerd[1506]: time="2026-03-12T04:20:27.234452628Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:27.235421 containerd[1506]: time="2026-03-12T04:20:27.235315570Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 4.798924449s" Mar 12 04:20:27.235421 containerd[1506]: time="2026-03-12T04:20:27.235358598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 12 04:20:27.238444 containerd[1506]: time="2026-03-12T04:20:27.238410557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 12 04:20:27.289062 containerd[1506]: time="2026-03-12T04:20:27.288955989Z" level=info msg="CreateContainer within sandbox \"b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4\" for container 
&ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 12 04:20:27.347669 containerd[1506]: time="2026-03-12T04:20:27.347613928Z" level=info msg="CreateContainer within sandbox \"b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495\"" Mar 12 04:20:27.348750 containerd[1506]: time="2026-03-12T04:20:27.348671059Z" level=info msg="StartContainer for \"913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495\"" Mar 12 04:20:27.500047 systemd[1]: Started cri-containerd-913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495.scope - libcontainer container 913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495. Mar 12 04:20:27.556717 containerd[1506]: time="2026-03-12T04:20:27.556601242Z" level=info msg="StartContainer for \"913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495\" returns successfully" Mar 12 04:20:28.009481 kubelet[2658]: I0312 04:20:28.006635 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5dd89df469-vlctc" podStartSLOduration=27.478296696 podStartE2EDuration="33.006525391s" podCreationTimestamp="2026-03-12 04:19:55 +0000 UTC" firstStartedPulling="2026-03-12 04:20:21.709022413 +0000 UTC m=+45.688867225" lastFinishedPulling="2026-03-12 04:20:27.237251093 +0000 UTC m=+51.217095920" observedRunningTime="2026-03-12 04:20:27.997750069 +0000 UTC m=+51.977594884" watchObservedRunningTime="2026-03-12 04:20:28.006525391 +0000 UTC m=+51.986370204" Mar 12 04:20:28.009481 kubelet[2658]: I0312 04:20:28.007413 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-cjtcm" podStartSLOduration=47.007395225 podStartE2EDuration="47.007395225s" podCreationTimestamp="2026-03-12 04:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 04:20:22.785464607 +0000 UTC m=+46.765309444" watchObservedRunningTime="2026-03-12 04:20:28.007395225 +0000 UTC m=+51.987240039" Mar 12 04:20:31.809230 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1320995852.mount: Deactivated successfully. Mar 12 04:20:32.349750 containerd[1506]: time="2026-03-12T04:20:32.349548941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:32.361336 containerd[1506]: time="2026-03-12T04:20:32.361175053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 12 04:20:32.362283 containerd[1506]: time="2026-03-12T04:20:32.362202270Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:32.365020 containerd[1506]: time="2026-03-12T04:20:32.364972259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:32.365942 containerd[1506]: time="2026-03-12T04:20:32.365771791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 5.127319148s" Mar 12 04:20:32.365942 containerd[1506]: time="2026-03-12T04:20:32.365815930Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference 
\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 12 04:20:32.367356 containerd[1506]: time="2026-03-12T04:20:32.367335330Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 04:20:32.373714 containerd[1506]: time="2026-03-12T04:20:32.373674938Z" level=info msg="CreateContainer within sandbox \"5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 12 04:20:32.388231 containerd[1506]: time="2026-03-12T04:20:32.388194457Z" level=info msg="CreateContainer within sandbox \"5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"6fd160ebd75503cbb342c381cea5984c6d3aed6d30bc894937dff7928553e512\"" Mar 12 04:20:32.391199 containerd[1506]: time="2026-03-12T04:20:32.390048226Z" level=info msg="StartContainer for \"6fd160ebd75503cbb342c381cea5984c6d3aed6d30bc894937dff7928553e512\"" Mar 12 04:20:32.498058 systemd[1]: Started cri-containerd-6fd160ebd75503cbb342c381cea5984c6d3aed6d30bc894937dff7928553e512.scope - libcontainer container 6fd160ebd75503cbb342c381cea5984c6d3aed6d30bc894937dff7928553e512. 
Mar 12 04:20:32.560048 containerd[1506]: time="2026-03-12T04:20:32.559939480Z" level=info msg="StartContainer for \"6fd160ebd75503cbb342c381cea5984c6d3aed6d30bc894937dff7928553e512\" returns successfully" Mar 12 04:20:33.044708 kubelet[2658]: I0312 04:20:33.043863 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-nx574" podStartSLOduration=28.690615405 podStartE2EDuration="39.04380058s" podCreationTimestamp="2026-03-12 04:19:54 +0000 UTC" firstStartedPulling="2026-03-12 04:20:22.013811771 +0000 UTC m=+45.993656583" lastFinishedPulling="2026-03-12 04:20:32.366996944 +0000 UTC m=+56.346841758" observedRunningTime="2026-03-12 04:20:33.038178907 +0000 UTC m=+57.018023737" watchObservedRunningTime="2026-03-12 04:20:33.04380058 +0000 UTC m=+57.023645416" Mar 12 04:20:35.872344 containerd[1506]: time="2026-03-12T04:20:35.872192117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:35.873794 containerd[1506]: time="2026-03-12T04:20:35.873719229Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 12 04:20:35.876137 containerd[1506]: time="2026-03-12T04:20:35.875729865Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:35.899573 containerd[1506]: time="2026-03-12T04:20:35.899518675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:35.908747 containerd[1506]: time="2026-03-12T04:20:35.907719604Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id 
\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.539900246s" Mar 12 04:20:35.908747 containerd[1506]: time="2026-03-12T04:20:35.907810699Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 12 04:20:35.918223 containerd[1506]: time="2026-03-12T04:20:35.915139300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 12 04:20:35.921979 containerd[1506]: time="2026-03-12T04:20:35.921948328Z" level=info msg="CreateContainer within sandbox \"eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 04:20:35.943339 containerd[1506]: time="2026-03-12T04:20:35.943257809Z" level=info msg="CreateContainer within sandbox \"eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"87311a54a693bf86700a9eea95ff5672b550aa55483e4b37592ddf383d68e65a\"" Mar 12 04:20:35.945134 containerd[1506]: time="2026-03-12T04:20:35.944271633Z" level=info msg="StartContainer for \"87311a54a693bf86700a9eea95ff5672b550aa55483e4b37592ddf383d68e65a\"" Mar 12 04:20:35.999946 systemd[1]: Started cri-containerd-87311a54a693bf86700a9eea95ff5672b550aa55483e4b37592ddf383d68e65a.scope - libcontainer container 87311a54a693bf86700a9eea95ff5672b550aa55483e4b37592ddf383d68e65a. 
Mar 12 04:20:36.062949 containerd[1506]: time="2026-03-12T04:20:36.062823626Z" level=info msg="StartContainer for \"87311a54a693bf86700a9eea95ff5672b550aa55483e4b37592ddf383d68e65a\" returns successfully" Mar 12 04:20:36.235817 containerd[1506]: time="2026-03-12T04:20:36.235010170Z" level=info msg="StopPodSandbox for \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\"" Mar 12 04:20:36.292401 containerd[1506]: time="2026-03-12T04:20:36.292301086Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:36.295543 containerd[1506]: time="2026-03-12T04:20:36.294537245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 12 04:20:36.302089 containerd[1506]: time="2026-03-12T04:20:36.301449086Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 386.270211ms" Mar 12 04:20:36.303376 containerd[1506]: time="2026-03-12T04:20:36.302542551Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 12 04:20:36.307028 containerd[1506]: time="2026-03-12T04:20:36.307005624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 12 04:20:36.312623 containerd[1506]: time="2026-03-12T04:20:36.312386782Z" level=info msg="CreateContainer within sandbox \"d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 12 04:20:36.339417 containerd[1506]: time="2026-03-12T04:20:36.339374480Z" 
level=info msg="CreateContainer within sandbox \"d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5ba8814763600df09c69016aefd6e3381e1784eafa8b4af60e84d3a0c47286b9\"" Mar 12 04:20:36.342872 containerd[1506]: time="2026-03-12T04:20:36.340484584Z" level=info msg="StartContainer for \"5ba8814763600df09c69016aefd6e3381e1784eafa8b4af60e84d3a0c47286b9\"" Mar 12 04:20:36.413109 systemd[1]: Started cri-containerd-5ba8814763600df09c69016aefd6e3381e1784eafa8b4af60e84d3a0c47286b9.scope - libcontainer container 5ba8814763600df09c69016aefd6e3381e1784eafa8b4af60e84d3a0c47286b9. Mar 12 04:20:36.527941 containerd[1506]: time="2026-03-12T04:20:36.527837532Z" level=info msg="StartContainer for \"5ba8814763600df09c69016aefd6e3381e1784eafa8b4af60e84d3a0c47286b9\" returns successfully" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.511 [WARNING][5155] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"6f24fc37-2e2e-4c28-be28-717ae5582f31", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb", Pod:"calico-apiserver-55c7dc45f-pfllz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0f69ceb4be9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.516 [INFO][5155] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.516 [INFO][5155] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" iface="eth0" netns="" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.516 [INFO][5155] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.516 [INFO][5155] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.805 [INFO][5192] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.809 [INFO][5192] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.811 [INFO][5192] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.834 [WARNING][5192] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.835 [INFO][5192] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.837 [INFO][5192] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:36.855215 containerd[1506]: 2026-03-12 04:20:36.848 [INFO][5155] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:36.855215 containerd[1506]: time="2026-03-12T04:20:36.854761825Z" level=info msg="TearDown network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\" successfully" Mar 12 04:20:36.855215 containerd[1506]: time="2026-03-12T04:20:36.854785751Z" level=info msg="StopPodSandbox for \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\" returns successfully" Mar 12 04:20:37.089687 containerd[1506]: time="2026-03-12T04:20:37.089625016Z" level=info msg="RemovePodSandbox for \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\"" Mar 12 04:20:37.105963 containerd[1506]: time="2026-03-12T04:20:37.103191506Z" level=info msg="Forcibly stopping sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\"" Mar 12 04:20:37.134267 kubelet[2658]: I0312 04:20:37.133642 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-55c7dc45f-pfllz" 
podStartSLOduration=29.392248763 podStartE2EDuration="43.133233324s" podCreationTimestamp="2026-03-12 04:19:54 +0000 UTC" firstStartedPulling="2026-03-12 04:20:22.564745317 +0000 UTC m=+46.544590133" lastFinishedPulling="2026-03-12 04:20:36.305729882 +0000 UTC m=+60.285574694" observedRunningTime="2026-03-12 04:20:37.131080703 +0000 UTC m=+61.110925517" watchObservedRunningTime="2026-03-12 04:20:37.133233324 +0000 UTC m=+61.113078157" Mar 12 04:20:37.170700 kubelet[2658]: I0312 04:20:37.170636 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-55c7dc45f-llqv4" podStartSLOduration=29.61558161 podStartE2EDuration="43.170616861s" podCreationTimestamp="2026-03-12 04:19:54 +0000 UTC" firstStartedPulling="2026-03-12 04:20:22.35970298 +0000 UTC m=+46.339547792" lastFinishedPulling="2026-03-12 04:20:35.91473823 +0000 UTC m=+59.894583043" observedRunningTime="2026-03-12 04:20:37.16989386 +0000 UTC m=+61.149738697" watchObservedRunningTime="2026-03-12 04:20:37.170616861 +0000 UTC m=+61.150461694" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.259 [WARNING][5219] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"6f24fc37-2e2e-4c28-be28-717ae5582f31", ResourceVersion:"1066", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"d6c02bf5cae4fad0481a13ef82aca3862c3b74996bd3658a5b2511e578449cdb", Pod:"calico-apiserver-55c7dc45f-pfllz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali0f69ceb4be9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.259 [INFO][5219] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.260 [INFO][5219] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" iface="eth0" netns="" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.260 [INFO][5219] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.260 [INFO][5219] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.314 [INFO][5229] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.314 [INFO][5229] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.314 [INFO][5229] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.346 [WARNING][5229] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.346 [INFO][5229] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" HandleID="k8s-pod-network.8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--pfllz-eth0" Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.348 [INFO][5229] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:37.367896 containerd[1506]: 2026-03-12 04:20:37.358 [INFO][5219] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9" Mar 12 04:20:37.367896 containerd[1506]: time="2026-03-12T04:20:37.367400020Z" level=info msg="TearDown network for sandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\" successfully" Mar 12 04:20:37.391886 containerd[1506]: time="2026-03-12T04:20:37.391728750Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:37.430170 containerd[1506]: time="2026-03-12T04:20:37.428525372Z" level=info msg="RemovePodSandbox \"8ff30ef7a04c29aff02f90fb124ad78bf0a30d3f24ef61c4cfc66a56f52d99f9\" returns successfully" Mar 12 04:20:37.435459 containerd[1506]: time="2026-03-12T04:20:37.435421706Z" level=info msg="StopPodSandbox for \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\"" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.524 [WARNING][5244] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.524 [INFO][5244] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.525 [INFO][5244] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" iface="eth0" netns="" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.525 [INFO][5244] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.525 [INFO][5244] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.575 [INFO][5252] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.576 [INFO][5252] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.576 [INFO][5252] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.586 [WARNING][5252] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.586 [INFO][5252] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.588 [INFO][5252] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:37.594358 containerd[1506]: 2026-03-12 04:20:37.591 [INFO][5244] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.594358 containerd[1506]: time="2026-03-12T04:20:37.593893508Z" level=info msg="TearDown network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\" successfully" Mar 12 04:20:37.594358 containerd[1506]: time="2026-03-12T04:20:37.593918200Z" level=info msg="StopPodSandbox for \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\" returns successfully" Mar 12 04:20:37.595700 containerd[1506]: time="2026-03-12T04:20:37.594569419Z" level=info msg="RemovePodSandbox for \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\"" Mar 12 04:20:37.595700 containerd[1506]: time="2026-03-12T04:20:37.594594971Z" level=info msg="Forcibly stopping sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\"" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.647 [WARNING][5266] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" WorkloadEndpoint="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.647 [INFO][5266] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.648 [INFO][5266] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" iface="eth0" netns="" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.648 [INFO][5266] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.648 [INFO][5266] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.699 [INFO][5273] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.699 [INFO][5273] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.700 [INFO][5273] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.711 [WARNING][5273] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.711 [INFO][5273] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" HandleID="k8s-pod-network.bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Workload="srv--tymtb.gb1.brightbox.com-k8s-whisker--696fcdffc9--lx2sm-eth0" Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.715 [INFO][5273] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:37.721274 containerd[1506]: 2026-03-12 04:20:37.718 [INFO][5266] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb" Mar 12 04:20:37.721274 containerd[1506]: time="2026-03-12T04:20:37.721211694Z" level=info msg="TearDown network for sandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\" successfully" Mar 12 04:20:37.726060 containerd[1506]: time="2026-03-12T04:20:37.725634962Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:37.726060 containerd[1506]: time="2026-03-12T04:20:37.725744357Z" level=info msg="RemovePodSandbox \"bcb0902d61e6cdaffd96a52964b2a7420c051d4e568ee6d78c20f3cf626a90cb\" returns successfully" Mar 12 04:20:37.727935 containerd[1506]: time="2026-03-12T04:20:37.727889438Z" level=info msg="StopPodSandbox for \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\"" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.788 [WARNING][5287] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0", GenerateName:"calico-kube-controllers-5dd89df469-", Namespace:"calico-system", SelfLink:"", UID:"52afdedb-78fc-4262-a140-f3e2656f3c8e", ResourceVersion:"1033", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5dd89df469", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4", Pod:"calico-kube-controllers-5dd89df469-vlctc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.27.194/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif176bc62eb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.789 [INFO][5287] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.789 [INFO][5287] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" iface="eth0" netns="" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.789 [INFO][5287] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.789 [INFO][5287] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.868 [INFO][5294] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.869 [INFO][5294] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.869 [INFO][5294] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.893 [WARNING][5294] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.893 [INFO][5294] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.896 [INFO][5294] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:37.902796 containerd[1506]: 2026-03-12 04:20:37.899 [INFO][5287] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:37.902796 containerd[1506]: time="2026-03-12T04:20:37.901996575Z" level=info msg="TearDown network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\" successfully" Mar 12 04:20:37.902796 containerd[1506]: time="2026-03-12T04:20:37.902031676Z" level=info msg="StopPodSandbox for \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\" returns successfully" Mar 12 04:20:37.905560 containerd[1506]: time="2026-03-12T04:20:37.903378439Z" level=info msg="RemovePodSandbox for \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\"" Mar 12 04:20:37.905560 containerd[1506]: time="2026-03-12T04:20:37.903422844Z" level=info msg="Forcibly stopping sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\"" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:37.984 [WARNING][5308] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0", GenerateName:"calico-kube-controllers-5dd89df469-", Namespace:"calico-system", SelfLink:"", UID:"52afdedb-78fc-4262-a140-f3e2656f3c8e", ResourceVersion:"1033", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5dd89df469", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"b5f5ca79902a53c939ef918464fe1511de9f595bafcd8e921d95b9c621709db4", Pod:"calico-kube-controllers-5dd89df469-vlctc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.27.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif176bc62eb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:37.985 [INFO][5308] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:37.985 [INFO][5308] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" iface="eth0" netns="" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:37.985 [INFO][5308] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:37.985 [INFO][5308] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.052 [INFO][5315] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.053 [INFO][5315] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.053 [INFO][5315] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.065 [WARNING][5315] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.065 [INFO][5315] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" HandleID="k8s-pod-network.fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--kube--controllers--5dd89df469--vlctc-eth0" Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.070 [INFO][5315] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.077724 containerd[1506]: 2026-03-12 04:20:38.072 [INFO][5308] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9" Mar 12 04:20:38.079495 containerd[1506]: time="2026-03-12T04:20:38.077771362Z" level=info msg="TearDown network for sandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\" successfully" Mar 12 04:20:38.083886 containerd[1506]: time="2026-03-12T04:20:38.082949050Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:38.083886 containerd[1506]: time="2026-03-12T04:20:38.083028697Z" level=info msg="RemovePodSandbox \"fe0c1b12236601e3fc8309a23a415a361945ce3436f2d74c74b058b16fa563e9\" returns successfully" Mar 12 04:20:38.084738 containerd[1506]: time="2026-03-12T04:20:38.084696014Z" level=info msg="StopPodSandbox for \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\"" Mar 12 04:20:38.097051 kubelet[2658]: I0312 04:20:38.093712 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.157 [WARNING][5329] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"8dd5d5e8-c785-4698-ab0f-5124958e0b67", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb", Pod:"goldmane-5b85766d88-nx574", Endpoint:"eth0", ServiceAccountName:"goldmane", 
IPNetworks:[]string{"192.168.27.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3df51d4be5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.157 [INFO][5329] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.157 [INFO][5329] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" iface="eth0" netns="" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.157 [INFO][5329] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.158 [INFO][5329] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.239 [INFO][5337] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.240 [INFO][5337] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.240 [INFO][5337] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.248 [WARNING][5337] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.248 [INFO][5337] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.250 [INFO][5337] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.257236 containerd[1506]: 2026-03-12 04:20:38.255 [INFO][5329] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.260235 containerd[1506]: time="2026-03-12T04:20:38.257290836Z" level=info msg="TearDown network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\" successfully" Mar 12 04:20:38.260235 containerd[1506]: time="2026-03-12T04:20:38.257319826Z" level=info msg="StopPodSandbox for \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\" returns successfully" Mar 12 04:20:38.260235 containerd[1506]: time="2026-03-12T04:20:38.257899135Z" level=info msg="RemovePodSandbox for \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\"" Mar 12 04:20:38.260235 containerd[1506]: time="2026-03-12T04:20:38.257928117Z" level=info msg="Forcibly stopping sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\"" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.313 [WARNING][5352] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"8dd5d5e8-c785-4698-ab0f-5124958e0b67", ResourceVersion:"1046", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"5c1f18cf530410af231b0bec3188b19fad125e6ec81a13ea93c16cfce9739ccb", Pod:"goldmane-5b85766d88-nx574", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.27.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib3df51d4be5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.313 [INFO][5352] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.313 [INFO][5352] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" iface="eth0" netns="" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.313 [INFO][5352] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.313 [INFO][5352] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.364 [INFO][5360] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.365 [INFO][5360] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.365 [INFO][5360] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.378 [WARNING][5360] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.378 [INFO][5360] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" HandleID="k8s-pod-network.844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Workload="srv--tymtb.gb1.brightbox.com-k8s-goldmane--5b85766d88--nx574-eth0" Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.381 [INFO][5360] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.387305 containerd[1506]: 2026-03-12 04:20:38.383 [INFO][5352] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f" Mar 12 04:20:38.387305 containerd[1506]: time="2026-03-12T04:20:38.386541889Z" level=info msg="TearDown network for sandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\" successfully" Mar 12 04:20:38.394202 containerd[1506]: time="2026-03-12T04:20:38.393874965Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:38.394202 containerd[1506]: time="2026-03-12T04:20:38.393969478Z" level=info msg="RemovePodSandbox \"844dcf3ae2878d323b4dd7562f2d487ce3d82ea815f07f50e25d956ad7acc40f\" returns successfully" Mar 12 04:20:38.394926 containerd[1506]: time="2026-03-12T04:20:38.394901591Z" level=info msg="StopPodSandbox for \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\"" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.485 [WARNING][5377] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"f398c48c-ad6a-4dd1-9a27-01ce627d85a6", ResourceVersion:"1005", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317", Pod:"coredns-674b8bbfcf-cjtcm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0de2a8c1690", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.486 [INFO][5377] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.486 [INFO][5377] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" iface="eth0" netns="" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.486 [INFO][5377] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.486 [INFO][5377] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.547 [INFO][5384] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.547 [INFO][5384] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.547 [INFO][5384] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.557 [WARNING][5384] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.557 [INFO][5384] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.559 [INFO][5384] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.565676 containerd[1506]: 2026-03-12 04:20:38.562 [INFO][5377] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.567765 containerd[1506]: time="2026-03-12T04:20:38.565722357Z" level=info msg="TearDown network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\" successfully" Mar 12 04:20:38.567765 containerd[1506]: time="2026-03-12T04:20:38.565748205Z" level=info msg="StopPodSandbox for \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\" returns successfully" Mar 12 04:20:38.567765 containerd[1506]: time="2026-03-12T04:20:38.567082928Z" level=info msg="RemovePodSandbox for \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\"" Mar 12 04:20:38.567765 containerd[1506]: time="2026-03-12T04:20:38.567111487Z" level=info msg="Forcibly stopping sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\"" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.627 [WARNING][5398] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"f398c48c-ad6a-4dd1-9a27-01ce627d85a6", ResourceVersion:"1005", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"1e80e49a7cd2f78365d04d81c0eb1c95d627dda66cd15e3cf21eae043ec40317", Pod:"coredns-674b8bbfcf-cjtcm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0de2a8c1690", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.704025 containerd[1506]: 
2026-03-12 04:20:38.627 [INFO][5398] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.627 [INFO][5398] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" iface="eth0" netns="" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.627 [INFO][5398] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.627 [INFO][5398] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.687 [INFO][5406] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.687 [INFO][5406] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.688 [INFO][5406] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.696 [WARNING][5406] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.696 [INFO][5406] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" HandleID="k8s-pod-network.ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--cjtcm-eth0" Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.698 [INFO][5406] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.704025 containerd[1506]: 2026-03-12 04:20:38.701 [INFO][5398] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5" Mar 12 04:20:38.704948 containerd[1506]: time="2026-03-12T04:20:38.703994840Z" level=info msg="TearDown network for sandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\" successfully" Mar 12 04:20:38.709563 containerd[1506]: time="2026-03-12T04:20:38.709524905Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:38.709961 containerd[1506]: time="2026-03-12T04:20:38.709613642Z" level=info msg="RemovePodSandbox \"ef6a20d6e5485471b61cbaf65aae5d2721ad3c83a1c288e508fa9ab96d14b2e5\" returns successfully" Mar 12 04:20:38.710521 containerd[1506]: time="2026-03-12T04:20:38.710461787Z" level=info msg="StopPodSandbox for \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\"" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.764 [WARNING][5420] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"193981ad-ae1d-40b4-8904-cc2e820c5b91", ResourceVersion:"1001", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe", Pod:"coredns-674b8bbfcf-dwqf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44fab4f9d33", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.765 [INFO][5420] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.765 [INFO][5420] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" iface="eth0" netns="" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.765 [INFO][5420] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.765 [INFO][5420] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.811 [INFO][5427] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.811 [INFO][5427] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.812 [INFO][5427] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.828 [WARNING][5427] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.829 [INFO][5427] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.832 [INFO][5427] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.837405 containerd[1506]: 2026-03-12 04:20:38.834 [INFO][5420] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.837405 containerd[1506]: time="2026-03-12T04:20:38.837485883Z" level=info msg="TearDown network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\" successfully" Mar 12 04:20:38.837405 containerd[1506]: time="2026-03-12T04:20:38.837515053Z" level=info msg="StopPodSandbox for \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\" returns successfully" Mar 12 04:20:38.839836 containerd[1506]: time="2026-03-12T04:20:38.838197650Z" level=info msg="RemovePodSandbox for \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\"" Mar 12 04:20:38.839836 containerd[1506]: time="2026-03-12T04:20:38.838225732Z" level=info msg="Forcibly stopping sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\"" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.892 [WARNING][5441] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"193981ad-ae1d-40b4-8904-cc2e820c5b91", ResourceVersion:"1001", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"f4d7ca1b3272a27da7a481879cca029c15e67e0f70a6a519e60fe1b27e1426fe", Pod:"coredns-674b8bbfcf-dwqf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44fab4f9d33", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:38.950437 containerd[1506]: 
2026-03-12 04:20:38.892 [INFO][5441] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.892 [INFO][5441] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" iface="eth0" netns="" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.892 [INFO][5441] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.892 [INFO][5441] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.924 [INFO][5449] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.924 [INFO][5449] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.924 [INFO][5449] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.942 [WARNING][5449] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.942 [INFO][5449] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" HandleID="k8s-pod-network.a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Workload="srv--tymtb.gb1.brightbox.com-k8s-coredns--674b8bbfcf--dwqf2-eth0" Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.944 [INFO][5449] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:38.950437 containerd[1506]: 2026-03-12 04:20:38.947 [INFO][5441] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402" Mar 12 04:20:38.950437 containerd[1506]: time="2026-03-12T04:20:38.949150540Z" level=info msg="TearDown network for sandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\" successfully" Mar 12 04:20:38.953750 containerd[1506]: time="2026-03-12T04:20:38.953686280Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:38.953996 containerd[1506]: time="2026-03-12T04:20:38.953977653Z" level=info msg="RemovePodSandbox \"a5927a1dbedfe1264946969e808758d4ab9a2e75faa6d26b14940d2e9c6f2402\" returns successfully" Mar 12 04:20:38.955871 containerd[1506]: time="2026-03-12T04:20:38.954831838Z" level=info msg="StopPodSandbox for \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\"" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.013 [WARNING][5464] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"78f8c939-b01e-4526-98a1-09241895ac3e", ResourceVersion:"1077", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd", Pod:"calico-apiserver-55c7dc45f-llqv4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali9c55416e10c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.016 [INFO][5464] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.016 [INFO][5464] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" iface="eth0" netns="" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.017 [INFO][5464] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.017 [INFO][5464] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.071 [INFO][5471] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.071 [INFO][5471] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.071 [INFO][5471] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.082 [WARNING][5471] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.084 [INFO][5471] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.089 [INFO][5471] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:39.104150 containerd[1506]: 2026-03-12 04:20:39.097 [INFO][5464] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.111535 containerd[1506]: time="2026-03-12T04:20:39.104450217Z" level=info msg="TearDown network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\" successfully" Mar 12 04:20:39.111535 containerd[1506]: time="2026-03-12T04:20:39.104479285Z" level=info msg="StopPodSandbox for \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\" returns successfully" Mar 12 04:20:39.111535 containerd[1506]: time="2026-03-12T04:20:39.104915668Z" level=info msg="RemovePodSandbox for \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\"" Mar 12 04:20:39.111535 containerd[1506]: time="2026-03-12T04:20:39.104944288Z" level=info msg="Forcibly stopping sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\"" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.193 [WARNING][5489] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0", GenerateName:"calico-apiserver-55c7dc45f-", Namespace:"calico-system", SelfLink:"", UID:"78f8c939-b01e-4526-98a1-09241895ac3e", ResourceVersion:"1077", Generation:0, CreationTimestamp:time.Date(2026, time.March, 12, 4, 19, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55c7dc45f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-tymtb.gb1.brightbox.com", ContainerID:"eadbc8e73d725fe3fca3237976a2e8ead05cb936102caaa95654fc87873efbfd", Pod:"calico-apiserver-55c7dc45f-llqv4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali9c55416e10c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.194 [INFO][5489] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.194 [INFO][5489] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" iface="eth0" netns="" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.194 [INFO][5489] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.194 [INFO][5489] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.292 [INFO][5496] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.292 [INFO][5496] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.292 [INFO][5496] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.325 [WARNING][5496] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.325 [INFO][5496] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" HandleID="k8s-pod-network.ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Workload="srv--tymtb.gb1.brightbox.com-k8s-calico--apiserver--55c7dc45f--llqv4-eth0" Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.333 [INFO][5496] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 12 04:20:39.365861 containerd[1506]: 2026-03-12 04:20:39.338 [INFO][5489] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e" Mar 12 04:20:39.365861 containerd[1506]: time="2026-03-12T04:20:39.363492465Z" level=info msg="TearDown network for sandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\" successfully" Mar 12 04:20:39.459656 containerd[1506]: time="2026-03-12T04:20:39.459611060Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 12 04:20:39.460435 containerd[1506]: time="2026-03-12T04:20:39.459895722Z" level=info msg="RemovePodSandbox \"ab610c510bb179aa788ea561fc7fd5938b5801d7bb996fae36e9d4bf3793b47e\" returns successfully" Mar 12 04:20:39.545997 containerd[1506]: time="2026-03-12T04:20:39.545952369Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:39.553034 containerd[1506]: time="2026-03-12T04:20:39.551691665Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 12 04:20:39.555542 containerd[1506]: time="2026-03-12T04:20:39.555484210Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:39.559705 containerd[1506]: time="2026-03-12T04:20:39.558599857Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:39.559705 containerd[1506]: time="2026-03-12T04:20:39.559400728Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 3.25191543s" Mar 12 04:20:39.559705 containerd[1506]: time="2026-03-12T04:20:39.559435186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 12 04:20:39.561510 containerd[1506]: time="2026-03-12T04:20:39.561488387Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 12 04:20:39.603865 containerd[1506]: time="2026-03-12T04:20:39.602245305Z" level=info msg="CreateContainer within sandbox \"0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 12 04:20:39.622886 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3245987658.mount: Deactivated successfully. Mar 12 04:20:39.631108 containerd[1506]: time="2026-03-12T04:20:39.631067274Z" level=info msg="CreateContainer within sandbox \"0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"e7131678f0854ed49ce6fe4e056224eadbb0dfafa3bd90e78531e4b0c3187200\"" Mar 12 04:20:39.631908 containerd[1506]: time="2026-03-12T04:20:39.631875194Z" level=info msg="StartContainer for \"e7131678f0854ed49ce6fe4e056224eadbb0dfafa3bd90e78531e4b0c3187200\"" Mar 12 04:20:39.703773 systemd[1]: run-containerd-runc-k8s.io-e7131678f0854ed49ce6fe4e056224eadbb0dfafa3bd90e78531e4b0c3187200-runc.kkzEn9.mount: Deactivated successfully. Mar 12 04:20:39.739619 systemd[1]: Started cri-containerd-e7131678f0854ed49ce6fe4e056224eadbb0dfafa3bd90e78531e4b0c3187200.scope - libcontainer container e7131678f0854ed49ce6fe4e056224eadbb0dfafa3bd90e78531e4b0c3187200. 
Mar 12 04:20:39.827759 containerd[1506]: time="2026-03-12T04:20:39.827597600Z" level=info msg="StartContainer for \"e7131678f0854ed49ce6fe4e056224eadbb0dfafa3bd90e78531e4b0c3187200\" returns successfully" Mar 12 04:20:41.624010 containerd[1506]: time="2026-03-12T04:20:41.623008435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:41.625103 containerd[1506]: time="2026-03-12T04:20:41.624396398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 12 04:20:41.626494 containerd[1506]: time="2026-03-12T04:20:41.625162004Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:41.628022 containerd[1506]: time="2026-03-12T04:20:41.627389525Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:41.629509 containerd[1506]: time="2026-03-12T04:20:41.629024480Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 2.06714087s" Mar 12 04:20:41.629509 containerd[1506]: time="2026-03-12T04:20:41.629067004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 12 04:20:41.631911 containerd[1506]: 
time="2026-03-12T04:20:41.631054650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 12 04:20:41.634392 containerd[1506]: time="2026-03-12T04:20:41.634369675Z" level=info msg="CreateContainer within sandbox \"03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 12 04:20:41.656787 containerd[1506]: time="2026-03-12T04:20:41.656698482Z" level=info msg="CreateContainer within sandbox \"03a5cec701674541e8d5e7961eb4e21fad6b3b1aa16b1e2a0917cd829b80c1b3\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3afe203a7874cc143619f60c5ac000782e8943be0c82e644c1f795d783c7b174\"" Mar 12 04:20:41.657691 containerd[1506]: time="2026-03-12T04:20:41.657644123Z" level=info msg="StartContainer for \"3afe203a7874cc143619f60c5ac000782e8943be0c82e644c1f795d783c7b174\"" Mar 12 04:20:41.785549 systemd[1]: run-containerd-runc-k8s.io-3afe203a7874cc143619f60c5ac000782e8943be0c82e644c1f795d783c7b174-runc.6PF7Y4.mount: Deactivated successfully. Mar 12 04:20:41.794533 systemd[1]: Started cri-containerd-3afe203a7874cc143619f60c5ac000782e8943be0c82e644c1f795d783c7b174.scope - libcontainer container 3afe203a7874cc143619f60c5ac000782e8943be0c82e644c1f795d783c7b174. 
Mar 12 04:20:41.843649 containerd[1506]: time="2026-03-12T04:20:41.843159235Z" level=info msg="StartContainer for \"3afe203a7874cc143619f60c5ac000782e8943be0c82e644c1f795d783c7b174\" returns successfully" Mar 12 04:20:42.475154 kubelet[2658]: I0312 04:20:42.473289 2658 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 12 04:20:42.480109 kubelet[2658]: I0312 04:20:42.480013 2658 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 12 04:20:43.709782 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2890612968.mount: Deactivated successfully. Mar 12 04:20:43.729350 containerd[1506]: time="2026-03-12T04:20:43.728992659Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:43.731959 containerd[1506]: time="2026-03-12T04:20:43.730338517Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 12 04:20:43.731959 containerd[1506]: time="2026-03-12T04:20:43.731004448Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:43.739069 containerd[1506]: time="2026-03-12T04:20:43.737896798Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 12 04:20:43.739069 containerd[1506]: time="2026-03-12T04:20:43.738800082Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id 
\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.107713436s" Mar 12 04:20:43.739069 containerd[1506]: time="2026-03-12T04:20:43.738865873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 12 04:20:43.758104 containerd[1506]: time="2026-03-12T04:20:43.757985548Z" level=info msg="CreateContainer within sandbox \"0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 12 04:20:43.780425 containerd[1506]: time="2026-03-12T04:20:43.780317736Z" level=info msg="CreateContainer within sandbox \"0b966c2228b85ba48693164c33efa422481c976be1d979f5a5a1032fe73d114d\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"d0df750d0ea8bcd50d0ada37d167f0321222816f314e83d0a3062d6c04a43323\"" Mar 12 04:20:43.784060 containerd[1506]: time="2026-03-12T04:20:43.782736815Z" level=info msg="StartContainer for \"d0df750d0ea8bcd50d0ada37d167f0321222816f314e83d0a3062d6c04a43323\"" Mar 12 04:20:43.841063 systemd[1]: Started cri-containerd-d0df750d0ea8bcd50d0ada37d167f0321222816f314e83d0a3062d6c04a43323.scope - libcontainer container d0df750d0ea8bcd50d0ada37d167f0321222816f314e83d0a3062d6c04a43323. 
Mar 12 04:20:43.895417 containerd[1506]: time="2026-03-12T04:20:43.895373572Z" level=info msg="StartContainer for \"d0df750d0ea8bcd50d0ada37d167f0321222816f314e83d0a3062d6c04a43323\" returns successfully" Mar 12 04:20:44.252213 kubelet[2658]: I0312 04:20:44.247389 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6f6f86d876-6cvcx" podStartSLOduration=3.066669767 podStartE2EDuration="24.215087401s" podCreationTimestamp="2026-03-12 04:20:20 +0000 UTC" firstStartedPulling="2026-03-12 04:20:22.591954318 +0000 UTC m=+46.571799130" lastFinishedPulling="2026-03-12 04:20:43.740371952 +0000 UTC m=+67.720216764" observedRunningTime="2026-03-12 04:20:44.213128853 +0000 UTC m=+68.192973690" watchObservedRunningTime="2026-03-12 04:20:44.215087401 +0000 UTC m=+68.194932237" Mar 12 04:20:44.252213 kubelet[2658]: I0312 04:20:44.251786 2658 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vmkhw" podStartSLOduration=27.399344446 podStartE2EDuration="49.251768377s" podCreationTimestamp="2026-03-12 04:19:55 +0000 UTC" firstStartedPulling="2026-03-12 04:20:19.777283968 +0000 UTC m=+43.757128782" lastFinishedPulling="2026-03-12 04:20:41.629707903 +0000 UTC m=+65.609552713" observedRunningTime="2026-03-12 04:20:42.194405459 +0000 UTC m=+66.174250295" watchObservedRunningTime="2026-03-12 04:20:44.251768377 +0000 UTC m=+68.231613208" Mar 12 04:20:50.005616 kubelet[2658]: I0312 04:20:50.005554 2658 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 04:21:03.760725 systemd[1]: run-containerd-runc-k8s.io-6fd160ebd75503cbb342c381cea5984c6d3aed6d30bc894937dff7928553e512-runc.lauXEh.mount: Deactivated successfully. Mar 12 04:21:05.419177 systemd[1]: Started sshd@8-10.244.101.2:22-20.161.92.111:35646.service - OpenSSH per-connection server daemon (20.161.92.111:35646). 
Mar 12 04:21:06.109737 sshd[5775]: Accepted publickey for core from 20.161.92.111 port 35646 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:21:06.113143 sshd[5775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:21:06.127632 systemd-logind[1486]: New session 10 of user core. Mar 12 04:21:06.134025 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 12 04:21:07.056139 sshd[5775]: pam_unix(sshd:session): session closed for user core Mar 12 04:21:07.067732 systemd[1]: sshd@8-10.244.101.2:22-20.161.92.111:35646.service: Deactivated successfully. Mar 12 04:21:07.071112 systemd[1]: session-10.scope: Deactivated successfully. Mar 12 04:21:07.072417 systemd-logind[1486]: Session 10 logged out. Waiting for processes to exit. Mar 12 04:21:07.074418 systemd-logind[1486]: Removed session 10. Mar 12 04:21:12.170622 systemd[1]: Started sshd@9-10.244.101.2:22-20.161.92.111:52226.service - OpenSSH per-connection server daemon (20.161.92.111:52226). Mar 12 04:21:12.792752 sshd[5791]: Accepted publickey for core from 20.161.92.111 port 52226 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac Mar 12 04:21:12.796272 sshd[5791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 12 04:21:12.808175 systemd-logind[1486]: New session 11 of user core. Mar 12 04:21:12.818540 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 12 04:21:13.548767 sshd[5791]: pam_unix(sshd:session): session closed for user core Mar 12 04:21:13.557827 systemd[1]: sshd@9-10.244.101.2:22-20.161.92.111:52226.service: Deactivated successfully. Mar 12 04:21:13.561604 systemd[1]: session-11.scope: Deactivated successfully. Mar 12 04:21:13.564435 systemd-logind[1486]: Session 11 logged out. Waiting for processes to exit. Mar 12 04:21:13.567104 systemd-logind[1486]: Removed session 11. 
Mar 12 04:21:18.667190 systemd[1]: Started sshd@10-10.244.101.2:22-20.161.92.111:52238.service - OpenSSH per-connection server daemon (20.161.92.111:52238).
Mar 12 04:21:19.281031 sshd[5805]: Accepted publickey for core from 20.161.92.111 port 52238 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:19.285048 sshd[5805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:19.295693 systemd-logind[1486]: New session 12 of user core.
Mar 12 04:21:19.302031 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 12 04:21:19.813951 sshd[5805]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:19.820620 systemd[1]: sshd@10-10.244.101.2:22-20.161.92.111:52238.service: Deactivated successfully.
Mar 12 04:21:19.824950 systemd[1]: session-12.scope: Deactivated successfully.
Mar 12 04:21:19.827986 systemd-logind[1486]: Session 12 logged out. Waiting for processes to exit.
Mar 12 04:21:19.829558 systemd-logind[1486]: Removed session 12.
Mar 12 04:21:24.922321 systemd[1]: Started sshd@11-10.244.101.2:22-20.161.92.111:59842.service - OpenSSH per-connection server daemon (20.161.92.111:59842).
Mar 12 04:21:25.593831 sshd[5843]: Accepted publickey for core from 20.161.92.111 port 59842 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:25.597372 sshd[5843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:25.609134 systemd-logind[1486]: New session 13 of user core.
Mar 12 04:21:25.613048 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 12 04:21:26.234862 sshd[5843]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:26.244694 systemd[1]: sshd@11-10.244.101.2:22-20.161.92.111:59842.service: Deactivated successfully.
Mar 12 04:21:26.245116 systemd-logind[1486]: Session 13 logged out. Waiting for processes to exit.
Mar 12 04:21:26.249312 systemd[1]: session-13.scope: Deactivated successfully.
Mar 12 04:21:26.250493 systemd-logind[1486]: Removed session 13.
Mar 12 04:21:26.358639 systemd[1]: Started sshd@12-10.244.101.2:22-20.161.92.111:59844.service - OpenSSH per-connection server daemon (20.161.92.111:59844).
Mar 12 04:21:26.987959 sshd[5873]: Accepted publickey for core from 20.161.92.111 port 59844 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:26.993216 sshd[5873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:27.003135 systemd-logind[1486]: New session 14 of user core.
Mar 12 04:21:27.010369 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 12 04:21:27.665860 sshd[5873]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:27.672703 systemd[1]: sshd@12-10.244.101.2:22-20.161.92.111:59844.service: Deactivated successfully.
Mar 12 04:21:27.677587 systemd[1]: session-14.scope: Deactivated successfully.
Mar 12 04:21:27.680971 systemd-logind[1486]: Session 14 logged out. Waiting for processes to exit.
Mar 12 04:21:27.681872 systemd-logind[1486]: Removed session 14.
Mar 12 04:21:27.767118 systemd[1]: Started sshd@13-10.244.101.2:22-20.161.92.111:59852.service - OpenSSH per-connection server daemon (20.161.92.111:59852).
Mar 12 04:21:28.105725 systemd[1]: run-containerd-runc-k8s.io-913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495-runc.RvyBBC.mount: Deactivated successfully.
Mar 12 04:21:28.367716 sshd[5884]: Accepted publickey for core from 20.161.92.111 port 59852 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:28.369416 sshd[5884]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:28.378476 systemd-logind[1486]: New session 15 of user core.
Mar 12 04:21:28.387308 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 12 04:21:28.931054 sshd[5884]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:28.935024 systemd[1]: sshd@13-10.244.101.2:22-20.161.92.111:59852.service: Deactivated successfully.
Mar 12 04:21:28.940222 systemd[1]: session-15.scope: Deactivated successfully.
Mar 12 04:21:28.941775 systemd-logind[1486]: Session 15 logged out. Waiting for processes to exit.
Mar 12 04:21:28.945022 systemd-logind[1486]: Removed session 15.
Mar 12 04:21:34.035179 systemd[1]: Started sshd@14-10.244.101.2:22-20.161.92.111:56368.service - OpenSSH per-connection server daemon (20.161.92.111:56368).
Mar 12 04:21:34.637909 sshd[5916]: Accepted publickey for core from 20.161.92.111 port 56368 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:34.643607 sshd[5916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:34.657655 systemd-logind[1486]: New session 16 of user core.
Mar 12 04:21:34.664073 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 12 04:21:35.216167 sshd[5916]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:35.226020 systemd-logind[1486]: Session 16 logged out. Waiting for processes to exit.
Mar 12 04:21:35.226586 systemd[1]: sshd@14-10.244.101.2:22-20.161.92.111:56368.service: Deactivated successfully.
Mar 12 04:21:35.229244 systemd[1]: session-16.scope: Deactivated successfully.
Mar 12 04:21:35.230340 systemd-logind[1486]: Removed session 16.
Mar 12 04:21:35.324783 systemd[1]: Started sshd@15-10.244.101.2:22-20.161.92.111:56382.service - OpenSSH per-connection server daemon (20.161.92.111:56382).
Mar 12 04:21:35.933952 sshd[5948]: Accepted publickey for core from 20.161.92.111 port 56382 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:35.936706 sshd[5948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:35.947221 systemd-logind[1486]: New session 17 of user core.
Mar 12 04:21:35.954030 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 12 04:21:36.845325 sshd[5948]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:36.857436 systemd[1]: sshd@15-10.244.101.2:22-20.161.92.111:56382.service: Deactivated successfully.
Mar 12 04:21:36.860706 systemd[1]: session-17.scope: Deactivated successfully.
Mar 12 04:21:36.862929 systemd-logind[1486]: Session 17 logged out. Waiting for processes to exit.
Mar 12 04:21:36.864313 systemd-logind[1486]: Removed session 17.
Mar 12 04:21:36.962374 systemd[1]: Started sshd@16-10.244.101.2:22-20.161.92.111:56398.service - OpenSSH per-connection server daemon (20.161.92.111:56398).
Mar 12 04:21:37.608177 sshd[5961]: Accepted publickey for core from 20.161.92.111 port 56398 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:37.613243 sshd[5961]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:37.622642 systemd-logind[1486]: New session 18 of user core.
Mar 12 04:21:37.630051 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 12 04:21:39.037985 sshd[5961]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:39.060347 systemd[1]: sshd@16-10.244.101.2:22-20.161.92.111:56398.service: Deactivated successfully.
Mar 12 04:21:39.064755 systemd[1]: session-18.scope: Deactivated successfully.
Mar 12 04:21:39.066429 systemd-logind[1486]: Session 18 logged out. Waiting for processes to exit.
Mar 12 04:21:39.068057 systemd-logind[1486]: Removed session 18.
Mar 12 04:21:39.120354 systemd[1]: Started sshd@17-10.244.101.2:22-20.161.92.111:56408.service - OpenSSH per-connection server daemon (20.161.92.111:56408).
Mar 12 04:21:39.743723 sshd[5988]: Accepted publickey for core from 20.161.92.111 port 56408 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:39.747869 sshd[5988]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:39.758544 systemd-logind[1486]: New session 19 of user core.
Mar 12 04:21:39.764104 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 12 04:21:40.771252 systemd[1]: run-containerd-runc-k8s.io-913ef32cd4a4f20c9593804a9d5733677374bd134dec397122de1d7e21599495-runc.d3X3Dw.mount: Deactivated successfully.
Mar 12 04:21:40.910668 sshd[5988]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:40.921125 systemd[1]: sshd@17-10.244.101.2:22-20.161.92.111:56408.service: Deactivated successfully.
Mar 12 04:21:40.926767 systemd[1]: session-19.scope: Deactivated successfully.
Mar 12 04:21:40.928203 systemd-logind[1486]: Session 19 logged out. Waiting for processes to exit.
Mar 12 04:21:40.929798 systemd-logind[1486]: Removed session 19.
Mar 12 04:21:41.017726 systemd[1]: Started sshd@18-10.244.101.2:22-20.161.92.111:41358.service - OpenSSH per-connection server daemon (20.161.92.111:41358).
Mar 12 04:21:41.633063 sshd[6017]: Accepted publickey for core from 20.161.92.111 port 41358 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:41.635732 sshd[6017]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:41.647133 systemd-logind[1486]: New session 20 of user core.
Mar 12 04:21:41.652157 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 12 04:21:42.128210 sshd[6017]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:42.140697 systemd-logind[1486]: Session 20 logged out. Waiting for processes to exit.
Mar 12 04:21:42.141897 systemd[1]: sshd@18-10.244.101.2:22-20.161.92.111:41358.service: Deactivated successfully.
Mar 12 04:21:42.145554 systemd[1]: session-20.scope: Deactivated successfully.
Mar 12 04:21:42.147243 systemd-logind[1486]: Removed session 20.
Mar 12 04:21:47.246528 systemd[1]: Started sshd@19-10.244.101.2:22-20.161.92.111:41370.service - OpenSSH per-connection server daemon (20.161.92.111:41370).
Mar 12 04:21:47.861443 sshd[6043]: Accepted publickey for core from 20.161.92.111 port 41370 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:47.867657 sshd[6043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:47.879336 systemd-logind[1486]: New session 21 of user core.
Mar 12 04:21:47.883225 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 12 04:21:48.395143 sshd[6043]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:48.406195 systemd[1]: sshd@19-10.244.101.2:22-20.161.92.111:41370.service: Deactivated successfully.
Mar 12 04:21:48.410661 systemd[1]: session-21.scope: Deactivated successfully.
Mar 12 04:21:48.412306 systemd-logind[1486]: Session 21 logged out. Waiting for processes to exit.
Mar 12 04:21:48.414177 systemd-logind[1486]: Removed session 21.
Mar 12 04:21:53.498590 systemd[1]: Started sshd@20-10.244.101.2:22-20.161.92.111:45598.service - OpenSSH per-connection server daemon (20.161.92.111:45598).
Mar 12 04:21:54.141064 sshd[6079]: Accepted publickey for core from 20.161.92.111 port 45598 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:21:54.147138 sshd[6079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:21:54.157268 systemd-logind[1486]: New session 22 of user core.
Mar 12 04:21:54.162025 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 12 04:21:54.767234 sshd[6079]: pam_unix(sshd:session): session closed for user core
Mar 12 04:21:54.779313 systemd[1]: sshd@20-10.244.101.2:22-20.161.92.111:45598.service: Deactivated successfully.
Mar 12 04:21:54.785068 systemd[1]: session-22.scope: Deactivated successfully.
Mar 12 04:21:54.787255 systemd-logind[1486]: Session 22 logged out. Waiting for processes to exit.
Mar 12 04:21:54.790365 systemd-logind[1486]: Removed session 22.
Mar 12 04:21:59.880152 systemd[1]: Started sshd@21-10.244.101.2:22-20.161.92.111:45610.service - OpenSSH per-connection server daemon (20.161.92.111:45610).
Mar 12 04:22:00.475937 sshd[6145]: Accepted publickey for core from 20.161.92.111 port 45610 ssh2: RSA SHA256:Og1sBJQhpCaSrAUaqgWUKLRz71/5xOULak8g1URRdac
Mar 12 04:22:00.479336 sshd[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 12 04:22:00.493575 systemd-logind[1486]: New session 23 of user core.
Mar 12 04:22:00.502003 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 12 04:22:01.034375 sshd[6145]: pam_unix(sshd:session): session closed for user core
Mar 12 04:22:01.044402 systemd[1]: sshd@21-10.244.101.2:22-20.161.92.111:45610.service: Deactivated successfully.
Mar 12 04:22:01.049803 systemd[1]: session-23.scope: Deactivated successfully.
Mar 12 04:22:01.050823 systemd-logind[1486]: Session 23 logged out. Waiting for processes to exit.
Mar 12 04:22:01.052296 systemd-logind[1486]: Removed session 23.