Mar 14 00:35:34.032396 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Mar 13 22:25:24 -00 2026
Mar 14 00:35:34.032431 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=06bcfe4e320f7b61768d05159b69b4eeccebc9d161fb2cdaf8d6998ab1e14ac7
Mar 14 00:35:34.032459 kernel: BIOS-provided physical RAM map:
Mar 14 00:35:34.034556 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 14 00:35:34.034575 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 14 00:35:34.034585 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 14 00:35:34.034599 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Mar 14 00:35:34.034609 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Mar 14 00:35:34.034619 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 14 00:35:34.034629 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 14 00:35:34.034639 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 14 00:35:34.034649 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 14 00:35:34.034667 kernel: NX (Execute Disable) protection: active
Mar 14 00:35:34.034677 kernel: APIC: Static calls initialized
Mar 14 00:35:34.034696 kernel: SMBIOS 2.8 present.
Mar 14 00:35:34.034708 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Mar 14 00:35:34.034719 kernel: Hypervisor detected: KVM
Mar 14 00:35:34.034734 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 14 00:35:34.034745 kernel: kvm-clock: using sched offset of 4415662206 cycles
Mar 14 00:35:34.034756 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 14 00:35:34.034774 kernel: tsc: Detected 2799.998 MHz processor
Mar 14 00:35:34.034785 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 14 00:35:34.034797 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 14 00:35:34.034807 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 14 00:35:34.034829 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 14 00:35:34.034840 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 14 00:35:34.034856 kernel: Using GB pages for direct mapping
Mar 14 00:35:34.034867 kernel: ACPI: Early table checksum verification disabled
Mar 14 00:35:34.034878 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Mar 14 00:35:34.034889 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034901 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034911 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034922 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Mar 14 00:35:34.034933 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034944 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034959 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034970 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 14 00:35:34.034981 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Mar 14 00:35:34.034992 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Mar 14 00:35:34.035003 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Mar 14 00:35:34.035019 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Mar 14 00:35:34.035031 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Mar 14 00:35:34.035046 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Mar 14 00:35:34.035058 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Mar 14 00:35:34.035069 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 14 00:35:34.035081 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 14 00:35:34.035092 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 14 00:35:34.035103 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Mar 14 00:35:34.035114 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 14 00:35:34.035126 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Mar 14 00:35:34.035154 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 14 00:35:34.035165 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Mar 14 00:35:34.035176 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 14 00:35:34.035187 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Mar 14 00:35:34.035198 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 14 00:35:34.035209 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Mar 14 00:35:34.035220 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 14 00:35:34.035243 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Mar 14 00:35:34.035254 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 14 00:35:34.035268 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Mar 14 00:35:34.035279 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 14 00:35:34.035290 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 14 00:35:34.035301 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Mar 14 00:35:34.035312 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Mar 14 00:35:34.035323 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Mar 14 00:35:34.035334 kernel: Zone ranges:
Mar 14 00:35:34.035344 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 14 00:35:34.035355 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Mar 14 00:35:34.035366 kernel: Normal empty
Mar 14 00:35:34.035381 kernel: Movable zone start for each node
Mar 14 00:35:34.035391 kernel: Early memory node ranges
Mar 14 00:35:34.035402 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 14 00:35:34.035413 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Mar 14 00:35:34.035423 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Mar 14 00:35:34.035434 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 14 00:35:34.035445 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 14 00:35:34.035455 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Mar 14 00:35:34.035466 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 14 00:35:34.035514 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 14 00:35:34.035527 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 14 00:35:34.035538 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 14 00:35:34.035549 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 14 00:35:34.035573 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 14 00:35:34.035584 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 14 00:35:34.035596 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 14 00:35:34.035607 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 14 00:35:34.035618 kernel: TSC deadline timer available
Mar 14 00:35:34.035635 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Mar 14 00:35:34.035646 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 14 00:35:34.035658 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 14 00:35:34.035669 kernel: Booting paravirtualized kernel on KVM
Mar 14 00:35:34.035681 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 14 00:35:34.035693 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 14 00:35:34.035704 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u262144
Mar 14 00:35:34.035715 kernel: pcpu-alloc: s196328 r8192 d28952 u262144 alloc=1*2097152
Mar 14 00:35:34.035727 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 14 00:35:34.035743 kernel: kvm-guest: PV spinlocks enabled
Mar 14 00:35:34.035754 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 14 00:35:34.035767 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=06bcfe4e320f7b61768d05159b69b4eeccebc9d161fb2cdaf8d6998ab1e14ac7
Mar 14 00:35:34.035779 kernel: random: crng init done
Mar 14 00:35:34.035791 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 14 00:35:34.035810 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 14 00:35:34.035833 kernel: Fallback order for Node 0: 0
Mar 14 00:35:34.035845 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Mar 14 00:35:34.035861 kernel: Policy zone: DMA32
Mar 14 00:35:34.035873 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 14 00:35:34.035885 kernel: software IO TLB: area num 16.
Mar 14 00:35:34.035897 kernel: Memory: 1901600K/2096616K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 194756K reserved, 0K cma-reserved)
Mar 14 00:35:34.035908 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 14 00:35:34.035920 kernel: Kernel/User page tables isolation: enabled
Mar 14 00:35:34.035931 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 14 00:35:34.035943 kernel: ftrace: allocated 149 pages with 4 groups
Mar 14 00:35:34.035955 kernel: Dynamic Preempt: voluntary
Mar 14 00:35:34.035970 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 14 00:35:34.035983 kernel: rcu: RCU event tracing is enabled.
Mar 14 00:35:34.035994 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 14 00:35:34.036006 kernel: Trampoline variant of Tasks RCU enabled.
Mar 14 00:35:34.036018 kernel: Rude variant of Tasks RCU enabled.
Mar 14 00:35:34.036038 kernel: Tracing variant of Tasks RCU enabled.
Mar 14 00:35:34.036054 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 14 00:35:34.036067 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 14 00:35:34.036079 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Mar 14 00:35:34.036090 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 14 00:35:34.036102 kernel: Console: colour VGA+ 80x25
Mar 14 00:35:34.036114 kernel: printk: console [tty0] enabled
Mar 14 00:35:34.036130 kernel: printk: console [ttyS0] enabled
Mar 14 00:35:34.036143 kernel: ACPI: Core revision 20230628
Mar 14 00:35:34.036155 kernel: APIC: Switch to symmetric I/O mode setup
Mar 14 00:35:34.036167 kernel: x2apic enabled
Mar 14 00:35:34.036179 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 14 00:35:34.036195 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Mar 14 00:35:34.036207 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Mar 14 00:35:34.036219 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 14 00:35:34.036231 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 14 00:35:34.036243 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 14 00:35:34.036255 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 14 00:35:34.036267 kernel: Spectre V2 : Mitigation: Retpolines
Mar 14 00:35:34.036279 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 14 00:35:34.036304 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Mar 14 00:35:34.036316 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 14 00:35:34.036331 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 14 00:35:34.036343 kernel: MDS: Mitigation: Clear CPU buffers
Mar 14 00:35:34.036354 kernel: MMIO Stale Data: Unknown: No mitigations
Mar 14 00:35:34.036366 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 14 00:35:34.036377 kernel: active return thunk: its_return_thunk
Mar 14 00:35:34.036389 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 14 00:35:34.036401 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 14 00:35:34.036412 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 14 00:35:34.036424 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 14 00:35:34.036435 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 14 00:35:34.036447 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Mar 14 00:35:34.038097 kernel: Freeing SMP alternatives memory: 32K
Mar 14 00:35:34.038117 kernel: pid_max: default: 32768 minimum: 301
Mar 14 00:35:34.038130 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 14 00:35:34.038142 kernel: landlock: Up and running.
Mar 14 00:35:34.038154 kernel: SELinux: Initializing.
Mar 14 00:35:34.038167 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 14 00:35:34.038179 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 14 00:35:34.038191 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Mar 14 00:35:34.038203 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 14 00:35:34.038215 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 14 00:35:34.038228 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 14 00:35:34.038248 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Mar 14 00:35:34.038260 kernel: signal: max sigframe size: 1776
Mar 14 00:35:34.038279 kernel: rcu: Hierarchical SRCU implementation.
Mar 14 00:35:34.038304 kernel: rcu: Max phase no-delay instances is 400.
Mar 14 00:35:34.038316 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 14 00:35:34.038328 kernel: smp: Bringing up secondary CPUs ...
Mar 14 00:35:34.038341 kernel: smpboot: x86: Booting SMP configuration:
Mar 14 00:35:34.038365 kernel: .... node #0, CPUs: #1
Mar 14 00:35:34.038376 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Mar 14 00:35:34.038391 kernel: smp: Brought up 1 node, 2 CPUs
Mar 14 00:35:34.038403 kernel: smpboot: Max logical packages: 16
Mar 14 00:35:34.038414 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
Mar 14 00:35:34.038425 kernel: devtmpfs: initialized
Mar 14 00:35:34.038436 kernel: x86/mm: Memory block size: 128MB
Mar 14 00:35:34.038447 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 14 00:35:34.038465 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 14 00:35:34.038488 kernel: pinctrl core: initialized pinctrl subsystem
Mar 14 00:35:34.038501 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 14 00:35:34.038529 kernel: audit: initializing netlink subsys (disabled)
Mar 14 00:35:34.038540 kernel: audit: type=2000 audit(1773448532.007:1): state=initialized audit_enabled=0 res=1
Mar 14 00:35:34.038551 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 14 00:35:34.038562 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 14 00:35:34.038572 kernel: cpuidle: using governor menu
Mar 14 00:35:34.038583 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 14 00:35:34.038593 kernel: dca service started, version 1.12.1
Mar 14 00:35:34.038604 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 14 00:35:34.038615 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 14 00:35:34.038630 kernel: PCI: Using configuration type 1 for base access
Mar 14 00:35:34.038641 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 14 00:35:34.038651 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 14 00:35:34.038675 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 14 00:35:34.038686 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 14 00:35:34.038697 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 14 00:35:34.038708 kernel: ACPI: Added _OSI(Module Device)
Mar 14 00:35:34.038719 kernel: ACPI: Added _OSI(Processor Device)
Mar 14 00:35:34.038730 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 14 00:35:34.038745 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 14 00:35:34.038756 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 14 00:35:34.038779 kernel: ACPI: Interpreter enabled
Mar 14 00:35:34.038791 kernel: ACPI: PM: (supports S0 S5)
Mar 14 00:35:34.038802 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 14 00:35:34.038825 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 14 00:35:34.038851 kernel: PCI: Using E820 reservations for host bridge windows
Mar 14 00:35:34.038863 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 14 00:35:34.038875 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 14 00:35:34.039175 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 14 00:35:34.039389 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 14 00:35:34.041662 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 14 00:35:34.041683 kernel: PCI host bridge to bus 0000:00
Mar 14 00:35:34.041914 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 14 00:35:34.042074 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 14 00:35:34.042264 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 14 00:35:34.042427 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Mar 14 00:35:34.043621 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 14 00:35:34.043793 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Mar 14 00:35:34.043961 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 14 00:35:34.044143 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 14 00:35:34.044361 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Mar 14 00:35:34.044552 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Mar 14 00:35:34.044720 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Mar 14 00:35:34.044930 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Mar 14 00:35:34.045104 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 14 00:35:34.045314 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.047545 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Mar 14 00:35:34.047761 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.047960 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Mar 14 00:35:34.048146 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.048327 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Mar 14 00:35:34.048532 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.048703 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Mar 14 00:35:34.048918 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.049081 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Mar 14 00:35:34.049253 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.049427 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Mar 14 00:35:34.051654 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.051849 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Mar 14 00:35:34.052046 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 14 00:35:34.052218 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Mar 14 00:35:34.052409 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 14 00:35:34.054633 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 14 00:35:34.054837 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Mar 14 00:35:34.055004 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 14 00:35:34.055190 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Mar 14 00:35:34.055388 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 14 00:35:34.055594 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 14 00:35:34.055769 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Mar 14 00:35:34.055963 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Mar 14 00:35:34.056152 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 14 00:35:34.056341 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 14 00:35:34.056547 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 14 00:35:34.056724 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Mar 14 00:35:34.056911 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Mar 14 00:35:34.057079 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 14 00:35:34.057260 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 14 00:35:34.057439 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Mar 14 00:35:34.059687 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Mar 14 00:35:34.059891 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 14 00:35:34.060058 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 14 00:35:34.060221 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 14 00:35:34.060434 kernel: pci_bus 0000:02: extended config space not accessible
Mar 14 00:35:34.060708 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Mar 14 00:35:34.060919 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Mar 14 00:35:34.065627 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 14 00:35:34.065819 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 14 00:35:34.066031 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 14 00:35:34.066210 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Mar 14 00:35:34.066395 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 14 00:35:34.066602 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 14 00:35:34.066763 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 14 00:35:34.066956 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 14 00:35:34.067145 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 14 00:35:34.067305 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 14 00:35:34.067482 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 14 00:35:34.067660 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 14 00:35:34.067839 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 14 00:35:34.068004 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 14 00:35:34.068195 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 14 00:35:34.068392 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 14 00:35:34.068610 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 14 00:35:34.068784 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 14 00:35:34.068981 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 14 00:35:34.069162 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 14 00:35:34.069345 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 14 00:35:34.069541 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 14 00:35:34.069721 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 14 00:35:34.069923 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 14 00:35:34.070114 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 14 00:35:34.070292 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 14 00:35:34.071614 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 14 00:35:34.071636 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 14 00:35:34.071649 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 14 00:35:34.071662 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 14 00:35:34.071674 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 14 00:35:34.071687 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 14 00:35:34.071714 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 14 00:35:34.071727 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 14 00:35:34.071739 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 14 00:35:34.071752 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 14 00:35:34.071764 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 14 00:35:34.071776 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 14 00:35:34.071789 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 14 00:35:34.071805 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 14 00:35:34.071828 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 14 00:35:34.071846 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 14 00:35:34.071859 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 14 00:35:34.071871 kernel: iommu: Default domain type: Translated
Mar 14 00:35:34.071884 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 14 00:35:34.071900 kernel: PCI: Using ACPI for IRQ routing
Mar 14 00:35:34.071913 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 14 00:35:34.071925 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 14 00:35:34.071937 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Mar 14 00:35:34.072105 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 14 00:35:34.072276 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 14 00:35:34.072438 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 14 00:35:34.075555 kernel: vgaarb: loaded
Mar 14 00:35:34.075571 kernel: clocksource: Switched to clocksource kvm-clock
Mar 14 00:35:34.075584 kernel: VFS: Disk quotas dquot_6.6.0
Mar 14 00:35:34.075596 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 14 00:35:34.075608 kernel: pnp: PnP ACPI init
Mar 14 00:35:34.075784 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 14 00:35:34.075822 kernel: pnp: PnP ACPI: found 5 devices
Mar 14 00:35:34.075837 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 14 00:35:34.075849 kernel: NET: Registered PF_INET protocol family
Mar 14 00:35:34.075862 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 14 00:35:34.075874 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 14 00:35:34.075886 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 14 00:35:34.075898 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 14 00:35:34.075910 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 14 00:35:34.075928 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 14 00:35:34.075940 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 14 00:35:34.075953 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 14 00:35:34.075965 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 14 00:35:34.075977 kernel: NET: Registered PF_XDP protocol family
Mar 14 00:35:34.076142 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Mar 14 00:35:34.076307 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 14 00:35:34.076506 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 14 00:35:34.076698 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 14 00:35:34.076874 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 14 00:35:34.077037 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 14 00:35:34.077218 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 14 00:35:34.077374 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 14 00:35:34.077559 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Mar 14 00:35:34.077732 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Mar 14 00:35:34.077920 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Mar 14 00:35:34.078082 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Mar 14 00:35:34.078252 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Mar 14 00:35:34.078422 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Mar 14 00:35:34.078605 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Mar 14 00:35:34.078767 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Mar 14 00:35:34.078947 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 14 00:35:34.079143 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 14 00:35:34.079308 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 14 00:35:34.082498 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 14 00:35:34.082694 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 14 00:35:34.082881 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 14 00:35:34.083045 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 14 00:35:34.083239 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 14 00:35:34.083392 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 14 00:35:34.083564 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 14 00:35:34.083749 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 14 00:35:34.083934 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 14 00:35:34.084107 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 14 00:35:34.084288 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 14 00:35:34.084455 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 14 00:35:34.086659 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 14 00:35:34.086868 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 14 00:35:34.087033 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 14 00:35:34.087199 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 14 00:35:34.087370 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 14 00:35:34.087553 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 14 00:35:34.087717 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 14 00:35:34.087907 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 14 00:35:34.088069 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 14 00:35:34.088274 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 14 00:35:34.088456 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 14 00:35:34.089649 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 14 00:35:34.089856 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 14 00:35:34.090019 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 14 00:35:34.090222 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 14 00:35:34.090404 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 14 00:35:34.090611 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 14 00:35:34.090803 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 14 00:35:34.090987 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 14 00:35:34.091142 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 14 00:35:34.091298 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 14 00:35:34.091455 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 14 00:35:34.093949 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Mar 14 00:35:34.094115 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 14 00:35:34.094265 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Mar 14 00:35:34.094451 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 14 00:35:34.094642 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Mar 14 00:35:34.094832 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 14 00:35:34.095009 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 14 00:35:34.095188 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Mar 14 00:35:34.095356 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 14 00:35:34.099763 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 14 00:35:34.099975 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Mar 14 00:35:34.100143 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 14 00:35:34.100322 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 14 00:35:34.100520 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Mar 14 00:35:34.100701 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 14 00:35:34.100902 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 14 00:35:34.101081 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Mar 14 00:35:34.101269 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 14 00:35:34.101421 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 14 00:35:34.101618 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Mar 14 00:35:34.101821 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Mar 14 00:35:34.101985 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 14 00:35:34.102182 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Mar 14 00:35:34.102347 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Mar 14 00:35:34.102529 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 14 00:35:34.102727 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Mar 14 00:35:34.102928 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Mar 14 00:35:34.103082 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 14 00:35:34.103112 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 14 00:35:34.103139 kernel: PCI: CLS 0 bytes, default 64
Mar 14 00:35:34.103153 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 14 00:35:34.103166 kernel: software IO TLB: mapped [mem
0x0000000079800000-0x000000007d800000] (64MB) Mar 14 00:35:34.103190 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 14 00:35:34.103203 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Mar 14 00:35:34.103224 kernel: Initialise system trusted keyrings Mar 14 00:35:34.103237 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 14 00:35:34.103249 kernel: Key type asymmetric registered Mar 14 00:35:34.103266 kernel: Asymmetric key parser 'x509' registered Mar 14 00:35:34.103288 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 14 00:35:34.103314 kernel: io scheduler mq-deadline registered Mar 14 00:35:34.103327 kernel: io scheduler kyber registered Mar 14 00:35:34.103339 kernel: io scheduler bfq registered Mar 14 00:35:34.103526 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 14 00:35:34.103720 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 14 00:35:34.103913 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.104110 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 14 00:35:34.104273 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 14 00:35:34.104446 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.104655 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 14 00:35:34.104853 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 14 00:35:34.105015 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.105222 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 14 00:35:34.105414 kernel: pcieport 0000:00:02.3: AER: enabled 
with IRQ 27 Mar 14 00:35:34.105599 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.105766 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 14 00:35:34.105941 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 14 00:35:34.106117 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.106279 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 14 00:35:34.106447 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 14 00:35:34.106655 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.106863 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 14 00:35:34.107024 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 14 00:35:34.107185 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.107373 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 14 00:35:34.109622 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 14 00:35:34.109823 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 14 00:35:34.109845 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 14 00:35:34.109860 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 14 00:35:34.109874 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 14 00:35:34.109887 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 14 00:35:34.109908 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 14 00:35:34.109922 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 
0x60,0x64 irq 1,12 Mar 14 00:35:34.109935 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 14 00:35:34.109948 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 14 00:35:34.109961 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 14 00:35:34.110151 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 14 00:35:34.110358 kernel: rtc_cmos 00:03: registered as rtc0 Mar 14 00:35:34.110554 kernel: rtc_cmos 00:03: setting system clock to 2026-03-14T00:35:33 UTC (1773448533) Mar 14 00:35:34.110728 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 14 00:35:34.110748 kernel: intel_pstate: CPU model not supported Mar 14 00:35:34.110761 kernel: NET: Registered PF_INET6 protocol family Mar 14 00:35:34.110774 kernel: Segment Routing with IPv6 Mar 14 00:35:34.110787 kernel: In-situ OAM (IOAM) with IPv6 Mar 14 00:35:34.110800 kernel: NET: Registered PF_PACKET protocol family Mar 14 00:35:34.110823 kernel: Key type dns_resolver registered Mar 14 00:35:34.110837 kernel: IPI shorthand broadcast: enabled Mar 14 00:35:34.110850 kernel: sched_clock: Marking stable (1326078402, 230081984)->(1676362587, -120202201) Mar 14 00:35:34.110869 kernel: registered taskstats version 1 Mar 14 00:35:34.110882 kernel: Loading compiled-in X.509 certificates Mar 14 00:35:34.110896 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: a10808ddb7a43f470807cfbbb5be2c08229c2dec' Mar 14 00:35:34.110908 kernel: Key type .fscrypt registered Mar 14 00:35:34.110921 kernel: Key type fscrypt-provisioning registered Mar 14 00:35:34.110933 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 14 00:35:34.110946 kernel: ima: Allocated hash algorithm: sha1 Mar 14 00:35:34.110959 kernel: ima: No architecture policies found Mar 14 00:35:34.110972 kernel: clk: Disabling unused clocks Mar 14 00:35:34.110989 kernel: Freeing unused kernel image (initmem) memory: 42892K Mar 14 00:35:34.111003 kernel: Write protecting the kernel read-only data: 36864k Mar 14 00:35:34.111016 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K Mar 14 00:35:34.111028 kernel: Run /init as init process Mar 14 00:35:34.111041 kernel: with arguments: Mar 14 00:35:34.111054 kernel: /init Mar 14 00:35:34.111067 kernel: with environment: Mar 14 00:35:34.111079 kernel: HOME=/ Mar 14 00:35:34.111092 kernel: TERM=linux Mar 14 00:35:34.111112 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 14 00:35:34.111129 systemd[1]: Detected virtualization kvm. Mar 14 00:35:34.111143 systemd[1]: Detected architecture x86-64. Mar 14 00:35:34.111156 systemd[1]: Running in initrd. Mar 14 00:35:34.111169 systemd[1]: No hostname configured, using default hostname. Mar 14 00:35:34.111182 systemd[1]: Hostname set to . Mar 14 00:35:34.111207 systemd[1]: Initializing machine ID from VM UUID. Mar 14 00:35:34.111224 systemd[1]: Queued start job for default target initrd.target. Mar 14 00:35:34.111238 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 14 00:35:34.111251 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 14 00:35:34.111277 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Mar 14 00:35:34.111290 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 14 00:35:34.111304 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 14 00:35:34.111318 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 14 00:35:34.111337 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 14 00:35:34.111352 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 14 00:35:34.111366 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 14 00:35:34.111379 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 14 00:35:34.111393 systemd[1]: Reached target paths.target - Path Units. Mar 14 00:35:34.111406 systemd[1]: Reached target slices.target - Slice Units. Mar 14 00:35:34.111420 systemd[1]: Reached target swap.target - Swaps. Mar 14 00:35:34.111433 systemd[1]: Reached target timers.target - Timer Units. Mar 14 00:35:34.111451 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 14 00:35:34.111466 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 14 00:35:34.112573 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 14 00:35:34.112590 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 14 00:35:34.112603 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 14 00:35:34.112616 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 14 00:35:34.112629 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 14 00:35:34.112662 systemd[1]: Reached target sockets.target - Socket Units. 
Mar 14 00:35:34.112676 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 14 00:35:34.112707 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 14 00:35:34.112720 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 14 00:35:34.112733 systemd[1]: Starting systemd-fsck-usr.service... Mar 14 00:35:34.112746 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 14 00:35:34.112759 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 14 00:35:34.112785 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 14 00:35:34.112798 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 14 00:35:34.112839 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 14 00:35:34.112861 systemd[1]: Finished systemd-fsck-usr.service. Mar 14 00:35:34.112915 systemd-journald[203]: Collecting audit messages is disabled. Mar 14 00:35:34.112953 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 14 00:35:34.112967 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 14 00:35:34.112980 kernel: Bridge firewalling registered Mar 14 00:35:34.112994 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 14 00:35:34.113008 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 14 00:35:34.113023 systemd-journald[203]: Journal started Mar 14 00:35:34.113053 systemd-journald[203]: Runtime Journal (/run/log/journal/b7038b4291a040b69a267862903b4f56) is 4.7M, max 38.0M, 33.2M free. Mar 14 00:35:34.033552 systemd-modules-load[204]: Inserted module 'overlay' Mar 14 00:35:34.115880 systemd[1]: Started systemd-journald.service - Journal Service. 
Mar 14 00:35:34.104079 systemd-modules-load[204]: Inserted module 'br_netfilter' Mar 14 00:35:34.123685 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 14 00:35:34.126644 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 14 00:35:34.144668 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 14 00:35:34.150548 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 14 00:35:34.159873 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 14 00:35:34.165115 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 14 00:35:34.174692 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 14 00:35:34.177669 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 14 00:35:34.182106 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 14 00:35:34.191803 dracut-cmdline[232]: dracut-dracut-053 Mar 14 00:35:34.193685 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 14 00:35:34.201319 dracut-cmdline[232]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=06bcfe4e320f7b61768d05159b69b4eeccebc9d161fb2cdaf8d6998ab1e14ac7 Mar 14 00:35:34.206112 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 14 00:35:34.246042 systemd-resolved[242]: Positive Trust Anchors: Mar 14 00:35:34.246065 systemd-resolved[242]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 14 00:35:34.246116 systemd-resolved[242]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 14 00:35:34.256069 systemd-resolved[242]: Defaulting to hostname 'linux'. Mar 14 00:35:34.258503 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 14 00:35:34.260298 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 14 00:35:34.315599 kernel: SCSI subsystem initialized Mar 14 00:35:34.326471 kernel: Loading iSCSI transport class v2.0-870. Mar 14 00:35:34.339478 kernel: iscsi: registered transport (tcp) Mar 14 00:35:34.365638 kernel: iscsi: registered transport (qla4xxx) Mar 14 00:35:34.365676 kernel: QLogic iSCSI HBA Driver Mar 14 00:35:34.418132 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 14 00:35:34.425678 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 14 00:35:34.463892 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 14 00:35:34.463942 kernel: device-mapper: uevent: version 1.0.3 Mar 14 00:35:34.466181 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 14 00:35:34.514628 kernel: raid6: sse2x4 gen() 13661 MB/s Mar 14 00:35:34.531536 kernel: raid6: sse2x2 gen() 9509 MB/s Mar 14 00:35:34.550129 kernel: raid6: sse2x1 gen() 8992 MB/s Mar 14 00:35:34.550165 kernel: raid6: using algorithm sse2x4 gen() 13661 MB/s Mar 14 00:35:34.569184 kernel: raid6: .... xor() 7801 MB/s, rmw enabled Mar 14 00:35:34.569217 kernel: raid6: using ssse3x2 recovery algorithm Mar 14 00:35:34.595482 kernel: xor: automatically using best checksumming function avx Mar 14 00:35:34.783531 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 14 00:35:34.798418 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 14 00:35:34.806819 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 14 00:35:34.835161 systemd-udevd[421]: Using default interface naming scheme 'v255'. Mar 14 00:35:34.842114 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 14 00:35:34.851907 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 14 00:35:34.873144 dracut-pre-trigger[430]: rd.md=0: removing MD RAID activation Mar 14 00:35:34.912260 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 14 00:35:34.917691 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 14 00:35:35.044285 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 14 00:35:35.054665 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 14 00:35:35.077833 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 14 00:35:35.079782 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Mar 14 00:35:35.080710 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 14 00:35:35.084991 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 14 00:35:35.094037 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 14 00:35:35.113529 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 14 00:35:35.157495 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Mar 14 00:35:35.175782 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Mar 14 00:35:35.176042 kernel: cryptd: max_cpu_qlen set to 1000 Mar 14 00:35:35.200554 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 14 00:35:35.200619 kernel: GPT:17805311 != 125829119 Mar 14 00:35:35.200647 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 14 00:35:35.200677 kernel: GPT:17805311 != 125829119 Mar 14 00:35:35.200693 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 14 00:35:35.200708 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 14 00:35:35.216585 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 14 00:35:35.216792 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 14 00:35:35.219911 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 14 00:35:35.220919 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 14 00:35:35.223857 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 14 00:35:35.224604 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 14 00:35:35.234520 kernel: libata version 3.00 loaded. Mar 14 00:35:35.237174 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 14 00:35:35.247910 kernel: AVX version of gcm_enc/dec engaged. 
Mar 14 00:35:35.247955 kernel: AES CTR mode by8 optimization enabled Mar 14 00:35:35.249871 kernel: ACPI: bus type USB registered Mar 14 00:35:35.251525 kernel: usbcore: registered new interface driver usbfs Mar 14 00:35:35.253695 kernel: usbcore: registered new interface driver hub Mar 14 00:35:35.256073 kernel: usbcore: registered new device driver usb Mar 14 00:35:35.278477 kernel: ahci 0000:00:1f.2: version 3.0 Mar 14 00:35:35.278826 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 14 00:35:35.282472 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 14 00:35:35.282701 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 14 00:35:35.318499 kernel: scsi host0: ahci Mar 14 00:35:35.321797 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 14 00:35:35.322063 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Mar 14 00:35:35.322278 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 14 00:35:35.322521 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 14 00:35:35.322720 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Mar 14 00:35:35.322940 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Mar 14 00:35:35.333839 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/vda6 scanned by (udev-worker) (468) Mar 14 00:35:35.333862 kernel: hub 1-0:1.0: USB hub found Mar 14 00:35:35.334094 kernel: hub 1-0:1.0: 4 ports detected Mar 14 00:35:35.334321 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Mar 14 00:35:35.338525 kernel: BTRFS: device fsid cd4a88d6-c21b-44c8-aac6-68c13cee1def devid 1 transid 34 /dev/vda3 scanned by (udev-worker) (475) Mar 14 00:35:35.347478 kernel: scsi host1: ahci Mar 14 00:35:35.347723 kernel: hub 2-0:1.0: USB hub found Mar 14 00:35:35.347964 kernel: hub 2-0:1.0: 4 ports detected Mar 14 00:35:35.351470 kernel: scsi host2: ahci Mar 14 00:35:35.353507 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 14 00:35:35.439362 kernel: scsi host3: ahci Mar 14 00:35:35.439643 kernel: scsi host4: ahci Mar 14 00:35:35.439903 kernel: scsi host5: ahci Mar 14 00:35:35.440116 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Mar 14 00:35:35.440136 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Mar 14 00:35:35.440153 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Mar 14 00:35:35.440169 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Mar 14 00:35:35.440185 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Mar 14 00:35:35.440213 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Mar 14 00:35:35.439541 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 14 00:35:35.447262 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 14 00:35:35.453214 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 14 00:35:35.453993 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 14 00:35:35.462119 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 14 00:35:35.469655 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Mar 14 00:35:35.471668 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 14 00:35:35.481221 disk-uuid[565]: Primary Header is updated. Mar 14 00:35:35.481221 disk-uuid[565]: Secondary Entries is updated. Mar 14 00:35:35.481221 disk-uuid[565]: Secondary Header is updated. Mar 14 00:35:35.487544 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 14 00:35:35.500509 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 14 00:35:35.507468 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 14 00:35:35.510547 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 14 00:35:35.579556 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 14 00:35:35.683142 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 14 00:35:35.683243 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 14 00:35:35.683270 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 14 00:35:35.683287 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 14 00:35:35.686468 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 14 00:35:35.686508 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 14 00:35:35.725481 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 14 00:35:35.732666 kernel: usbcore: registered new interface driver usbhid Mar 14 00:35:35.732715 kernel: usbhid: USB HID core driver Mar 14 00:35:35.739469 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Mar 14 00:35:35.739536 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Mar 14 00:35:36.503560 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 14 00:35:36.504779 disk-uuid[566]: The operation has completed successfully. Mar 14 00:35:36.559099 systemd[1]: disk-uuid.service: Deactivated successfully. 
Mar 14 00:35:36.559293 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 14 00:35:36.571682 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 14 00:35:36.579349 sh[587]: Success Mar 14 00:35:36.596154 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Mar 14 00:35:36.657932 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 14 00:35:36.668623 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 14 00:35:36.671236 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 14 00:35:36.700922 kernel: BTRFS info (device dm-0): first mount of filesystem cd4a88d6-c21b-44c8-aac6-68c13cee1def Mar 14 00:35:36.700995 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 14 00:35:36.703135 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 14 00:35:36.706463 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 14 00:35:36.706504 kernel: BTRFS info (device dm-0): using free space tree Mar 14 00:35:36.717643 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 14 00:35:36.719245 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 14 00:35:36.726646 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 14 00:35:36.729636 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 14 00:35:36.741493 kernel: BTRFS info (device vda6): first mount of filesystem 0ec14b75-fea9-4657-9245-934c6406ae1a Mar 14 00:35:36.741555 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 14 00:35:36.744499 kernel: BTRFS info (device vda6): using free space tree Mar 14 00:35:36.748484 kernel: BTRFS info (device vda6): auto enabling async discard Mar 14 00:35:36.762941 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 14 00:35:36.764928 kernel: BTRFS info (device vda6): last unmount of filesystem 0ec14b75-fea9-4657-9245-934c6406ae1a Mar 14 00:35:36.772143 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 14 00:35:36.780935 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 14 00:35:36.916503 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 14 00:35:36.919796 ignition[671]: Ignition 2.19.0 Mar 14 00:35:36.919816 ignition[671]: Stage: fetch-offline Mar 14 00:35:36.919896 ignition[671]: no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:36.919916 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:36.920070 ignition[671]: parsed url from cmdline: "" Mar 14 00:35:36.920088 ignition[671]: no config URL provided Mar 14 00:35:36.920097 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Mar 14 00:35:36.920112 ignition[671]: no config at "/usr/lib/ignition/user.ign" Mar 14 00:35:36.920120 ignition[671]: failed to fetch config: resource requires networking Mar 14 00:35:36.920397 ignition[671]: Ignition finished successfully Mar 14 00:35:36.926787 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 14 00:35:36.928516 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Mar 14 00:35:36.967968 systemd-networkd[774]: lo: Link UP Mar 14 00:35:36.967985 systemd-networkd[774]: lo: Gained carrier Mar 14 00:35:36.970419 systemd-networkd[774]: Enumeration completed Mar 14 00:35:36.970960 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 14 00:35:36.970966 systemd-networkd[774]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 14 00:35:36.971590 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 14 00:35:36.973164 systemd-networkd[774]: eth0: Link UP Mar 14 00:35:36.973169 systemd-networkd[774]: eth0: Gained carrier Mar 14 00:35:36.973179 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 14 00:35:36.975646 systemd[1]: Reached target network.target - Network. Mar 14 00:35:36.984746 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 14 00:35:36.996589 systemd-networkd[774]: eth0: DHCPv4 address 10.230.50.222/30, gateway 10.230.50.221 acquired from 10.230.50.221 Mar 14 00:35:37.006331 ignition[777]: Ignition 2.19.0 Mar 14 00:35:37.006350 ignition[777]: Stage: fetch Mar 14 00:35:37.006633 ignition[777]: no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:37.006653 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:37.006843 ignition[777]: parsed url from cmdline: "" Mar 14 00:35:37.006850 ignition[777]: no config URL provided Mar 14 00:35:37.006859 ignition[777]: reading system config file "/usr/lib/ignition/user.ign" Mar 14 00:35:37.006875 ignition[777]: no config at "/usr/lib/ignition/user.ign" Mar 14 00:35:37.007055 ignition[777]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 14 00:35:37.007445 ignition[777]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... 
Mar 14 00:35:37.007507 ignition[777]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Mar 14 00:35:37.023757 ignition[777]: GET result: OK Mar 14 00:35:37.024702 ignition[777]: parsing config with SHA512: ba22abf99d3ea0c54c970a6ace92203a6ca0ff8a22773c43d8f212525b9d57ba7335c3bba6a5eebb22db480f19b7152d83c33b6e5dbda187ac721c9e69c15181 Mar 14 00:35:37.031317 unknown[777]: fetched base config from "system" Mar 14 00:35:37.031340 unknown[777]: fetched base config from "system" Mar 14 00:35:37.031932 ignition[777]: fetch: fetch complete Mar 14 00:35:37.031349 unknown[777]: fetched user config from "openstack" Mar 14 00:35:37.031944 ignition[777]: fetch: fetch passed Mar 14 00:35:37.035285 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 14 00:35:37.032006 ignition[777]: Ignition finished successfully Mar 14 00:35:37.042650 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 14 00:35:37.074849 ignition[785]: Ignition 2.19.0 Mar 14 00:35:37.074873 ignition[785]: Stage: kargs Mar 14 00:35:37.075096 ignition[785]: no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:37.075116 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:37.080061 ignition[785]: kargs: kargs passed Mar 14 00:35:37.080153 ignition[785]: Ignition finished successfully Mar 14 00:35:37.082878 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 14 00:35:37.090793 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Mar 14 00:35:37.114222 ignition[791]: Ignition 2.19.0 Mar 14 00:35:37.115321 ignition[791]: Stage: disks Mar 14 00:35:37.115572 ignition[791]: no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:37.115593 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:37.118367 ignition[791]: disks: disks passed Mar 14 00:35:37.118470 ignition[791]: Ignition finished successfully Mar 14 00:35:37.119785 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 14 00:35:37.121321 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 14 00:35:37.122982 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 14 00:35:37.124485 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 14 00:35:37.126021 systemd[1]: Reached target sysinit.target - System Initialization. Mar 14 00:35:37.127348 systemd[1]: Reached target basic.target - Basic System. Mar 14 00:35:37.136784 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 14 00:35:37.156839 systemd-fsck[800]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 14 00:35:37.159860 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 14 00:35:37.165553 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 14 00:35:37.281486 kernel: EXT4-fs (vda9): mounted filesystem 08e1a4ba-bbe3-4d29-aaf8-5eb22e9a9bf3 r/w with ordered data mode. Quota mode: none. Mar 14 00:35:37.283138 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 14 00:35:37.284662 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 14 00:35:37.292589 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 14 00:35:37.295587 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Mar 14 00:35:37.297561 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 14 00:35:37.303833 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 14 00:35:37.306832 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 14 00:35:37.323143 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/vda6 scanned by mount (808) Mar 14 00:35:37.323299 kernel: BTRFS info (device vda6): first mount of filesystem 0ec14b75-fea9-4657-9245-934c6406ae1a Mar 14 00:35:37.323325 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 14 00:35:37.323343 kernel: BTRFS info (device vda6): using free space tree Mar 14 00:35:37.323361 kernel: BTRFS info (device vda6): auto enabling async discard Mar 14 00:35:37.306876 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 14 00:35:37.318474 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 14 00:35:37.328699 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 14 00:35:37.332151 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 14 00:35:37.402353 initrd-setup-root[835]: cut: /sysroot/etc/passwd: No such file or directory Mar 14 00:35:37.410807 initrd-setup-root[843]: cut: /sysroot/etc/group: No such file or directory Mar 14 00:35:37.420122 initrd-setup-root[850]: cut: /sysroot/etc/shadow: No such file or directory Mar 14 00:35:37.427576 initrd-setup-root[857]: cut: /sysroot/etc/gshadow: No such file or directory Mar 14 00:35:37.531651 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 14 00:35:37.543628 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 14 00:35:37.548776 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Mar 14 00:35:37.559501 kernel: BTRFS info (device vda6): last unmount of filesystem 0ec14b75-fea9-4657-9245-934c6406ae1a Mar 14 00:35:37.590578 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 14 00:35:37.595824 ignition[924]: INFO : Ignition 2.19.0 Mar 14 00:35:37.597610 ignition[924]: INFO : Stage: mount Mar 14 00:35:37.597610 ignition[924]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:37.597610 ignition[924]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:37.601514 ignition[924]: INFO : mount: mount passed Mar 14 00:35:37.601514 ignition[924]: INFO : Ignition finished successfully Mar 14 00:35:37.603202 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 14 00:35:37.698485 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 14 00:35:38.180898 systemd-networkd[774]: eth0: Gained IPv6LL Mar 14 00:35:39.687354 systemd-networkd[774]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8cb7:24:19ff:fee6:32de/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8cb7:24:19ff:fee6:32de/64 assigned by NDisc. Mar 14 00:35:39.687372 systemd-networkd[774]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 14 00:35:44.471649 coreos-metadata[810]: Mar 14 00:35:44.471 WARN failed to locate config-drive, using the metadata service API instead Mar 14 00:35:44.496175 coreos-metadata[810]: Mar 14 00:35:44.496 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 14 00:35:44.510397 coreos-metadata[810]: Mar 14 00:35:44.510 INFO Fetch successful Mar 14 00:35:44.511294 coreos-metadata[810]: Mar 14 00:35:44.510 INFO wrote hostname srv-zkxct.gb1.brightbox.com to /sysroot/etc/hostname Mar 14 00:35:44.512929 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. 
Mar 14 00:35:44.513110 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Mar 14 00:35:44.521584 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 14 00:35:44.537683 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 14 00:35:44.556504 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 scanned by mount (941) Mar 14 00:35:44.564506 kernel: BTRFS info (device vda6): first mount of filesystem 0ec14b75-fea9-4657-9245-934c6406ae1a Mar 14 00:35:44.564587 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 14 00:35:44.564609 kernel: BTRFS info (device vda6): using free space tree Mar 14 00:35:44.568484 kernel: BTRFS info (device vda6): auto enabling async discard Mar 14 00:35:44.571839 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 14 00:35:44.602193 ignition[959]: INFO : Ignition 2.19.0 Mar 14 00:35:44.605163 ignition[959]: INFO : Stage: files Mar 14 00:35:44.605163 ignition[959]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:44.605163 ignition[959]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:44.605163 ignition[959]: DEBUG : files: compiled without relabeling support, skipping Mar 14 00:35:44.608423 ignition[959]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 14 00:35:44.608423 ignition[959]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 14 00:35:44.610958 ignition[959]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 14 00:35:44.611905 ignition[959]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 14 00:35:44.612847 unknown[959]: wrote ssh authorized keys file for user: core Mar 14 00:35:44.613835 ignition[959]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 14 00:35:44.615005 ignition[959]: INFO : files: 
createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 14 00:35:44.616186 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Mar 14 00:35:44.856772 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 14 00:35:45.208253 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 14 00:35:45.211489 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 14 00:35:45.224784 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 14 00:35:45.224784 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 14 00:35:45.224784 ignition[959]: INFO : files: 
createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 14 00:35:45.224784 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 14 00:35:45.224784 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 14 00:35:45.224784 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 14 00:35:45.224784 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-x86-64.raw: attempt #1 Mar 14 00:35:45.603756 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 14 00:35:47.028672 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 14 00:35:47.028672 ignition[959]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 14 00:35:47.031713 ignition[959]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 14 00:35:47.031713 ignition[959]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 14 00:35:47.031713 ignition[959]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 14 00:35:47.031713 ignition[959]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 14 00:35:47.031713 ignition[959]: INFO : files: op(d): [finished] setting preset to enabled for 
"prepare-helm.service" Mar 14 00:35:47.031713 ignition[959]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 14 00:35:47.031713 ignition[959]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 14 00:35:47.031713 ignition[959]: INFO : files: files passed Mar 14 00:35:47.031713 ignition[959]: INFO : Ignition finished successfully Mar 14 00:35:47.032949 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 14 00:35:47.044764 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 14 00:35:47.048648 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 14 00:35:47.052360 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 14 00:35:47.053352 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 14 00:35:47.075871 initrd-setup-root-after-ignition[987]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 14 00:35:47.075871 initrd-setup-root-after-ignition[987]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 14 00:35:47.079820 initrd-setup-root-after-ignition[991]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 14 00:35:47.082124 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 14 00:35:47.084693 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 14 00:35:47.091685 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 14 00:35:47.132221 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 14 00:35:47.132395 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 14 00:35:47.134102 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Mar 14 00:35:47.135108 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 14 00:35:47.136984 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 14 00:35:47.147156 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 14 00:35:47.167193 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 14 00:35:47.177711 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 14 00:35:47.190930 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 14 00:35:47.192973 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 14 00:35:47.194091 systemd[1]: Stopped target timers.target - Timer Units. Mar 14 00:35:47.195618 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 14 00:35:47.195872 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 14 00:35:47.197612 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 14 00:35:47.198666 systemd[1]: Stopped target basic.target - Basic System. Mar 14 00:35:47.200239 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 14 00:35:47.201961 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 14 00:35:47.203319 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 14 00:35:47.204977 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 14 00:35:47.206688 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 14 00:35:47.208212 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 14 00:35:47.209659 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 14 00:35:47.211289 systemd[1]: Stopped target swap.target - Swaps. 
Mar 14 00:35:47.212753 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 14 00:35:47.212955 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 14 00:35:47.214754 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 14 00:35:47.215792 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 14 00:35:47.217151 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 14 00:35:47.217545 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 14 00:35:47.218744 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 14 00:35:47.218924 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 14 00:35:47.221030 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 14 00:35:47.221195 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 14 00:35:47.223906 systemd[1]: ignition-files.service: Deactivated successfully. Mar 14 00:35:47.224061 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 14 00:35:47.230838 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 14 00:35:47.233763 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 14 00:35:47.235186 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 14 00:35:47.235386 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 14 00:35:47.240845 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 14 00:35:47.241020 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 14 00:35:47.249219 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 14 00:35:47.249408 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Mar 14 00:35:47.265501 ignition[1011]: INFO : Ignition 2.19.0 Mar 14 00:35:47.265501 ignition[1011]: INFO : Stage: umount Mar 14 00:35:47.273851 ignition[1011]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 14 00:35:47.273851 ignition[1011]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 14 00:35:47.273851 ignition[1011]: INFO : umount: umount passed Mar 14 00:35:47.273851 ignition[1011]: INFO : Ignition finished successfully Mar 14 00:35:47.270518 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 14 00:35:47.274310 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 14 00:35:47.274548 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 14 00:35:47.276265 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 14 00:35:47.276439 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 14 00:35:47.277809 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 14 00:35:47.277884 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 14 00:35:47.279150 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 14 00:35:47.279223 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 14 00:35:47.280540 systemd[1]: Stopped target network.target - Network. Mar 14 00:35:47.281782 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 14 00:35:47.281852 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 14 00:35:47.283235 systemd[1]: Stopped target paths.target - Path Units. Mar 14 00:35:47.284528 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 14 00:35:47.286611 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 14 00:35:47.287537 systemd[1]: Stopped target slices.target - Slice Units. Mar 14 00:35:47.288839 systemd[1]: Stopped target sockets.target - Socket Units. 
Mar 14 00:35:47.290313 systemd[1]: iscsid.socket: Deactivated successfully. Mar 14 00:35:47.290382 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 14 00:35:47.291658 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 14 00:35:47.291723 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 14 00:35:47.293112 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 14 00:35:47.293188 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 14 00:35:47.294694 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 14 00:35:47.294778 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 14 00:35:47.296277 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 14 00:35:47.298168 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 14 00:35:47.299970 systemd-networkd[774]: eth0: DHCPv6 lease lost Mar 14 00:35:47.303192 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 14 00:35:47.303400 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 14 00:35:47.308254 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 14 00:35:47.308467 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 14 00:35:47.311192 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 14 00:35:47.311552 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 14 00:35:47.319706 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 14 00:35:47.322088 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 14 00:35:47.322166 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 14 00:35:47.322976 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 14 00:35:47.323051 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. 
Mar 14 00:35:47.324980 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 14 00:35:47.325046 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 14 00:35:47.326392 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 14 00:35:47.326506 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 14 00:35:47.328195 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 14 00:35:47.340020 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 14 00:35:47.340276 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 14 00:35:47.344416 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 14 00:35:47.344695 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 14 00:35:47.347996 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 14 00:35:47.348098 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 14 00:35:47.349780 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 14 00:35:47.349849 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 14 00:35:47.351228 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 14 00:35:47.351301 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 14 00:35:47.353321 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 14 00:35:47.353425 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 14 00:35:47.354755 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 14 00:35:47.354829 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 14 00:35:47.363799 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Mar 14 00:35:47.365035 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 14 00:35:47.365118 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 14 00:35:47.365898 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 14 00:35:47.365964 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 14 00:35:47.366735 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 14 00:35:47.366798 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 14 00:35:47.368293 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 14 00:35:47.368369 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 14 00:35:47.375698 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 14 00:35:47.375897 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 14 00:35:47.397624 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 14 00:35:47.397827 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 14 00:35:47.399679 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 14 00:35:47.400746 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 14 00:35:47.400817 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 14 00:35:47.408699 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 14 00:35:47.418847 systemd[1]: Switching root. Mar 14 00:35:47.461541 systemd-journald[203]: Received SIGTERM from PID 1 (systemd). 
Mar 14 00:35:47.461665 systemd-journald[203]: Journal stopped Mar 14 00:35:48.908728 kernel: SELinux: policy capability network_peer_controls=1 Mar 14 00:35:48.909973 kernel: SELinux: policy capability open_perms=1 Mar 14 00:35:48.910004 kernel: SELinux: policy capability extended_socket_class=1 Mar 14 00:35:48.910028 kernel: SELinux: policy capability always_check_network=0 Mar 14 00:35:48.910047 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 14 00:35:48.910071 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 14 00:35:48.910095 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 14 00:35:48.910129 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 14 00:35:48.910149 kernel: audit: type=1403 audit(1773448547.692:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 14 00:35:48.910176 systemd[1]: Successfully loaded SELinux policy in 48.128ms. Mar 14 00:35:48.910209 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 20.806ms. Mar 14 00:35:48.910248 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 14 00:35:48.910269 systemd[1]: Detected virtualization kvm. Mar 14 00:35:48.910287 systemd[1]: Detected architecture x86-64. Mar 14 00:35:48.910314 systemd[1]: Detected first boot. Mar 14 00:35:48.910346 systemd[1]: Hostname set to . Mar 14 00:35:48.910388 systemd[1]: Initializing machine ID from VM UUID. Mar 14 00:35:48.910408 zram_generator::config[1055]: No configuration found. Mar 14 00:35:48.910428 systemd[1]: Populated /etc with preset unit settings. Mar 14 00:35:48.911489 systemd[1]: initrd-switch-root.service: Deactivated successfully. 
Mar 14 00:35:48.911527 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 14 00:35:48.911550 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 14 00:35:48.911572 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 14 00:35:48.911624 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 14 00:35:48.911655 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 14 00:35:48.911675 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 14 00:35:48.911695 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 14 00:35:48.911715 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 14 00:35:48.911741 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 14 00:35:48.911762 systemd[1]: Created slice user.slice - User and Session Slice. Mar 14 00:35:48.911782 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 14 00:35:48.911802 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 14 00:35:48.911834 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 14 00:35:48.911865 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 14 00:35:48.911885 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 14 00:35:48.911906 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 14 00:35:48.911926 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 14 00:35:48.911945 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Mar 14 00:35:48.911965 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 14 00:35:48.911984 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 14 00:35:48.912016 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 14 00:35:48.912050 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 14 00:35:48.912069 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 14 00:35:48.912090 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 14 00:35:48.912120 systemd[1]: Reached target slices.target - Slice Units. Mar 14 00:35:48.912138 systemd[1]: Reached target swap.target - Swaps. Mar 14 00:35:48.912155 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 14 00:35:48.912195 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 14 00:35:48.912215 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 14 00:35:48.912233 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 14 00:35:48.912274 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 14 00:35:48.912310 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 14 00:35:48.912344 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 14 00:35:48.912373 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 14 00:35:48.912397 systemd[1]: Mounting media.mount - External Media Directory... Mar 14 00:35:48.912428 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:48.914502 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 14 00:35:48.914547 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... 
Mar 14 00:35:48.914570 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 14 00:35:48.914591 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 14 00:35:48.914611 systemd[1]: Reached target machines.target - Containers. Mar 14 00:35:48.914645 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 14 00:35:48.914667 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 14 00:35:48.914687 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 14 00:35:48.914707 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 14 00:35:48.914726 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 14 00:35:48.914748 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 14 00:35:48.914767 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 14 00:35:48.914786 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 14 00:35:48.914815 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 14 00:35:48.914847 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 14 00:35:48.914878 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 14 00:35:48.914898 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 14 00:35:48.914918 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 14 00:35:48.914938 systemd[1]: Stopped systemd-fsck-usr.service. Mar 14 00:35:48.914973 systemd[1]: Starting systemd-journald.service - Journal Service... 
Mar 14 00:35:48.914992 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 14 00:35:48.915011 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 14 00:35:48.915029 kernel: loop: module loaded Mar 14 00:35:48.915072 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 14 00:35:48.915101 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 14 00:35:48.915132 systemd[1]: verity-setup.service: Deactivated successfully. Mar 14 00:35:48.915159 systemd[1]: Stopped verity-setup.service. Mar 14 00:35:48.915191 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:48.915209 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 14 00:35:48.915228 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 14 00:35:48.915270 systemd[1]: Mounted media.mount - External Media Directory. Mar 14 00:35:48.915291 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 14 00:35:48.915324 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 14 00:35:48.915343 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 14 00:35:48.915362 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 14 00:35:48.917474 systemd-journald[1151]: Collecting audit messages is disabled. Mar 14 00:35:48.917538 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 14 00:35:48.917563 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 14 00:35:48.917583 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 14 00:35:48.917603 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Mar 14 00:35:48.917636 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 14 00:35:48.917658 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 14 00:35:48.917678 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 14 00:35:48.917699 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 14 00:35:48.917726 systemd-journald[1151]: Journal started Mar 14 00:35:48.917763 systemd-journald[1151]: Runtime Journal (/run/log/journal/b7038b4291a040b69a267862903b4f56) is 4.7M, max 38.0M, 33.2M free. Mar 14 00:35:48.499758 systemd[1]: Queued start job for default target multi-user.target. Mar 14 00:35:48.522645 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 14 00:35:48.523264 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 14 00:35:48.920622 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 14 00:35:48.924895 systemd[1]: Started systemd-journald.service - Journal Service. Mar 14 00:35:48.924870 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 14 00:35:48.926313 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 14 00:35:48.927514 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 14 00:35:48.950864 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 14 00:35:48.956869 kernel: ACPI: bus type drm_connector registered Mar 14 00:35:48.956947 kernel: fuse: init (API version 7.39) Mar 14 00:35:48.960597 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 14 00:35:48.961417 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 14 00:35:48.961489 systemd[1]: Reached target local-fs.target - Local File Systems. 
Mar 14 00:35:48.963763 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Mar 14 00:35:48.969729 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 14 00:35:48.975591 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 14 00:35:48.976562 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 14 00:35:48.984669 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 14 00:35:48.987803 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 14 00:35:48.992577 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 14 00:35:48.994761 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 14 00:35:49.000112 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 14 00:35:49.005658 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 14 00:35:49.017805 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 14 00:35:49.027732 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 14 00:35:49.032418 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 14 00:35:49.034755 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 14 00:35:49.036162 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 14 00:35:49.037077 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 14 00:35:49.039310 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
Mar 14 00:35:49.040985 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 14 00:35:49.065573 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 14 00:35:49.084497 systemd-journald[1151]: Time spent on flushing to /var/log/journal/b7038b4291a040b69a267862903b4f56 is 123.153ms for 1138 entries. Mar 14 00:35:49.084497 systemd-journald[1151]: System Journal (/var/log/journal/b7038b4291a040b69a267862903b4f56) is 8.0M, max 584.8M, 576.8M free. Mar 14 00:35:49.280848 systemd-journald[1151]: Received client request to flush runtime journal. Mar 14 00:35:49.280930 kernel: loop0: detected capacity change from 0 to 8 Mar 14 00:35:49.280968 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 14 00:35:49.281010 kernel: loop1: detected capacity change from 0 to 140768 Mar 14 00:35:49.281042 kernel: loop2: detected capacity change from 0 to 142488 Mar 14 00:35:49.089013 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 14 00:35:49.094399 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 14 00:35:49.097360 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 14 00:35:49.110562 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Mar 14 00:35:49.113695 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 14 00:35:49.169917 systemd-tmpfiles[1186]: ACLs are not supported, ignoring. Mar 14 00:35:49.169938 systemd-tmpfiles[1186]: ACLs are not supported, ignoring. Mar 14 00:35:49.220602 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 14 00:35:49.230787 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 14 00:35:49.234738 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. 
Mar 14 00:35:49.235753 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Mar 14 00:35:49.284143 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 14 00:35:49.326479 kernel: loop3: detected capacity change from 0 to 217752 Mar 14 00:35:49.370610 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 14 00:35:49.389808 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 14 00:35:49.394352 kernel: loop4: detected capacity change from 0 to 8 Mar 14 00:35:49.392018 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 14 00:35:49.411305 kernel: loop5: detected capacity change from 0 to 140768 Mar 14 00:35:49.410013 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 14 00:35:49.423108 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Mar 14 00:35:49.423146 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Mar 14 00:35:49.435333 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 14 00:35:49.438829 udevadm[1215]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 14 00:35:49.452480 kernel: loop6: detected capacity change from 0 to 142488 Mar 14 00:35:49.482480 kernel: loop7: detected capacity change from 0 to 217752 Mar 14 00:35:49.504744 (sd-merge)[1212]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Mar 14 00:35:49.507472 (sd-merge)[1212]: Merged extensions into '/usr'. Mar 14 00:35:49.516602 systemd[1]: Reloading requested from client PID 1185 ('systemd-sysext') (unit systemd-sysext.service)... Mar 14 00:35:49.516625 systemd[1]: Reloading... Mar 14 00:35:49.626489 zram_generator::config[1238]: No configuration found. 
Mar 14 00:35:49.762707 ldconfig[1180]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 14 00:35:49.922366 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 14 00:35:49.990203 systemd[1]: Reloading finished in 472 ms. Mar 14 00:35:50.021955 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 14 00:35:50.023563 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 14 00:35:50.036717 systemd[1]: Starting ensure-sysext.service... Mar 14 00:35:50.047761 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 14 00:35:50.079120 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 14 00:35:50.081817 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 14 00:35:50.083249 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 14 00:35:50.083703 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Mar 14 00:35:50.083839 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Mar 14 00:35:50.089583 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot. Mar 14 00:35:50.089599 systemd-tmpfiles[1299]: Skipping /boot Mar 14 00:35:50.089615 systemd[1]: Reloading requested from client PID 1298 ('systemctl') (unit ensure-sysext.service)... Mar 14 00:35:50.089634 systemd[1]: Reloading... Mar 14 00:35:50.108581 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot. 
Mar 14 00:35:50.108600 systemd-tmpfiles[1299]: Skipping /boot Mar 14 00:35:50.189480 zram_generator::config[1326]: No configuration found. Mar 14 00:35:50.359128 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 14 00:35:50.427411 systemd[1]: Reloading finished in 337 ms. Mar 14 00:35:50.449275 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 14 00:35:50.456011 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 14 00:35:50.478810 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 14 00:35:50.485739 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 14 00:35:50.489698 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 14 00:35:50.500675 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 14 00:35:50.506345 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 14 00:35:50.517714 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 14 00:35:50.527482 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:50.527756 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 14 00:35:50.537788 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 14 00:35:50.541218 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 14 00:35:50.556842 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Mar 14 00:35:50.558653 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 14 00:35:50.558815 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:50.562233 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:50.563994 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 14 00:35:50.564241 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 14 00:35:50.564391 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:50.567720 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 14 00:35:50.567984 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 14 00:35:50.585446 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:50.585889 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 14 00:35:50.600910 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 14 00:35:50.607832 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 14 00:35:50.609713 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 14 00:35:50.615817 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Mar 14 00:35:50.616553 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 14 00:35:50.619521 systemd[1]: Finished ensure-sysext.service. Mar 14 00:35:50.621515 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 14 00:35:50.623164 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 14 00:35:50.623354 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 14 00:35:50.624248 augenrules[1411]: No rules Mar 14 00:35:50.626090 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Mar 14 00:35:50.630808 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 14 00:35:50.631039 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 14 00:35:50.633327 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 14 00:35:50.633615 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 14 00:35:50.635250 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 14 00:35:50.636554 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 14 00:35:50.644323 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 14 00:35:50.654875 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 14 00:35:50.654978 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 14 00:35:50.669699 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... 
Mar 14 00:35:50.670555 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 14 00:35:50.671088 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 14 00:35:50.680705 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 14 00:35:50.687604 systemd-udevd[1395]: Using default interface naming scheme 'v255'. Mar 14 00:35:50.693843 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 14 00:35:50.720942 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 14 00:35:50.733636 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 14 00:35:50.744683 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 14 00:35:50.836204 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 14 00:35:50.837251 systemd[1]: Reached target time-set.target - System Time Set. Mar 14 00:35:50.863673 systemd-resolved[1392]: Positive Trust Anchors: Mar 14 00:35:50.863703 systemd-resolved[1392]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 14 00:35:50.863748 systemd-resolved[1392]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 14 00:35:50.874216 systemd-resolved[1392]: Using system hostname 'srv-zkxct.gb1.brightbox.com'. 
Mar 14 00:35:50.878309 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 14 00:35:50.879261 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 14 00:35:50.889944 systemd-networkd[1436]: lo: Link UP Mar 14 00:35:50.890361 systemd-networkd[1436]: lo: Gained carrier Mar 14 00:35:50.892068 systemd-networkd[1436]: Enumeration completed Mar 14 00:35:50.892322 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 14 00:35:50.893674 systemd[1]: Reached target network.target - Network. Mar 14 00:35:50.905732 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 14 00:35:50.957840 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 14 00:35:50.987482 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 34 scanned by (udev-worker) (1447) Mar 14 00:35:51.004739 systemd-networkd[1436]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 14 00:35:51.004920 systemd-networkd[1436]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 14 00:35:51.006251 systemd-networkd[1436]: eth0: Link UP Mar 14 00:35:51.006262 systemd-networkd[1436]: eth0: Gained carrier Mar 14 00:35:51.006279 systemd-networkd[1436]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 14 00:35:51.022583 systemd-networkd[1436]: eth0: DHCPv4 address 10.230.50.222/30, gateway 10.230.50.221 acquired from 10.230.50.221 Mar 14 00:35:51.027543 systemd-timesyncd[1425]: Network configuration changed, trying to establish connection. 
Mar 14 00:35:51.058494 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Mar 14 00:35:51.082476 kernel: ACPI: button: Power Button [PWRF] Mar 14 00:35:51.082593 kernel: mousedev: PS/2 mouse device common for all mice Mar 14 00:35:51.104926 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 14 00:35:51.113993 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 14 00:35:51.138831 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 14 00:35:51.148528 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 14 00:35:51.153437 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Mar 14 00:35:51.153758 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 14 00:35:51.171205 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Mar 14 00:35:51.235907 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 14 00:35:51.438744 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 14 00:35:51.454676 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 14 00:35:51.456250 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 14 00:35:51.475820 lvm[1472]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 14 00:35:51.510474 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 14 00:35:51.511960 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 14 00:35:51.512967 systemd[1]: Reached target sysinit.target - System Initialization. Mar 14 00:35:51.514072 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Mar 14 00:35:51.515112 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 14 00:35:51.516427 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 14 00:35:51.517631 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 14 00:35:51.518585 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 14 00:35:51.519492 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 14 00:35:51.519651 systemd[1]: Reached target paths.target - Path Units. Mar 14 00:35:51.520393 systemd[1]: Reached target timers.target - Timer Units. Mar 14 00:35:51.522570 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 14 00:35:51.525542 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 14 00:35:51.531179 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 14 00:35:51.533895 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 14 00:35:51.535351 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 14 00:35:51.536224 systemd[1]: Reached target sockets.target - Socket Units. Mar 14 00:35:51.536910 systemd[1]: Reached target basic.target - Basic System. Mar 14 00:35:51.537675 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 14 00:35:51.537724 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 14 00:35:51.545586 systemd[1]: Starting containerd.service - containerd container runtime... Mar 14 00:35:51.551651 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 14 00:35:51.554346 lvm[1477]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 14 00:35:51.563697 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 14 00:35:51.566294 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 14 00:35:51.578598 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 14 00:35:51.579402 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 14 00:35:51.585687 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 14 00:35:51.591675 jq[1481]: false Mar 14 00:35:51.596570 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 14 00:35:51.604674 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 14 00:35:51.607224 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 14 00:35:51.607626 extend-filesystems[1482]: Found loop4 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found loop5 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found loop6 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found loop7 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda1 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda2 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda3 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found usr Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda4 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda6 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda7 Mar 14 00:35:51.620930 extend-filesystems[1482]: Found vda9 Mar 14 00:35:51.620930 extend-filesystems[1482]: Checking size of /dev/vda9 Mar 14 00:35:51.627645 systemd[1]: Starting systemd-logind.service - User Login Management... 
Mar 14 00:35:51.633750 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 14 00:35:51.641688 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 14 00:35:51.644033 systemd[1]: Starting update-engine.service - Update Engine... Mar 14 00:35:51.650596 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 14 00:35:51.652967 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 14 00:35:51.673125 dbus-daemon[1480]: [system] SELinux support is enabled Mar 14 00:35:51.668227 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 14 00:35:51.668552 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 14 00:35:51.675099 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 14 00:35:51.693121 update_engine[1493]: I20260314 00:35:51.685901 1493 main.cc:92] Flatcar Update Engine starting Mar 14 00:35:51.693121 update_engine[1493]: I20260314 00:35:51.688536 1493 update_check_scheduler.cc:74] Next update check in 7m3s Mar 14 00:35:51.680204 dbus-daemon[1480]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1436 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 14 00:35:51.694640 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Mar 14 00:35:51.695394 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 14 00:35:51.695543 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 14 00:35:51.697822 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 14 00:35:51.697853 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 14 00:35:51.706625 jq[1494]: true Mar 14 00:35:51.701333 systemd[1]: Started update-engine.service - Update Engine. Mar 14 00:35:51.719229 (ntainerd)[1504]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 14 00:35:51.722691 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 14 00:35:51.727662 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 14 00:35:51.739497 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 14 00:35:51.739789 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 14 00:35:51.759981 extend-filesystems[1482]: Resized partition /dev/vda9 Mar 14 00:35:51.764294 extend-filesystems[1515]: resize2fs 1.47.1 (20-May-2024) Mar 14 00:35:51.805716 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Mar 14 00:35:51.811528 tar[1496]: linux-amd64/LICENSE Mar 14 00:35:51.811528 tar[1496]: linux-amd64/helm Mar 14 00:35:51.811918 jq[1510]: true Mar 14 00:35:51.850888 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 34 scanned by (udev-worker) (1446) Mar 14 00:35:51.849637 systemd[1]: motdgen.service: Deactivated successfully. Mar 14 00:35:51.849958 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Mar 14 00:35:51.969249 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 14 00:35:51.969899 dbus-daemon[1480]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=1507 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 14 00:35:51.978652 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 14 00:35:51.996496 systemd[1]: Starting polkit.service - Authorization Manager... Mar 14 00:35:52.014157 systemd-logind[1491]: Watching system buttons on /dev/input/event2 (Power Button) Mar 14 00:35:52.014220 systemd-logind[1491]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 14 00:35:52.014614 systemd-logind[1491]: New seat seat0. Mar 14 00:35:52.020402 systemd[1]: Started systemd-logind.service - User Login Management. Mar 14 00:35:52.027772 polkitd[1539]: Started polkitd version 121 Mar 14 00:35:52.065480 polkitd[1539]: Loading rules from directory /etc/polkit-1/rules.d Mar 14 00:35:52.070391 polkitd[1539]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 14 00:35:52.071801 polkitd[1539]: Finished loading, compiling and executing 2 rules Mar 14 00:35:52.074107 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 14 00:35:52.079037 polkitd[1539]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 14 00:35:52.074582 systemd[1]: Started polkit.service - Authorization Manager. Mar 14 00:35:52.092951 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 14 00:35:52.098685 bash[1538]: Updated "/home/core/.ssh/authorized_keys" Mar 14 00:35:52.101257 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 14 00:35:52.113787 systemd[1]: Starting sshkeys.service... 
Mar 14 00:35:52.119583 locksmithd[1508]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 14 00:35:52.134558 systemd-hostnamed[1507]: Hostname set to (static) Mar 14 00:35:52.163345 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 14 00:35:52.176805 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 14 00:35:52.190696 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Mar 14 00:35:52.213374 extend-filesystems[1515]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 14 00:35:52.213374 extend-filesystems[1515]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 14 00:35:52.213374 extend-filesystems[1515]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Mar 14 00:35:52.217929 extend-filesystems[1482]: Resized filesystem in /dev/vda9 Mar 14 00:35:52.218286 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 14 00:35:52.219548 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 14 00:35:52.260607 systemd-networkd[1436]: eth0: Gained IPv6LL Mar 14 00:35:52.262186 systemd-timesyncd[1425]: Network configuration changed, trying to establish connection. Mar 14 00:35:52.264071 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 14 00:35:52.269735 systemd[1]: Reached target network-online.target - Network is Online. Mar 14 00:35:52.277849 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 14 00:35:52.286844 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 14 00:35:52.290606 containerd[1504]: time="2026-03-14T00:35:52.290480370Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Mar 14 00:35:52.389227 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Mar 14 00:35:52.409898 containerd[1504]: time="2026-03-14T00:35:52.408611122Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.415567 containerd[1504]: time="2026-03-14T00:35:52.415514341Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 14 00:35:52.416224 containerd[1504]: time="2026-03-14T00:35:52.416197162Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 14 00:35:52.416336 containerd[1504]: time="2026-03-14T00:35:52.416301369Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 14 00:35:52.416766 containerd[1504]: time="2026-03-14T00:35:52.416737631Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 14 00:35:52.417521 containerd[1504]: time="2026-03-14T00:35:52.417493365Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.417720 containerd[1504]: time="2026-03-14T00:35:52.417691609Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 14 00:35:52.417823 containerd[1504]: time="2026-03-14T00:35:52.417789218Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.418292 containerd[1504]: time="2026-03-14T00:35:52.418251391Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.419318977Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.419377023Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.419398365Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.419551723Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.419978515Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.420136544Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.420171827Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 14 00:35:52.420929 containerd[1504]: time="2026-03-14T00:35:52.420333755Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Mar 14 00:35:52.422550 containerd[1504]: time="2026-03-14T00:35:52.420444439Z" level=info msg="metadata content store policy set" policy=shared Mar 14 00:35:52.428571 containerd[1504]: time="2026-03-14T00:35:52.428540972Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 14 00:35:52.429064 containerd[1504]: time="2026-03-14T00:35:52.429037389Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 14 00:35:52.429245 containerd[1504]: time="2026-03-14T00:35:52.429209944Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 14 00:35:52.429361 containerd[1504]: time="2026-03-14T00:35:52.429329184Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 14 00:35:52.429580 containerd[1504]: time="2026-03-14T00:35:52.429552722Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 14 00:35:52.430153 containerd[1504]: time="2026-03-14T00:35:52.430116137Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 14 00:35:52.431325 containerd[1504]: time="2026-03-14T00:35:52.431298468Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 14 00:35:52.432141 containerd[1504]: time="2026-03-14T00:35:52.432114854Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 14 00:35:52.432263 containerd[1504]: time="2026-03-14T00:35:52.432227454Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 14 00:35:52.432384 containerd[1504]: time="2026-03-14T00:35:52.432340402Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." 
type=io.containerd.sandbox.controller.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433401634Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433436928Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433458348Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433497533Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433551887Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433574799Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433593188Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433619504Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433679175Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433702132Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." 
type=io.containerd.grpc.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433720175Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433740103Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433779115Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434197 containerd[1504]: time="2026-03-14T00:35:52.433830371Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.433850850Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.433910496Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.433938117Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.433970246Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.433993705Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.434011843Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.434031570Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.434053496Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.434086701Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.434106698Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.434690 containerd[1504]: time="2026-03-14T00:35:52.434123045Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.435710073Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436165394Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436191714Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436223994Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436238793Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436257074Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436286445Z" level=info msg="NRI interface is disabled by configuration." Mar 14 00:35:52.438397 containerd[1504]: time="2026-03-14T00:35:52.436309885Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Mar 14 00:35:52.438707 containerd[1504]: time="2026-03-14T00:35:52.436797325Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true 
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 14 00:35:52.438707 containerd[1504]: time="2026-03-14T00:35:52.436888890Z" level=info msg="Connect containerd service" Mar 14 00:35:52.438707 containerd[1504]: time="2026-03-14T00:35:52.436941222Z" level=info msg="using legacy CRI server" Mar 14 00:35:52.438707 containerd[1504]: time="2026-03-14T00:35:52.436957009Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 14 00:35:52.438707 containerd[1504]: time="2026-03-14T00:35:52.437142017Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 14 00:35:52.441948 containerd[1504]: time="2026-03-14T00:35:52.441911636Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 14 00:35:52.444556 containerd[1504]: time="2026-03-14T00:35:52.444506284Z" level=info msg="Start subscribing containerd event" Mar 14 
00:35:52.446789 containerd[1504]: time="2026-03-14T00:35:52.446719337Z" level=info msg="Start recovering state" Mar 14 00:35:52.447337 containerd[1504]: time="2026-03-14T00:35:52.447309279Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 14 00:35:52.447853 containerd[1504]: time="2026-03-14T00:35:52.447817091Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 14 00:35:52.448032 containerd[1504]: time="2026-03-14T00:35:52.447541753Z" level=info msg="Start event monitor" Mar 14 00:35:52.448094 containerd[1504]: time="2026-03-14T00:35:52.448034916Z" level=info msg="Start snapshots syncer" Mar 14 00:35:52.448094 containerd[1504]: time="2026-03-14T00:35:52.448057099Z" level=info msg="Start cni network conf syncer for default" Mar 14 00:35:52.448094 containerd[1504]: time="2026-03-14T00:35:52.448089929Z" level=info msg="Start streaming server" Mar 14 00:35:52.457606 containerd[1504]: time="2026-03-14T00:35:52.454502641Z" level=info msg="containerd successfully booted in 0.168463s" Mar 14 00:35:52.455303 systemd[1]: Started containerd.service - containerd container runtime. Mar 14 00:35:52.640498 sshd_keygen[1516]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 14 00:35:52.682441 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 14 00:35:52.692870 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 14 00:35:52.702480 systemd[1]: Started sshd@0-10.230.50.222:22-20.161.92.111:33606.service - OpenSSH per-connection server daemon (20.161.92.111:33606). Mar 14 00:35:52.718392 systemd[1]: issuegen.service: Deactivated successfully. Mar 14 00:35:52.718720 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 14 00:35:52.729783 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 14 00:35:52.764041 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Mar 14 00:35:52.777547 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 14 00:35:52.786946 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 14 00:35:52.788026 systemd[1]: Reached target getty.target - Login Prompts. Mar 14 00:35:52.962442 tar[1496]: linux-amd64/README.md Mar 14 00:35:52.978190 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 14 00:35:53.296251 sshd[1588]: Accepted publickey for core from 20.161.92.111 port 33606 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ Mar 14 00:35:53.298363 sshd[1588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 14 00:35:53.319640 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 14 00:35:53.330982 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 14 00:35:53.341831 systemd-logind[1491]: New session 1 of user core. Mar 14 00:35:53.355824 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 14 00:35:53.366936 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 14 00:35:53.378311 (systemd)[1604]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 14 00:35:53.518354 systemd[1604]: Queued start job for default target default.target. Mar 14 00:35:53.521369 systemd[1604]: Created slice app.slice - User Application Slice. Mar 14 00:35:53.521543 systemd[1604]: Reached target paths.target - Paths. Mar 14 00:35:53.521571 systemd[1604]: Reached target timers.target - Timers. Mar 14 00:35:53.525006 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 14 00:35:53.525560 systemd[1604]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Mar 14 00:35:53.535358 (kubelet)[1615]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 14 00:35:53.545127 systemd[1604]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 14 00:35:53.545355 systemd[1604]: Reached target sockets.target - Sockets. Mar 14 00:35:53.545381 systemd[1604]: Reached target basic.target - Basic System. Mar 14 00:35:53.545636 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 14 00:35:53.547525 systemd[1604]: Reached target default.target - Main User Target. Mar 14 00:35:53.547594 systemd[1604]: Startup finished in 158ms. Mar 14 00:35:53.557167 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 14 00:35:53.766631 systemd-timesyncd[1425]: Network configuration changed, trying to establish connection. Mar 14 00:35:53.769328 systemd-networkd[1436]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8cb7:24:19ff:fee6:32de/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8cb7:24:19ff:fee6:32de/64 assigned by NDisc. Mar 14 00:35:53.769346 systemd-networkd[1436]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 14 00:35:53.975740 systemd[1]: Started sshd@1-10.230.50.222:22-20.161.92.111:36180.service - OpenSSH per-connection server daemon (20.161.92.111:36180). Mar 14 00:35:54.104100 kubelet[1615]: E0314 00:35:54.103794 1615 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 14 00:35:54.107098 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 14 00:35:54.107372 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 14 00:35:54.525256 sshd[1627]: Accepted publickey for core from 20.161.92.111 port 36180 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ Mar 14 00:35:54.527510 sshd[1627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 14 00:35:54.536518 systemd-logind[1491]: New session 2 of user core. Mar 14 00:35:54.546792 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 14 00:35:54.920831 sshd[1627]: pam_unix(sshd:session): session closed for user core Mar 14 00:35:54.926529 systemd-logind[1491]: Session 2 logged out. Waiting for processes to exit. Mar 14 00:35:54.928118 systemd[1]: sshd@1-10.230.50.222:22-20.161.92.111:36180.service: Deactivated successfully. Mar 14 00:35:54.931049 systemd[1]: session-2.scope: Deactivated successfully. Mar 14 00:35:54.932864 systemd-logind[1491]: Removed session 2. Mar 14 00:35:55.013651 systemd-timesyncd[1425]: Network configuration changed, trying to establish connection. Mar 14 00:35:55.024882 systemd[1]: Started sshd@2-10.230.50.222:22-20.161.92.111:36186.service - OpenSSH per-connection server daemon (20.161.92.111:36186). Mar 14 00:35:55.586413 sshd[1636]: Accepted publickey for core from 20.161.92.111 port 36186 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ Mar 14 00:35:55.589306 sshd[1636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 14 00:35:55.595585 systemd-logind[1491]: New session 3 of user core. Mar 14 00:35:55.602741 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 14 00:35:55.994308 sshd[1636]: pam_unix(sshd:session): session closed for user core Mar 14 00:35:55.999813 systemd[1]: sshd@2-10.230.50.222:22-20.161.92.111:36186.service: Deactivated successfully. Mar 14 00:35:56.002126 systemd[1]: session-3.scope: Deactivated successfully. Mar 14 00:35:56.004021 systemd-logind[1491]: Session 3 logged out. Waiting for processes to exit. Mar 14 00:35:56.005707 systemd-logind[1491]: Removed session 3. 
Mar 14 00:35:57.847083 login[1597]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 14 00:35:57.851887 login[1596]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 14 00:35:57.859083 systemd-logind[1491]: New session 4 of user core. Mar 14 00:35:57.873956 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 14 00:35:57.878417 systemd-logind[1491]: New session 5 of user core. Mar 14 00:35:57.885878 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 14 00:35:58.798893 coreos-metadata[1479]: Mar 14 00:35:58.798 WARN failed to locate config-drive, using the metadata service API instead Mar 14 00:35:58.826289 coreos-metadata[1479]: Mar 14 00:35:58.825 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 14 00:35:58.832576 coreos-metadata[1479]: Mar 14 00:35:58.832 INFO Fetch failed with 404: resource not found Mar 14 00:35:58.832576 coreos-metadata[1479]: Mar 14 00:35:58.832 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 14 00:35:58.833415 coreos-metadata[1479]: Mar 14 00:35:58.833 INFO Fetch successful Mar 14 00:35:58.833415 coreos-metadata[1479]: Mar 14 00:35:58.833 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 14 00:35:58.846616 coreos-metadata[1479]: Mar 14 00:35:58.846 INFO Fetch successful Mar 14 00:35:58.846616 coreos-metadata[1479]: Mar 14 00:35:58.846 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 14 00:35:58.862567 coreos-metadata[1479]: Mar 14 00:35:58.862 INFO Fetch successful Mar 14 00:35:58.862567 coreos-metadata[1479]: Mar 14 00:35:58.862 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 14 00:35:58.879021 coreos-metadata[1479]: Mar 14 00:35:58.878 INFO Fetch successful Mar 14 00:35:58.879243 coreos-metadata[1479]: Mar 14 00:35:58.879 INFO Fetching 
http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 14 00:35:58.899446 coreos-metadata[1479]: Mar 14 00:35:58.899 INFO Fetch successful Mar 14 00:35:58.924652 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 14 00:35:58.926650 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 14 00:35:59.343519 coreos-metadata[1560]: Mar 14 00:35:59.342 WARN failed to locate config-drive, using the metadata service API instead Mar 14 00:35:59.365549 coreos-metadata[1560]: Mar 14 00:35:59.365 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 14 00:35:59.390626 coreos-metadata[1560]: Mar 14 00:35:59.390 INFO Fetch successful Mar 14 00:35:59.390810 coreos-metadata[1560]: Mar 14 00:35:59.390 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 14 00:35:59.417548 coreos-metadata[1560]: Mar 14 00:35:59.417 INFO Fetch successful Mar 14 00:35:59.420206 unknown[1560]: wrote ssh authorized keys file for user: core Mar 14 00:35:59.440669 update-ssh-keys[1679]: Updated "/home/core/.ssh/authorized_keys" Mar 14 00:35:59.442430 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 14 00:35:59.445010 systemd[1]: Finished sshkeys.service. Mar 14 00:35:59.448869 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 14 00:35:59.449313 systemd[1]: Startup finished in 1.499s (kernel) + 13.947s (initrd) + 11.804s (userspace) = 27.250s. Mar 14 00:36:04.357986 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 14 00:36:04.369046 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 14 00:36:04.605679 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 14 00:36:04.606957 (kubelet)[1690]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 14 00:36:04.660596 kubelet[1690]: E0314 00:36:04.660216 1690 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 14 00:36:04.665167 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 14 00:36:04.665427 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 14 00:36:06.097794 systemd[1]: Started sshd@3-10.230.50.222:22-20.161.92.111:49430.service - OpenSSH per-connection server daemon (20.161.92.111:49430). Mar 14 00:36:06.657612 sshd[1698]: Accepted publickey for core from 20.161.92.111 port 49430 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ Mar 14 00:36:06.659163 sshd[1698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 14 00:36:06.665624 systemd-logind[1491]: New session 6 of user core. Mar 14 00:36:06.674999 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 14 00:36:07.055306 sshd[1698]: pam_unix(sshd:session): session closed for user core Mar 14 00:36:07.060318 systemd-logind[1491]: Session 6 logged out. Waiting for processes to exit. Mar 14 00:36:07.061666 systemd[1]: sshd@3-10.230.50.222:22-20.161.92.111:49430.service: Deactivated successfully. Mar 14 00:36:07.064070 systemd[1]: session-6.scope: Deactivated successfully. Mar 14 00:36:07.066289 systemd-logind[1491]: Removed session 6. Mar 14 00:36:07.162824 systemd[1]: Started sshd@4-10.230.50.222:22-20.161.92.111:49440.service - OpenSSH per-connection server daemon (20.161.92.111:49440). 
Mar 14 00:36:07.714470 sshd[1705]: Accepted publickey for core from 20.161.92.111 port 49440 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:36:07.715389 sshd[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:36:07.723253 systemd-logind[1491]: New session 7 of user core.
Mar 14 00:36:07.728720 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 14 00:36:08.103602 sshd[1705]: pam_unix(sshd:session): session closed for user core
Mar 14 00:36:08.110075 systemd[1]: sshd@4-10.230.50.222:22-20.161.92.111:49440.service: Deactivated successfully.
Mar 14 00:36:08.112175 systemd[1]: session-7.scope: Deactivated successfully.
Mar 14 00:36:08.113099 systemd-logind[1491]: Session 7 logged out. Waiting for processes to exit.
Mar 14 00:36:08.114654 systemd-logind[1491]: Removed session 7.
Mar 14 00:36:08.214871 systemd[1]: Started sshd@5-10.230.50.222:22-20.161.92.111:49450.service - OpenSSH per-connection server daemon (20.161.92.111:49450).
Mar 14 00:36:08.760722 sshd[1712]: Accepted publickey for core from 20.161.92.111 port 49450 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:36:08.763439 sshd[1712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:36:08.769682 systemd-logind[1491]: New session 8 of user core.
Mar 14 00:36:08.782963 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 14 00:36:09.156455 sshd[1712]: pam_unix(sshd:session): session closed for user core
Mar 14 00:36:09.163023 systemd[1]: sshd@5-10.230.50.222:22-20.161.92.111:49450.service: Deactivated successfully.
Mar 14 00:36:09.165577 systemd[1]: session-8.scope: Deactivated successfully.
Mar 14 00:36:09.166737 systemd-logind[1491]: Session 8 logged out. Waiting for processes to exit.
Mar 14 00:36:09.168380 systemd-logind[1491]: Removed session 8.
Mar 14 00:36:09.261791 systemd[1]: Started sshd@6-10.230.50.222:22-20.161.92.111:49454.service - OpenSSH per-connection server daemon (20.161.92.111:49454).
Mar 14 00:36:09.811257 sshd[1719]: Accepted publickey for core from 20.161.92.111 port 49454 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:36:09.812198 sshd[1719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:36:09.818862 systemd-logind[1491]: New session 9 of user core.
Mar 14 00:36:09.825723 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 14 00:36:10.130079 sudo[1722]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 14 00:36:10.130585 sudo[1722]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 14 00:36:10.148015 sudo[1722]: pam_unix(sudo:session): session closed for user root
Mar 14 00:36:10.236535 sshd[1719]: pam_unix(sshd:session): session closed for user core
Mar 14 00:36:10.241031 systemd-logind[1491]: Session 9 logged out. Waiting for processes to exit.
Mar 14 00:36:10.241818 systemd[1]: sshd@6-10.230.50.222:22-20.161.92.111:49454.service: Deactivated successfully.
Mar 14 00:36:10.243924 systemd[1]: session-9.scope: Deactivated successfully.
Mar 14 00:36:10.245854 systemd-logind[1491]: Removed session 9.
Mar 14 00:36:10.337514 systemd[1]: Started sshd@7-10.230.50.222:22-20.161.92.111:36686.service - OpenSSH per-connection server daemon (20.161.92.111:36686).
Mar 14 00:36:10.904807 sshd[1727]: Accepted publickey for core from 20.161.92.111 port 36686 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:36:10.907528 sshd[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:36:10.913797 systemd-logind[1491]: New session 10 of user core.
Mar 14 00:36:10.930006 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 14 00:36:11.219090 sudo[1731]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 14 00:36:11.220465 sudo[1731]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 14 00:36:11.225993 sudo[1731]: pam_unix(sudo:session): session closed for user root
Mar 14 00:36:11.233674 sudo[1730]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 14 00:36:11.234100 sudo[1730]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 14 00:36:11.253126 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 14 00:36:11.256823 auditctl[1734]: No rules
Mar 14 00:36:11.257637 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 14 00:36:11.257932 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 14 00:36:11.264952 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 14 00:36:11.298637 augenrules[1752]: No rules
Mar 14 00:36:11.299550 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 14 00:36:11.301345 sudo[1730]: pam_unix(sudo:session): session closed for user root
Mar 14 00:36:11.390072 sshd[1727]: pam_unix(sshd:session): session closed for user core
Mar 14 00:36:11.394133 systemd[1]: sshd@7-10.230.50.222:22-20.161.92.111:36686.service: Deactivated successfully.
Mar 14 00:36:11.396836 systemd[1]: session-10.scope: Deactivated successfully.
Mar 14 00:36:11.398753 systemd-logind[1491]: Session 10 logged out. Waiting for processes to exit.
Mar 14 00:36:11.400124 systemd-logind[1491]: Removed session 10.
Mar 14 00:36:11.495900 systemd[1]: Started sshd@8-10.230.50.222:22-20.161.92.111:36700.service - OpenSSH per-connection server daemon (20.161.92.111:36700).
Mar 14 00:36:12.045824 sshd[1760]: Accepted publickey for core from 20.161.92.111 port 36700 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:36:12.048008 sshd[1760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:36:12.054907 systemd-logind[1491]: New session 11 of user core.
Mar 14 00:36:12.071727 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 14 00:36:12.356175 sudo[1763]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 14 00:36:12.356778 sudo[1763]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 14 00:36:12.806800 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 14 00:36:12.807032 (dockerd)[1779]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 14 00:36:13.236500 dockerd[1779]: time="2026-03-14T00:36:13.230610798Z" level=info msg="Starting up"
Mar 14 00:36:13.401520 dockerd[1779]: time="2026-03-14T00:36:13.401281447Z" level=info msg="Loading containers: start."
Mar 14 00:36:13.543978 kernel: Initializing XFRM netlink socket
Mar 14 00:36:13.581835 systemd-timesyncd[1425]: Network configuration changed, trying to establish connection.
Mar 14 00:36:13.643288 systemd-networkd[1436]: docker0: Link UP
Mar 14 00:36:13.659509 dockerd[1779]: time="2026-03-14T00:36:13.659184348Z" level=info msg="Loading containers: done."
Mar 14 00:36:13.685051 dockerd[1779]: time="2026-03-14T00:36:13.683325339Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 14 00:36:13.685051 dockerd[1779]: time="2026-03-14T00:36:13.683511001Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 14 00:36:13.685051 dockerd[1779]: time="2026-03-14T00:36:13.683677224Z" level=info msg="Daemon has completed initialization"
Mar 14 00:36:13.684467 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3282599087-merged.mount: Deactivated successfully.
Mar 14 00:36:13.721131 dockerd[1779]: time="2026-03-14T00:36:13.721024896Z" level=info msg="API listen on /run/docker.sock"
Mar 14 00:36:13.723466 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 14 00:36:14.367166 containerd[1504]: time="2026-03-14T00:36:14.366526976Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\""
Mar 14 00:36:14.915961 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 14 00:36:14.923758 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 14 00:36:15.111270 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:15.122898 (kubelet)[1930]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 14 00:36:15.200227 kubelet[1930]: E0314 00:36:15.170924 1930 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 14 00:36:15.172836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 14 00:36:15.173073 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 14 00:36:15.459603 systemd-timesyncd[1425]: Contacted time server [2a02:ac00:2:1::5]:123 (2.flatcar.pool.ntp.org).
Mar 14 00:36:15.459719 systemd-timesyncd[1425]: Initial clock synchronization to Sat 2026-03-14 00:36:15.529685 UTC.
Mar 14 00:36:15.611081 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1840091619.mount: Deactivated successfully.
Mar 14 00:36:17.360815 containerd[1504]: time="2026-03-14T00:36:17.359176070Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:17.360815 containerd[1504]: time="2026-03-14T00:36:17.360686234Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=27696475"
Mar 14 00:36:17.362309 containerd[1504]: time="2026-03-14T00:36:17.362275619Z" level=info msg="ImageCreate event name:\"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:17.366427 containerd[1504]: time="2026-03-14T00:36:17.366370963Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:17.371362 containerd[1504]: time="2026-03-14T00:36:17.371313454Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"27693066\" in 3.004650589s"
Mar 14 00:36:17.371527 containerd[1504]: time="2026-03-14T00:36:17.371499249Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\""
Mar 14 00:36:17.387597 containerd[1504]: time="2026-03-14T00:36:17.387528174Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\""
Mar 14 00:36:19.509489 containerd[1504]: time="2026-03-14T00:36:19.509374398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:19.511081 containerd[1504]: time="2026-03-14T00:36:19.510959680Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=21450708"
Mar 14 00:36:19.512478 containerd[1504]: time="2026-03-14T00:36:19.511811515Z" level=info msg="ImageCreate event name:\"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:19.517412 containerd[1504]: time="2026-03-14T00:36:19.517377708Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:19.518909 containerd[1504]: time="2026-03-14T00:36:19.518869070Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"23142311\" in 2.13092502s"
Mar 14 00:36:19.519002 containerd[1504]: time="2026-03-14T00:36:19.518916222Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\""
Mar 14 00:36:19.519760 containerd[1504]: time="2026-03-14T00:36:19.519723840Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\""
Mar 14 00:36:20.865496 containerd[1504]: time="2026-03-14T00:36:20.864416594Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:20.866139 containerd[1504]: time="2026-03-14T00:36:20.866010539Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=15548437"
Mar 14 00:36:20.867486 containerd[1504]: time="2026-03-14T00:36:20.866911202Z" level=info msg="ImageCreate event name:\"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:20.872028 containerd[1504]: time="2026-03-14T00:36:20.871961895Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:20.874313 containerd[1504]: time="2026-03-14T00:36:20.873782094Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"17240058\" in 1.353220568s"
Mar 14 00:36:20.874313 containerd[1504]: time="2026-03-14T00:36:20.873829070Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\""
Mar 14 00:36:20.874847 containerd[1504]: time="2026-03-14T00:36:20.874657434Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\""
Mar 14 00:36:22.567547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1671397214.mount: Deactivated successfully.
Mar 14 00:36:23.065595 containerd[1504]: time="2026-03-14T00:36:23.064219066Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:23.066662 containerd[1504]: time="2026-03-14T00:36:23.066606450Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=25685320"
Mar 14 00:36:23.067835 containerd[1504]: time="2026-03-14T00:36:23.067797846Z" level=info msg="ImageCreate event name:\"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:23.070800 containerd[1504]: time="2026-03-14T00:36:23.070759137Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:23.071955 containerd[1504]: time="2026-03-14T00:36:23.071917613Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"25684331\" in 2.197220886s"
Mar 14 00:36:23.072084 containerd[1504]: time="2026-03-14T00:36:23.072056541Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\""
Mar 14 00:36:23.073120 containerd[1504]: time="2026-03-14T00:36:23.072882536Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\""
Mar 14 00:36:23.680930 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2178478555.mount: Deactivated successfully.
Mar 14 00:36:23.810987 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Mar 14 00:36:25.171737 containerd[1504]: time="2026-03-14T00:36:25.170048271Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:25.171737 containerd[1504]: time="2026-03-14T00:36:25.171234028Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=23556550"
Mar 14 00:36:25.173209 containerd[1504]: time="2026-03-14T00:36:25.173168233Z" level=info msg="ImageCreate event name:\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:25.178192 containerd[1504]: time="2026-03-14T00:36:25.178045145Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:25.183517 containerd[1504]: time="2026-03-14T00:36:25.182591869Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"23553139\" in 2.10965899s"
Mar 14 00:36:25.183517 containerd[1504]: time="2026-03-14T00:36:25.182648666Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\""
Mar 14 00:36:25.184364 containerd[1504]: time="2026-03-14T00:36:25.184265947Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 14 00:36:25.423822 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 14 00:36:25.431007 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 14 00:36:25.699681 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:25.703859 (kubelet)[2079]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 14 00:36:25.775169 kubelet[2079]: E0314 00:36:25.775082 2079 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 14 00:36:25.778659 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 14 00:36:25.778912 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 14 00:36:25.944017 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2156199089.mount: Deactivated successfully.
Mar 14 00:36:25.954173 containerd[1504]: time="2026-03-14T00:36:25.953874717Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:25.955377 containerd[1504]: time="2026-03-14T00:36:25.955123574Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321226"
Mar 14 00:36:25.956178 containerd[1504]: time="2026-03-14T00:36:25.956137457Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:25.962188 containerd[1504]: time="2026-03-14T00:36:25.960915990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:25.962188 containerd[1504]: time="2026-03-14T00:36:25.961670419Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 777.339735ms"
Mar 14 00:36:25.962188 containerd[1504]: time="2026-03-14T00:36:25.961728798Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\""
Mar 14 00:36:25.962772 containerd[1504]: time="2026-03-14T00:36:25.962730255Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
Mar 14 00:36:26.848155 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1010028467.mount: Deactivated successfully.
Mar 14 00:36:27.987950 containerd[1504]: time="2026-03-14T00:36:27.987840132Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:27.989815 containerd[1504]: time="2026-03-14T00:36:27.989477577Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=23630330"
Mar 14 00:36:27.992473 containerd[1504]: time="2026-03-14T00:36:27.990779171Z" level=info msg="ImageCreate event name:\"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:27.994661 containerd[1504]: time="2026-03-14T00:36:27.994625672Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:27.996276 containerd[1504]: time="2026-03-14T00:36:27.996225035Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"23641797\" in 2.033310265s"
Mar 14 00:36:27.996378 containerd[1504]: time="2026-03-14T00:36:27.996281540Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\""
Mar 14 00:36:28.525426 systemd[1]: Started sshd@9-10.230.50.222:22-185.156.73.233:35794.service - OpenSSH per-connection server daemon (185.156.73.233:35794).
Mar 14 00:36:29.481924 sshd[2171]: Invalid user ubnt from 185.156.73.233 port 35794
Mar 14 00:36:29.581515 sshd[2171]: Connection closed by invalid user ubnt 185.156.73.233 port 35794 [preauth]
Mar 14 00:36:29.583447 systemd[1]: sshd@9-10.230.50.222:22-185.156.73.233:35794.service: Deactivated successfully.
Mar 14 00:36:29.834737 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:29.850960 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 14 00:36:29.894562 systemd[1]: Reloading requested from client PID 2182 ('systemctl') (unit session-11.scope)...
Mar 14 00:36:29.894606 systemd[1]: Reloading...
Mar 14 00:36:30.058490 zram_generator::config[2221]: No configuration found.
Mar 14 00:36:30.243058 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 14 00:36:30.352046 systemd[1]: Reloading finished in 456 ms.
Mar 14 00:36:30.441820 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 14 00:36:30.441969 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 14 00:36:30.442826 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:30.449290 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 14 00:36:30.720938 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:30.736240 (kubelet)[2289]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 14 00:36:30.799967 kubelet[2289]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 00:36:31.170676 kubelet[2289]: I0314 00:36:31.170564 2289 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Mar 14 00:36:31.170676 kubelet[2289]: I0314 00:36:31.170645 2289 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 14 00:36:31.171967 kubelet[2289]: I0314 00:36:31.171909 2289 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 14 00:36:31.171967 kubelet[2289]: I0314 00:36:31.171933 2289 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 14 00:36:31.172354 kubelet[2289]: I0314 00:36:31.172320 2289 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 14 00:36:31.182512 kubelet[2289]: E0314 00:36:31.181761 2289 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.230.50.222:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.50.222:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 14 00:36:31.184482 kubelet[2289]: I0314 00:36:31.183538 2289 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 14 00:36:31.191318 kubelet[2289]: E0314 00:36:31.191277 2289 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 14 00:36:31.191569 kubelet[2289]: I0314 00:36:31.191549 2289 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 14 00:36:31.198750 kubelet[2289]: I0314 00:36:31.198720 2289 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 14 00:36:31.199928 kubelet[2289]: I0314 00:36:31.199860 2289 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 14 00:36:31.200267 kubelet[2289]: I0314 00:36:31.199918 2289 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-zkxct.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 14 00:36:31.200548 kubelet[2289]: I0314 00:36:31.200278 2289 topology_manager.go:143] "Creating topology manager with none policy"
Mar 14 00:36:31.200548 kubelet[2289]: I0314 00:36:31.200294 2289 container_manager_linux.go:308] "Creating device plugin manager"
Mar 14 00:36:31.200548 kubelet[2289]: I0314 00:36:31.200487 2289 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 14 00:36:31.202961 kubelet[2289]: I0314 00:36:31.202929 2289 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 14 00:36:31.203337 kubelet[2289]: I0314 00:36:31.203317 2289 kubelet.go:482] "Attempting to sync node with API server"
Mar 14 00:36:31.203407 kubelet[2289]: I0314 00:36:31.203347 2289 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 14 00:36:31.203407 kubelet[2289]: I0314 00:36:31.203405 2289 kubelet.go:394] "Adding apiserver pod source"
Mar 14 00:36:31.205488 kubelet[2289]: I0314 00:36:31.203431 2289 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 14 00:36:31.207385 kubelet[2289]: I0314 00:36:31.207337 2289 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 14 00:36:31.210162 kubelet[2289]: I0314 00:36:31.209875 2289 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 14 00:36:31.210162 kubelet[2289]: I0314 00:36:31.209925 2289 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 14 00:36:31.210162 kubelet[2289]: W0314 00:36:31.210030 2289 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 14 00:36:31.215036 kubelet[2289]: I0314 00:36:31.215015 2289 server.go:1257] "Started kubelet"
Mar 14 00:36:31.215506 kubelet[2289]: I0314 00:36:31.215212 2289 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 14 00:36:31.215747 kubelet[2289]: I0314 00:36:31.215672 2289 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 14 00:36:31.215840 kubelet[2289]: I0314 00:36:31.215798 2289 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 14 00:36:31.217481 kubelet[2289]: I0314 00:36:31.216286 2289 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 14 00:36:31.217481 kubelet[2289]: I0314 00:36:31.216996 2289 server.go:317] "Adding debug handlers to kubelet server"
Mar 14 00:36:31.221115 kubelet[2289]: I0314 00:36:31.221083 2289 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 14 00:36:31.227610 kubelet[2289]: E0314 00:36:31.225721 2289 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.50.222:6443/api/v1/namespaces/default/events\": dial tcp 10.230.50.222:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-zkxct.gb1.brightbox.com.189c8e18fac8d830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-zkxct.gb1.brightbox.com,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-zkxct.gb1.brightbox.com,},FirstTimestamp:2026-03-14 00:36:31.21495864 +0000 UTC m=+0.473366234,LastTimestamp:2026-03-14 00:36:31.21495864 +0000 UTC m=+0.473366234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-zkxct.gb1.brightbox.com,}"
Mar 14 00:36:31.228415 kubelet[2289]: I0314 00:36:31.228384 2289 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 14 00:36:31.233505 kubelet[2289]: I0314 00:36:31.233480 2289 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 14 00:36:31.234235 kubelet[2289]: E0314 00:36:31.234177 2289 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"srv-zkxct.gb1.brightbox.com\" not found"
Mar 14 00:36:31.235205 kubelet[2289]: I0314 00:36:31.235182 2289 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 14 00:36:31.235639 kubelet[2289]: I0314 00:36:31.235619 2289 reconciler.go:29] "Reconciler: start to sync state"
Mar 14 00:36:31.239344 kubelet[2289]: E0314 00:36:31.239290 2289 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.50.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zkxct.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.50.222:6443: connect: connection refused" interval="200ms"
Mar 14 00:36:31.239740 kubelet[2289]: I0314 00:36:31.239710 2289 factory.go:223] Registration of the systemd container factory successfully
Mar 14 00:36:31.240628 kubelet[2289]: I0314 00:36:31.240597 2289 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 14 00:36:31.244000 kubelet[2289]: E0314 00:36:31.243968 2289 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 14 00:36:31.247077 kubelet[2289]: I0314 00:36:31.247028 2289 factory.go:223] Registration of the containerd container factory successfully
Mar 14 00:36:31.254423 kubelet[2289]: I0314 00:36:31.254371 2289 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 14 00:36:31.256645 kubelet[2289]: I0314 00:36:31.256041 2289 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 14 00:36:31.256645 kubelet[2289]: I0314 00:36:31.256094 2289 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 14 00:36:31.256645 kubelet[2289]: I0314 00:36:31.256147 2289 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 14 00:36:31.256645 kubelet[2289]: E0314 00:36:31.256267 2289 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 14 00:36:31.296997 kubelet[2289]: I0314 00:36:31.296965 2289 cpu_manager.go:225] "Starting" policy="none"
Mar 14 00:36:31.297233 kubelet[2289]: I0314 00:36:31.297204 2289 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 14 00:36:31.297328 kubelet[2289]: I0314 00:36:31.297311 2289 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 14 00:36:31.299997 kubelet[2289]: I0314 00:36:31.299974 2289 policy_none.go:50] "Start"
Mar 14 00:36:31.300135 kubelet[2289]: I0314 00:36:31.300115 2289 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 14 00:36:31.300255 kubelet[2289]: I0314 00:36:31.300228 2289 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 14 00:36:31.302545 kubelet[2289]: I0314 00:36:31.301581 2289 policy_none.go:44] "Start"
Mar 14 00:36:31.308437 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 14 00:36:31.326580 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 14 00:36:31.331583 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 14 00:36:31.335443 kubelet[2289]: E0314 00:36:31.335414 2289 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"srv-zkxct.gb1.brightbox.com\" not found"
Mar 14 00:36:31.339181 kubelet[2289]: E0314 00:36:31.339125 2289 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 14 00:36:31.342459 kubelet[2289]: I0314 00:36:31.342440 2289 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 14 00:36:31.342764 kubelet[2289]: I0314 00:36:31.342606 2289 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 14 00:36:31.344381 kubelet[2289]: I0314 00:36:31.344340 2289 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 14 00:36:31.346283 kubelet[2289]: E0314 00:36:31.346139 2289 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 14 00:36:31.346283 kubelet[2289]: E0314 00:36:31.346261 2289 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-zkxct.gb1.brightbox.com\" not found"
Mar 14 00:36:31.380520 systemd[1]: Created slice kubepods-burstable-pod5cfdb1c58853a426e774b9b953b8dafb.slice - libcontainer container kubepods-burstable-pod5cfdb1c58853a426e774b9b953b8dafb.slice.
Mar 14 00:36:31.391104 kubelet[2289]: E0314 00:36:31.390726 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.396372 systemd[1]: Created slice kubepods-burstable-podd2759cf7ac9ef1235d06408beff94329.slice - libcontainer container kubepods-burstable-podd2759cf7ac9ef1235d06408beff94329.slice.
Mar 14 00:36:31.405787 kubelet[2289]: E0314 00:36:31.405526 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.409949 systemd[1]: Created slice kubepods-burstable-podc3940fbe4432c8d709a6c465f6b5a5b9.slice - libcontainer container kubepods-burstable-podc3940fbe4432c8d709a6c465f6b5a5b9.slice.
Mar 14 00:36:31.413048 kubelet[2289]: E0314 00:36:31.413006 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.436253 kubelet[2289]: I0314 00:36:31.436112 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c3940fbe4432c8d709a6c465f6b5a5b9-usr-share-ca-certificates\") pod \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" (UID: \"c3940fbe4432c8d709a6c465f6b5a5b9\") " pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.436764 kubelet[2289]: I0314 00:36:31.436543 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-ca-certs\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.436764 kubelet[2289]: I0314 00:36:31.436633 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-flexvolume-dir\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.437192 kubelet[2289]: I0314 00:36:31.436710 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-k8s-certs\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.437192 kubelet[2289]: I0314 00:36:31.437063 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.437192 kubelet[2289]: I0314 00:36:31.437095 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c3940fbe4432c8d709a6c465f6b5a5b9-ca-certs\") pod \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" (UID: \"c3940fbe4432c8d709a6c465f6b5a5b9\") " pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.437192 kubelet[2289]: I0314 00:36:31.437160 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c3940fbe4432c8d709a6c465f6b5a5b9-k8s-certs\") pod \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" (UID: \"c3940fbe4432c8d709a6c465f6b5a5b9\") " pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.437800 kubelet[2289]: I0314 00:36:31.437580 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-kubeconfig\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.437800 kubelet[2289]: I0314 00:36:31.437711 2289 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d2759cf7ac9ef1235d06408beff94329-kubeconfig\") pod \"kube-scheduler-srv-zkxct.gb1.brightbox.com\" (UID: \"d2759cf7ac9ef1235d06408beff94329\") " pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.440648 kubelet[2289]: E0314 00:36:31.440598 2289 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.50.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zkxct.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.50.222:6443: connect: connection refused" interval="400ms"
Mar 14 00:36:31.446940 kubelet[2289]: I0314 00:36:31.446429 2289 kubelet_node_status.go:74] "Attempting to register node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.446940 kubelet[2289]: E0314 00:36:31.446858 2289 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.230.50.222:6443/api/v1/nodes\": dial tcp 10.230.50.222:6443: connect: connection refused" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.650875 kubelet[2289]: I0314 00:36:31.650837 2289 kubelet_node_status.go:74] "Attempting to register node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.651622 kubelet[2289]: E0314 00:36:31.651591 2289 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.230.50.222:6443/api/v1/nodes\": dial tcp 10.230.50.222:6443: connect: connection refused" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:31.696102 containerd[1504]: time="2026-03-14T00:36:31.695950260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-zkxct.gb1.brightbox.com,Uid:5cfdb1c58853a426e774b9b953b8dafb,Namespace:kube-system,Attempt:0,}"
Mar 14 00:36:31.719957 containerd[1504]: time="2026-03-14T00:36:31.719905707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-zkxct.gb1.brightbox.com,Uid:d2759cf7ac9ef1235d06408beff94329,Namespace:kube-system,Attempt:0,}"
Mar 14 00:36:31.720917 containerd[1504]: time="2026-03-14T00:36:31.720779229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-zkxct.gb1.brightbox.com,Uid:c3940fbe4432c8d709a6c465f6b5a5b9,Namespace:kube-system,Attempt:0,}"
Mar 14 00:36:31.841917 kubelet[2289]: E0314 00:36:31.841763 2289 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.50.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zkxct.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.50.222:6443: connect: connection refused" interval="800ms"
Mar 14 00:36:32.055108 kubelet[2289]: I0314 00:36:32.054943 2289 kubelet_node_status.go:74] "Attempting to register node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:32.055510 kubelet[2289]: E0314 00:36:32.055476 2289 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.230.50.222:6443/api/v1/nodes\": dial tcp 10.230.50.222:6443: connect: connection refused" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:32.429847 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1580102552.mount: Deactivated successfully.
Mar 14 00:36:32.439547 containerd[1504]: time="2026-03-14T00:36:32.438749059Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 14 00:36:32.441597 containerd[1504]: time="2026-03-14T00:36:32.441557242Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 14 00:36:32.442402 containerd[1504]: time="2026-03-14T00:36:32.442355915Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 14 00:36:32.443440 containerd[1504]: time="2026-03-14T00:36:32.443384632Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 14 00:36:32.444681 containerd[1504]: time="2026-03-14T00:36:32.444558325Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Mar 14 00:36:32.445431 containerd[1504]: time="2026-03-14T00:36:32.445394426Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 14 00:36:32.445613 containerd[1504]: time="2026-03-14T00:36:32.445584628Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 14 00:36:32.449154 containerd[1504]: time="2026-03-14T00:36:32.448658248Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 14 00:36:32.455759 containerd[1504]: time="2026-03-14T00:36:32.455715393Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 734.724357ms"
Mar 14 00:36:32.458722 containerd[1504]: time="2026-03-14T00:36:32.458684517Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 738.633232ms"
Mar 14 00:36:32.460867 containerd[1504]: time="2026-03-14T00:36:32.460835080Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 764.76051ms"
Mar 14 00:36:32.645612 kubelet[2289]: E0314 00:36:32.645488 2289 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.50.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zkxct.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.50.222:6443: connect: connection refused" interval="1.6s"
Mar 14 00:36:32.704406 containerd[1504]: time="2026-03-14T00:36:32.702508693Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:32.704406 containerd[1504]: time="2026-03-14T00:36:32.702641151Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:32.704406 containerd[1504]: time="2026-03-14T00:36:32.702665892Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:32.704406 containerd[1504]: time="2026-03-14T00:36:32.702786402Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:32.708805 containerd[1504]: time="2026-03-14T00:36:32.708507175Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:32.708805 containerd[1504]: time="2026-03-14T00:36:32.708605946Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:32.708805 containerd[1504]: time="2026-03-14T00:36:32.708624654Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:32.708805 containerd[1504]: time="2026-03-14T00:36:32.708730356Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:32.714514 containerd[1504]: time="2026-03-14T00:36:32.714363721Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:32.714514 containerd[1504]: time="2026-03-14T00:36:32.714427779Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:32.714514 containerd[1504]: time="2026-03-14T00:36:32.714444627Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:32.715314 containerd[1504]: time="2026-03-14T00:36:32.714959872Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:32.757713 systemd[1]: Started cri-containerd-21d245877f3a58074875bcaeb1a3329cb50775e63f3b27d106a4dbd771d41a61.scope - libcontainer container 21d245877f3a58074875bcaeb1a3329cb50775e63f3b27d106a4dbd771d41a61.
Mar 14 00:36:32.762113 systemd[1]: Started cri-containerd-68f18d6fd1bba5d564aeeaf3ed6d0434c5e6f107e1bc5783fc84892b8088cb1f.scope - libcontainer container 68f18d6fd1bba5d564aeeaf3ed6d0434c5e6f107e1bc5783fc84892b8088cb1f.
Mar 14 00:36:32.773042 systemd[1]: Started cri-containerd-653cccf99941013a5ac986ae1815ff677effc1ca03fd031c74470237b0bcf053.scope - libcontainer container 653cccf99941013a5ac986ae1815ff677effc1ca03fd031c74470237b0bcf053.
Mar 14 00:36:32.863130 kubelet[2289]: I0314 00:36:32.861772 2289 kubelet_node_status.go:74] "Attempting to register node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:32.863130 kubelet[2289]: E0314 00:36:32.862242 2289 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.230.50.222:6443/api/v1/nodes\": dial tcp 10.230.50.222:6443: connect: connection refused" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:32.889490 containerd[1504]: time="2026-03-14T00:36:32.888353126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-zkxct.gb1.brightbox.com,Uid:c3940fbe4432c8d709a6c465f6b5a5b9,Namespace:kube-system,Attempt:0,} returns sandbox id \"21d245877f3a58074875bcaeb1a3329cb50775e63f3b27d106a4dbd771d41a61\""
Mar 14 00:36:32.900301 containerd[1504]: time="2026-03-14T00:36:32.900261484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-zkxct.gb1.brightbox.com,Uid:5cfdb1c58853a426e774b9b953b8dafb,Namespace:kube-system,Attempt:0,} returns sandbox id \"68f18d6fd1bba5d564aeeaf3ed6d0434c5e6f107e1bc5783fc84892b8088cb1f\""
Mar 14 00:36:32.916628 containerd[1504]: time="2026-03-14T00:36:32.916580035Z" level=info msg="CreateContainer within sandbox \"21d245877f3a58074875bcaeb1a3329cb50775e63f3b27d106a4dbd771d41a61\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 14 00:36:32.917771 containerd[1504]: time="2026-03-14T00:36:32.917733910Z" level=info msg="CreateContainer within sandbox \"68f18d6fd1bba5d564aeeaf3ed6d0434c5e6f107e1bc5783fc84892b8088cb1f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 14 00:36:32.921667 containerd[1504]: time="2026-03-14T00:36:32.921633046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-zkxct.gb1.brightbox.com,Uid:d2759cf7ac9ef1235d06408beff94329,Namespace:kube-system,Attempt:0,} returns sandbox id \"653cccf99941013a5ac986ae1815ff677effc1ca03fd031c74470237b0bcf053\""
Mar 14 00:36:32.928548 containerd[1504]: time="2026-03-14T00:36:32.928507935Z" level=info msg="CreateContainer within sandbox \"653cccf99941013a5ac986ae1815ff677effc1ca03fd031c74470237b0bcf053\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 14 00:36:32.965705 containerd[1504]: time="2026-03-14T00:36:32.965459419Z" level=info msg="CreateContainer within sandbox \"68f18d6fd1bba5d564aeeaf3ed6d0434c5e6f107e1bc5783fc84892b8088cb1f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4e07bb964cadaebc6b998e897d4ebd20385320959c5bc785f91647ca77f53736\""
Mar 14 00:36:32.967199 containerd[1504]: time="2026-03-14T00:36:32.967087138Z" level=info msg="CreateContainer within sandbox \"21d245877f3a58074875bcaeb1a3329cb50775e63f3b27d106a4dbd771d41a61\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8c686a2e12694250c2c73c91f1d861e7d384421a1c7b3fcb9ed558d4fd966cb0\""
Mar 14 00:36:32.967448 containerd[1504]: time="2026-03-14T00:36:32.967417622Z" level=info msg="StartContainer for \"4e07bb964cadaebc6b998e897d4ebd20385320959c5bc785f91647ca77f53736\""
Mar 14 00:36:32.968833 containerd[1504]: time="2026-03-14T00:36:32.968788665Z" level=info msg="CreateContainer within sandbox \"653cccf99941013a5ac986ae1815ff677effc1ca03fd031c74470237b0bcf053\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"70c7bde2b9b9c1f778b57f7a896730bbc323b7d944b6f251dcf912b4e8703e82\""
Mar 14 00:36:32.969223 containerd[1504]: time="2026-03-14T00:36:32.968990353Z" level=info msg="StartContainer for \"8c686a2e12694250c2c73c91f1d861e7d384421a1c7b3fcb9ed558d4fd966cb0\""
Mar 14 00:36:32.972774 containerd[1504]: time="2026-03-14T00:36:32.972716446Z" level=info msg="StartContainer for \"70c7bde2b9b9c1f778b57f7a896730bbc323b7d944b6f251dcf912b4e8703e82\""
Mar 14 00:36:33.018709 systemd[1]: Started cri-containerd-4e07bb964cadaebc6b998e897d4ebd20385320959c5bc785f91647ca77f53736.scope - libcontainer container 4e07bb964cadaebc6b998e897d4ebd20385320959c5bc785f91647ca77f53736.
Mar 14 00:36:33.029675 systemd[1]: Started cri-containerd-8c686a2e12694250c2c73c91f1d861e7d384421a1c7b3fcb9ed558d4fd966cb0.scope - libcontainer container 8c686a2e12694250c2c73c91f1d861e7d384421a1c7b3fcb9ed558d4fd966cb0.
Mar 14 00:36:33.053671 systemd[1]: Started cri-containerd-70c7bde2b9b9c1f778b57f7a896730bbc323b7d944b6f251dcf912b4e8703e82.scope - libcontainer container 70c7bde2b9b9c1f778b57f7a896730bbc323b7d944b6f251dcf912b4e8703e82.
Mar 14 00:36:33.126447 containerd[1504]: time="2026-03-14T00:36:33.126032761Z" level=info msg="StartContainer for \"8c686a2e12694250c2c73c91f1d861e7d384421a1c7b3fcb9ed558d4fd966cb0\" returns successfully"
Mar 14 00:36:33.134663 containerd[1504]: time="2026-03-14T00:36:33.134422348Z" level=info msg="StartContainer for \"4e07bb964cadaebc6b998e897d4ebd20385320959c5bc785f91647ca77f53736\" returns successfully"
Mar 14 00:36:33.186591 containerd[1504]: time="2026-03-14T00:36:33.185924303Z" level=info msg="StartContainer for \"70c7bde2b9b9c1f778b57f7a896730bbc323b7d944b6f251dcf912b4e8703e82\" returns successfully"
Mar 14 00:36:33.301621 kubelet[2289]: E0314 00:36:33.301437 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:33.302003 kubelet[2289]: E0314 00:36:33.301963 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:33.308370 kubelet[2289]: E0314 00:36:33.307229 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:34.311954 kubelet[2289]: E0314 00:36:34.311908 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:34.313943 kubelet[2289]: E0314 00:36:34.312753 2289 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:34.465908 kubelet[2289]: I0314 00:36:34.465827 2289 kubelet_node_status.go:74] "Attempting to register node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.062602 kubelet[2289]: E0314 00:36:35.062532 2289 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-zkxct.gb1.brightbox.com\" not found" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.163299 kubelet[2289]: I0314 00:36:35.162934 2289 kubelet_node_status.go:77] "Successfully registered node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.207987 kubelet[2289]: I0314 00:36:35.207914 2289 apiserver.go:52] "Watching apiserver"
Mar 14 00:36:35.235447 kubelet[2289]: I0314 00:36:35.235382 2289 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.236067 kubelet[2289]: I0314 00:36:35.236032 2289 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 14 00:36:35.253977 kubelet[2289]: E0314 00:36:35.253923 2289 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.253977 kubelet[2289]: I0314 00:36:35.253980 2289 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.266881 kubelet[2289]: E0314 00:36:35.266638 2289 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-zkxct.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.266881 kubelet[2289]: I0314 00:36:35.266679 2289 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.270023 kubelet[2289]: E0314 00:36:35.269905 2289 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.311892 kubelet[2289]: I0314 00:36:35.311847 2289 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:35.316641 kubelet[2289]: E0314 00:36:35.316514 2289 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:36.333542 kubelet[2289]: I0314 00:36:36.333348 2289 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:36.344532 kubelet[2289]: I0314 00:36:36.343507 2289 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 14 00:36:37.202230 update_engine[1493]: I20260314 00:36:37.201968 1493 update_attempter.cc:509] Updating boot flags...
Mar 14 00:36:37.264780 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 34 scanned by (udev-worker) (2583)
Mar 14 00:36:37.400564 systemd[1]: Reloading requested from client PID 2590 ('systemctl') (unit session-11.scope)...
Mar 14 00:36:37.400607 systemd[1]: Reloading...
Mar 14 00:36:37.423549 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 34 scanned by (udev-worker) (2581)
Mar 14 00:36:37.535810 zram_generator::config[2627]: No configuration found.
Mar 14 00:36:37.549415 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 34 scanned by (udev-worker) (2581)
Mar 14 00:36:37.799930 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 14 00:36:37.931178 systemd[1]: Reloading finished in 529 ms.
Mar 14 00:36:38.041357 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 14 00:36:38.072954 systemd[1]: kubelet.service: Deactivated successfully.
Mar 14 00:36:38.073482 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:38.073615 systemd[1]: kubelet.service: Consumed 1.052s CPU time, 122.0M memory peak, 0B memory swap peak.
Mar 14 00:36:38.088030 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 14 00:36:38.350728 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 14 00:36:38.359075 (kubelet)[2695]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 14 00:36:38.460297 kubelet[2695]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 14 00:36:38.472288 kubelet[2695]: I0314 00:36:38.472237 2695 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Mar 14 00:36:38.473152 kubelet[2695]: I0314 00:36:38.472482 2695 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 14 00:36:38.473152 kubelet[2695]: I0314 00:36:38.472600 2695 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 14 00:36:38.473152 kubelet[2695]: I0314 00:36:38.472628 2695 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 14 00:36:38.473596 kubelet[2695]: I0314 00:36:38.473574 2695 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 14 00:36:38.476636 kubelet[2695]: I0314 00:36:38.476532 2695 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 14 00:36:38.482496 kubelet[2695]: I0314 00:36:38.482039 2695 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 14 00:36:38.487727 kubelet[2695]: E0314 00:36:38.487695 2695 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 14 00:36:38.489445 kubelet[2695]: I0314 00:36:38.487993 2695 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Mar 14 00:36:38.496166 kubelet[2695]: I0314 00:36:38.496129 2695 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 14 00:36:38.499113 kubelet[2695]: I0314 00:36:38.499042 2695 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 14 00:36:38.499310 kubelet[2695]: I0314 00:36:38.499094 2695 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-zkxct.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 14 00:36:38.499310 kubelet[2695]: I0314 00:36:38.499307 2695 topology_manager.go:143] "Creating topology manager with none policy"
Mar 14 00:36:38.499647 kubelet[2695]: I0314 00:36:38.499320 2695 container_manager_linux.go:308] "Creating device plugin manager"
Mar 14 00:36:38.499647 kubelet[2695]: I0314 00:36:38.499350 2695 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 14 00:36:38.499752 kubelet[2695]: I0314 00:36:38.499683 2695 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 14 00:36:38.500844 kubelet[2695]: I0314 00:36:38.499992 2695 kubelet.go:482] "Attempting to sync node with API server"
Mar 14 00:36:38.500844 kubelet[2695]: I0314 00:36:38.500015 2695 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 14 00:36:38.501685 kubelet[2695]: I0314 00:36:38.501550 2695 kubelet.go:394] "Adding apiserver pod source"
Mar 14 00:36:38.501685 kubelet[2695]: I0314 00:36:38.501575 2695 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 14 00:36:38.517062 kubelet[2695]: I0314 00:36:38.516621 2695 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 14 00:36:38.520386 kubelet[2695]: I0314 00:36:38.520339 2695 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 14 00:36:38.520506 kubelet[2695]: I0314 00:36:38.520395 2695 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 14 00:36:38.541638 kubelet[2695]: I0314 00:36:38.541152 2695 server.go:1257] "Started kubelet"
Mar 14 00:36:38.548115 kubelet[2695]: I0314 00:36:38.547334 2695 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 14 00:36:38.550885 kubelet[2695]: I0314 00:36:38.550860 2695 server.go:317] "Adding debug handlers to kubelet server"
Mar 14 00:36:38.553485 kubelet[2695]: I0314 00:36:38.552817 2695 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 14 00:36:38.559800 kubelet[2695]: I0314 00:36:38.559209 2695 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 14 00:36:38.559800 kubelet[2695]: I0314 00:36:38.559595 2695 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 14 00:36:38.560934 kubelet[2695]: I0314 00:36:38.560275 2695 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 14 00:36:38.569369 kubelet[2695]: I0314 00:36:38.569301 2695 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 14 00:36:38.569889 kubelet[2695]: I0314 00:36:38.569844 2695 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 14 00:36:38.572806 kubelet[2695]: I0314 00:36:38.572722 2695 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 14 00:36:38.573421 kubelet[2695]: I0314 00:36:38.573053 2695 reconciler.go:29] "Reconciler: start to sync state"
Mar 14 00:36:38.579868 kubelet[2695]: I0314 00:36:38.579787 2695 factory.go:223] Registration of the systemd container factory successfully
Mar 14 00:36:38.580506 kubelet[2695]: I0314 00:36:38.580386 2695 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 14 00:36:38.607926 kubelet[2695]: I0314 00:36:38.607687 2695 factory.go:223] Registration of the containerd container factory successfully
Mar 14 00:36:38.637853 kubelet[2695]: I0314 00:36:38.637689 2695 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 14 00:36:38.641139 kubelet[2695]: I0314 00:36:38.641109 2695 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 14 00:36:38.641524 kubelet[2695]: I0314 00:36:38.641492 2695 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 14 00:36:38.642818 kubelet[2695]: I0314 00:36:38.642798 2695 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 14 00:36:38.643033 kubelet[2695]: E0314 00:36:38.643006 2695 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714351 2695 cpu_manager.go:225] "Starting" policy="none"
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714404 2695 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714436 2695 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714682 2695 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet=""
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714713 2695 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={}
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714741 2695 policy_none.go:50] "Start"
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714753 2695 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 14 00:36:38.715145 kubelet[2695]: I0314 00:36:38.714781 2695 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 14 00:36:38.717413 kubelet[2695]: I0314 00:36:38.717039 2695 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 14 00:36:38.717413 kubelet[2695]: I0314 00:36:38.717107 2695 policy_none.go:44] "Start"
Mar 14 00:36:38.732296 kubelet[2695]: E0314 00:36:38.731196 2695 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 14 00:36:38.735394 kubelet[2695]: I0314 00:36:38.735011 2695 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 14 00:36:38.735394 kubelet[2695]: I0314 00:36:38.735036 2695 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 14 00:36:38.737452 kubelet[2695]: I0314 00:36:38.735635 2695 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 14 00:36:38.748331 kubelet[2695]: E0314 00:36:38.746725 2695 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 14 00:36:38.752808 kubelet[2695]: I0314 00:36:38.752746 2695 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.753856 kubelet[2695]: I0314 00:36:38.753342 2695 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.757561 kubelet[2695]: I0314 00:36:38.756976 2695 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.770715 kubelet[2695]: I0314 00:36:38.770567 2695 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 14 00:36:38.773114 kubelet[2695]: I0314 00:36:38.771968 2695 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 14 00:36:38.773114 kubelet[2695]: E0314 00:36:38.772031 2695 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-zkxct.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.773114 kubelet[2695]: I0314 00:36:38.772715 2695 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 14 00:36:38.775495 kubelet[2695]: I0314 00:36:38.774963 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c3940fbe4432c8d709a6c465f6b5a5b9-usr-share-ca-certificates\") pod \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" (UID: \"c3940fbe4432c8d709a6c465f6b5a5b9\") " pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775495 kubelet[2695]: I0314 00:36:38.775008 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-k8s-certs\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775495 kubelet[2695]: I0314 00:36:38.775041 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d2759cf7ac9ef1235d06408beff94329-kubeconfig\") pod \"kube-scheduler-srv-zkxct.gb1.brightbox.com\" (UID: \"d2759cf7ac9ef1235d06408beff94329\") " pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775495 kubelet[2695]: I0314 00:36:38.775066 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-ca-certs\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775495 kubelet[2695]: I0314 00:36:38.775092 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-flexvolume-dir\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775782 kubelet[2695]: I0314 00:36:38.775120 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-kubeconfig\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775782 kubelet[2695]: I0314 00:36:38.775152 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5cfdb1c58853a426e774b9b953b8dafb-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-zkxct.gb1.brightbox.com\" (UID: \"5cfdb1c58853a426e774b9b953b8dafb\") " pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775782 kubelet[2695]: I0314 00:36:38.775176 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c3940fbe4432c8d709a6c465f6b5a5b9-ca-certs\") pod \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" (UID: \"c3940fbe4432c8d709a6c465f6b5a5b9\") " pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.775782 kubelet[2695]: I0314 00:36:38.775205 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c3940fbe4432c8d709a6c465f6b5a5b9-k8s-certs\") pod \"kube-apiserver-srv-zkxct.gb1.brightbox.com\" (UID: \"c3940fbe4432c8d709a6c465f6b5a5b9\") " pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.862623 kubelet[2695]: I0314 00:36:38.862345 2695 kubelet_node_status.go:74] "Attempting to register node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.879844 kubelet[2695]: I0314 00:36:38.878522 2695 kubelet_node_status.go:123] "Node was previously registered" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:38.879844 kubelet[2695]: I0314 00:36:38.878646 2695 kubelet_node_status.go:77] "Successfully registered node" node="srv-zkxct.gb1.brightbox.com"
Mar 14 00:36:39.514852 kubelet[2695]: I0314 00:36:39.514764 2695 apiserver.go:52] "Watching apiserver"
Mar 14 00:36:39.574582 kubelet[2695]: I0314 00:36:39.574544 2695 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 14 00:36:39.634030 kubelet[2695]: I0314 00:36:39.633926 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-zkxct.gb1.brightbox.com" podStartSLOduration=1.633887805 podStartE2EDuration="1.633887805s" podCreationTimestamp="2026-03-14 00:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 00:36:39.617662479 +0000 UTC m=+1.231234420" watchObservedRunningTime="2026-03-14 00:36:39.633887805 +0000 UTC m=+1.247459745"
Mar 14 00:36:39.653831 kubelet[2695]: I0314 00:36:39.653765 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-zkxct.gb1.brightbox.com" podStartSLOduration=3.6537501519999998 podStartE2EDuration="3.653750152s" podCreationTimestamp="2026-03-14 00:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 00:36:39.634511711 +0000 UTC m=+1.248083687" watchObservedRunningTime="2026-03-14 00:36:39.653750152 +0000 UTC m=+1.267322098"
Mar 14 00:36:39.671415 kubelet[2695]: I0314 00:36:39.671232 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-zkxct.gb1.brightbox.com" podStartSLOduration=1.67121827 podStartE2EDuration="1.67121827s" podCreationTimestamp="2026-03-14 00:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 00:36:39.654586294 +0000 UTC m=+1.268158244" watchObservedRunningTime="2026-03-14 00:36:39.67121827 +0000 UTC m=+1.284790210"
Mar 14 00:36:43.951342 kubelet[2695]: I0314 00:36:43.951100 2695 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 14 00:36:43.952191 containerd[1504]: time="2026-03-14T00:36:43.952096197Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 14 00:36:43.953527 kubelet[2695]: I0314 00:36:43.953435 2695 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 14 00:36:44.969888 systemd[1]: Created slice kubepods-besteffort-podc3b50d4a_261c_4df2_b515_8ee8eef7cc42.slice - libcontainer container kubepods-besteffort-podc3b50d4a_261c_4df2_b515_8ee8eef7cc42.slice.
Mar 14 00:36:45.027582 kubelet[2695]: I0314 00:36:45.027537 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c3b50d4a-261c-4df2-b515-8ee8eef7cc42-kube-proxy\") pod \"kube-proxy-lcbd5\" (UID: \"c3b50d4a-261c-4df2-b515-8ee8eef7cc42\") " pod="kube-system/kube-proxy-lcbd5"
Mar 14 00:36:45.028138 kubelet[2695]: I0314 00:36:45.027650 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c3b50d4a-261c-4df2-b515-8ee8eef7cc42-xtables-lock\") pod \"kube-proxy-lcbd5\" (UID: \"c3b50d4a-261c-4df2-b515-8ee8eef7cc42\") " pod="kube-system/kube-proxy-lcbd5"
Mar 14 00:36:45.028138 kubelet[2695]: I0314 00:36:45.027682 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3b50d4a-261c-4df2-b515-8ee8eef7cc42-lib-modules\") pod \"kube-proxy-lcbd5\" (UID: \"c3b50d4a-261c-4df2-b515-8ee8eef7cc42\") " pod="kube-system/kube-proxy-lcbd5"
Mar 14 00:36:45.028138 kubelet[2695]: I0314 00:36:45.027730 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26gr\" (UniqueName: \"kubernetes.io/projected/c3b50d4a-261c-4df2-b515-8ee8eef7cc42-kube-api-access-p26gr\") pod \"kube-proxy-lcbd5\" (UID: \"c3b50d4a-261c-4df2-b515-8ee8eef7cc42\") " pod="kube-system/kube-proxy-lcbd5"
Mar 14 00:36:45.212581 systemd[1]: Created slice kubepods-besteffort-pod17d18789_fcc2_4672_baea_944eed80eed2.slice - libcontainer container kubepods-besteffort-pod17d18789_fcc2_4672_baea_944eed80eed2.slice.
Mar 14 00:36:45.228848 kubelet[2695]: I0314 00:36:45.228660 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnbjz\" (UniqueName: \"kubernetes.io/projected/17d18789-fcc2-4672-baea-944eed80eed2-kube-api-access-wnbjz\") pod \"tigera-operator-6cf4cccc57-kn5nw\" (UID: \"17d18789-fcc2-4672-baea-944eed80eed2\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kn5nw"
Mar 14 00:36:45.228848 kubelet[2695]: I0314 00:36:45.228708 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/17d18789-fcc2-4672-baea-944eed80eed2-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-kn5nw\" (UID: \"17d18789-fcc2-4672-baea-944eed80eed2\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kn5nw"
Mar 14 00:36:45.284645 containerd[1504]: time="2026-03-14T00:36:45.284575167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lcbd5,Uid:c3b50d4a-261c-4df2-b515-8ee8eef7cc42,Namespace:kube-system,Attempt:0,}"
Mar 14 00:36:45.327582 containerd[1504]: time="2026-03-14T00:36:45.325831786Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:45.327582 containerd[1504]: time="2026-03-14T00:36:45.326432674Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:45.327582 containerd[1504]: time="2026-03-14T00:36:45.327240901Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:45.329689 containerd[1504]: time="2026-03-14T00:36:45.329516802Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:45.378696 systemd[1]: Started cri-containerd-9a969555af668d351cce947cfe4c5ee59a0fac98601b4c60a796cd695f48c89b.scope - libcontainer container 9a969555af668d351cce947cfe4c5ee59a0fac98601b4c60a796cd695f48c89b.
Mar 14 00:36:45.424485 containerd[1504]: time="2026-03-14T00:36:45.423901414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lcbd5,Uid:c3b50d4a-261c-4df2-b515-8ee8eef7cc42,Namespace:kube-system,Attempt:0,} returns sandbox id \"9a969555af668d351cce947cfe4c5ee59a0fac98601b4c60a796cd695f48c89b\""
Mar 14 00:36:45.433158 containerd[1504]: time="2026-03-14T00:36:45.433108528Z" level=info msg="CreateContainer within sandbox \"9a969555af668d351cce947cfe4c5ee59a0fac98601b4c60a796cd695f48c89b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 14 00:36:45.453700 containerd[1504]: time="2026-03-14T00:36:45.453631058Z" level=info msg="CreateContainer within sandbox \"9a969555af668d351cce947cfe4c5ee59a0fac98601b4c60a796cd695f48c89b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"df0beba56ac464e0423657fe8e618625fc7251f9fc63f85a32706344ebff1570\""
Mar 14 00:36:45.456075 containerd[1504]: time="2026-03-14T00:36:45.456028724Z" level=info msg="StartContainer for \"df0beba56ac464e0423657fe8e618625fc7251f9fc63f85a32706344ebff1570\""
Mar 14 00:36:45.494663 systemd[1]: Started cri-containerd-df0beba56ac464e0423657fe8e618625fc7251f9fc63f85a32706344ebff1570.scope - libcontainer container df0beba56ac464e0423657fe8e618625fc7251f9fc63f85a32706344ebff1570.
Mar 14 00:36:45.521439 containerd[1504]: time="2026-03-14T00:36:45.521259781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kn5nw,Uid:17d18789-fcc2-4672-baea-944eed80eed2,Namespace:tigera-operator,Attempt:0,}"
Mar 14 00:36:45.544455 containerd[1504]: time="2026-03-14T00:36:45.544305083Z" level=info msg="StartContainer for \"df0beba56ac464e0423657fe8e618625fc7251f9fc63f85a32706344ebff1570\" returns successfully"
Mar 14 00:36:45.575276 containerd[1504]: time="2026-03-14T00:36:45.574600560Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:45.575725 containerd[1504]: time="2026-03-14T00:36:45.575667016Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:45.575955 containerd[1504]: time="2026-03-14T00:36:45.575907518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:45.576189 containerd[1504]: time="2026-03-14T00:36:45.576149023Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:45.618668 systemd[1]: Started cri-containerd-832e4623a9ba595b38d39661513318075ab321259fd2ec201db08dcf5746db2a.scope - libcontainer container 832e4623a9ba595b38d39661513318075ab321259fd2ec201db08dcf5746db2a.
Mar 14 00:36:45.703924 containerd[1504]: time="2026-03-14T00:36:45.703856702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kn5nw,Uid:17d18789-fcc2-4672-baea-944eed80eed2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"832e4623a9ba595b38d39661513318075ab321259fd2ec201db08dcf5746db2a\""
Mar 14 00:36:45.708708 containerd[1504]: time="2026-03-14T00:36:45.707710082Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 14 00:36:45.734068 kubelet[2695]: I0314 00:36:45.734005 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-lcbd5" podStartSLOduration=1.733988603 podStartE2EDuration="1.733988603s" podCreationTimestamp="2026-03-14 00:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 00:36:45.733822034 +0000 UTC m=+7.347393985" watchObservedRunningTime="2026-03-14 00:36:45.733988603 +0000 UTC m=+7.347560552"
Mar 14 00:36:47.398594 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount309801603.mount: Deactivated successfully.
Mar 14 00:36:49.083745 containerd[1504]: time="2026-03-14T00:36:49.082385651Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:49.083745 containerd[1504]: time="2026-03-14T00:36:49.083674878Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156"
Mar 14 00:36:49.084437 containerd[1504]: time="2026-03-14T00:36:49.083921857Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:49.088396 containerd[1504]: time="2026-03-14T00:36:49.088363664Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:36:49.089792 containerd[1504]: time="2026-03-14T00:36:49.089752648Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 3.381990905s"
Mar 14 00:36:49.089865 containerd[1504]: time="2026-03-14T00:36:49.089821921Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\""
Mar 14 00:36:49.096332 containerd[1504]: time="2026-03-14T00:36:49.096289956Z" level=info msg="CreateContainer within sandbox \"832e4623a9ba595b38d39661513318075ab321259fd2ec201db08dcf5746db2a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 14 00:36:49.118994 containerd[1504]: time="2026-03-14T00:36:49.118937486Z" level=info msg="CreateContainer within sandbox \"832e4623a9ba595b38d39661513318075ab321259fd2ec201db08dcf5746db2a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"023b36768c1886d4ac2cb3ae1d9497d27a4e09a966d58bcb43a07a2f718184dd\""
Mar 14 00:36:49.120006 containerd[1504]: time="2026-03-14T00:36:49.119975511Z" level=info msg="StartContainer for \"023b36768c1886d4ac2cb3ae1d9497d27a4e09a966d58bcb43a07a2f718184dd\""
Mar 14 00:36:49.164691 systemd[1]: Started cri-containerd-023b36768c1886d4ac2cb3ae1d9497d27a4e09a966d58bcb43a07a2f718184dd.scope - libcontainer container 023b36768c1886d4ac2cb3ae1d9497d27a4e09a966d58bcb43a07a2f718184dd.
Mar 14 00:36:49.201680 containerd[1504]: time="2026-03-14T00:36:49.201568756Z" level=info msg="StartContainer for \"023b36768c1886d4ac2cb3ae1d9497d27a4e09a966d58bcb43a07a2f718184dd\" returns successfully"
Mar 14 00:36:51.420825 systemd[1]: Started sshd@10-10.230.50.222:22-152.32.215.203:44382.service - OpenSSH per-connection server daemon (152.32.215.203:44382).
Mar 14 00:36:52.755768 sshd[3042]: Invalid user won from 152.32.215.203 port 44382
Mar 14 00:36:53.011858 sshd[3042]: Received disconnect from 152.32.215.203 port 44382:11: Bye Bye [preauth]
Mar 14 00:36:53.011858 sshd[3042]: Disconnected from invalid user won 152.32.215.203 port 44382 [preauth]
Mar 14 00:36:53.016913 systemd[1]: sshd@10-10.230.50.222:22-152.32.215.203:44382.service: Deactivated successfully.
Mar 14 00:36:53.297569 kubelet[2695]: I0314 00:36:53.297238 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-kn5nw" podStartSLOduration=4.912812078 podStartE2EDuration="8.297216752s" podCreationTimestamp="2026-03-14 00:36:45 +0000 UTC" firstStartedPulling="2026-03-14 00:36:45.706848459 +0000 UTC m=+7.320420409" lastFinishedPulling="2026-03-14 00:36:49.091253148 +0000 UTC m=+10.704825083" observedRunningTime="2026-03-14 00:36:49.72854644 +0000 UTC m=+11.342118392" watchObservedRunningTime="2026-03-14 00:36:53.297216752 +0000 UTC m=+14.910788699"
Mar 14 00:36:56.512127 sudo[1763]: pam_unix(sudo:session): session closed for user root
Mar 14 00:36:56.605716 sshd[1760]: pam_unix(sshd:session): session closed for user core
Mar 14 00:36:56.611830 systemd[1]: sshd@8-10.230.50.222:22-20.161.92.111:36700.service: Deactivated successfully.
Mar 14 00:36:56.617071 systemd[1]: session-11.scope: Deactivated successfully.
Mar 14 00:36:56.617620 systemd[1]: session-11.scope: Consumed 4.786s CPU time, 154.6M memory peak, 0B memory swap peak.
Mar 14 00:36:56.625576 systemd-logind[1491]: Session 11 logged out. Waiting for processes to exit.
Mar 14 00:36:56.628057 systemd-logind[1491]: Removed session 11.
Mar 14 00:36:58.113545 kubelet[2695]: I0314 00:36:58.113012 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96ecbb94-06bb-4fa4-b986-26255c847a35-tigera-ca-bundle\") pod \"calico-typha-55d8dfb58-vnk8j\" (UID: \"96ecbb94-06bb-4fa4-b986-26255c847a35\") " pod="calico-system/calico-typha-55d8dfb58-vnk8j"
Mar 14 00:36:58.114196 kubelet[2695]: I0314 00:36:58.113551 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/96ecbb94-06bb-4fa4-b986-26255c847a35-typha-certs\") pod \"calico-typha-55d8dfb58-vnk8j\" (UID: \"96ecbb94-06bb-4fa4-b986-26255c847a35\") " pod="calico-system/calico-typha-55d8dfb58-vnk8j"
Mar 14 00:36:58.114196 kubelet[2695]: I0314 00:36:58.113679 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qllps\" (UniqueName: \"kubernetes.io/projected/96ecbb94-06bb-4fa4-b986-26255c847a35-kube-api-access-qllps\") pod \"calico-typha-55d8dfb58-vnk8j\" (UID: \"96ecbb94-06bb-4fa4-b986-26255c847a35\") " pod="calico-system/calico-typha-55d8dfb58-vnk8j"
Mar 14 00:36:58.121615 systemd[1]: Created slice kubepods-besteffort-pod96ecbb94_06bb_4fa4_b986_26255c847a35.slice - libcontainer container kubepods-besteffort-pod96ecbb94_06bb_4fa4_b986_26255c847a35.slice.
Mar 14 00:36:58.271313 systemd[1]: Created slice kubepods-besteffort-poddd7d6a1c_5df4_4679_888c_6084a5a4037c.slice - libcontainer container kubepods-besteffort-poddd7d6a1c_5df4_4679_888c_6084a5a4037c.slice.
Mar 14 00:36:58.315534 kubelet[2695]: I0314 00:36:58.315223 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-cni-log-dir\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.315534 kubelet[2695]: I0314 00:36:58.315290 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-xtables-lock\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.315534 kubelet[2695]: I0314 00:36:58.315319 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-sys-fs\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.315534 kubelet[2695]: I0314 00:36:58.315348 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-cni-bin-dir\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.315534 kubelet[2695]: I0314 00:36:58.315392 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd7d6a1c-5df4-4679-888c-6084a5a4037c-tigera-ca-bundle\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316011 kubelet[2695]: I0314 00:36:58.315415 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbdjp\" (UniqueName: \"kubernetes.io/projected/dd7d6a1c-5df4-4679-888c-6084a5a4037c-kube-api-access-lbdjp\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316011 kubelet[2695]: I0314 00:36:58.315440 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-flexvol-driver-host\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316536 kubelet[2695]: I0314 00:36:58.316253 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/dd7d6a1c-5df4-4679-888c-6084a5a4037c-node-certs\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316536 kubelet[2695]: I0314 00:36:58.316340 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-nodeproc\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316536 kubelet[2695]: I0314 00:36:58.316384 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-cni-net-dir\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316968 kubelet[2695]: I0314 00:36:58.316538 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-policysync\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316968 kubelet[2695]: I0314 00:36:58.316608 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-var-lib-calico\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316968 kubelet[2695]: I0314 00:36:58.316644 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-var-run-calico\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316968 kubelet[2695]: I0314 00:36:58.316706 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-bpffs\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.316968 kubelet[2695]: I0314 00:36:58.316776 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd7d6a1c-5df4-4679-888c-6084a5a4037c-lib-modules\") pod \"calico-node-fzvrp\" (UID: \"dd7d6a1c-5df4-4679-888c-6084a5a4037c\") " pod="calico-system/calico-node-fzvrp"
Mar 14 00:36:58.363492 kubelet[2695]: E0314 00:36:58.363382 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:36:58.420493 kubelet[2695]: I0314 00:36:58.418719 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85de6037-ff24-4a8c-95b3-a77667104b2e-registration-dir\") pod \"csi-node-driver-psbtg\" (UID: \"85de6037-ff24-4a8c-95b3-a77667104b2e\") " pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:36:58.420493 kubelet[2695]: I0314 00:36:58.418848 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/85de6037-ff24-4a8c-95b3-a77667104b2e-varrun\") pod \"csi-node-driver-psbtg\" (UID: \"85de6037-ff24-4a8c-95b3-a77667104b2e\") " pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:36:58.420493 kubelet[2695]: I0314 00:36:58.418876 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crd8f\" (UniqueName: \"kubernetes.io/projected/85de6037-ff24-4a8c-95b3-a77667104b2e-kube-api-access-crd8f\") pod \"csi-node-driver-psbtg\" (UID: \"85de6037-ff24-4a8c-95b3-a77667104b2e\") " pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:36:58.420493 kubelet[2695]: I0314 00:36:58.418926 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85de6037-ff24-4a8c-95b3-a77667104b2e-kubelet-dir\") pod \"csi-node-driver-psbtg\" (UID: \"85de6037-ff24-4a8c-95b3-a77667104b2e\") " pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:36:58.420493 kubelet[2695]: I0314 00:36:58.419003 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85de6037-ff24-4a8c-95b3-a77667104b2e-socket-dir\") pod \"csi-node-driver-psbtg\" (UID: \"85de6037-ff24-4a8c-95b3-a77667104b2e\") " pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:36:58.423686 kubelet[2695]: E0314 00:36:58.423650 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.423844 kubelet[2695]: W0314 00:36:58.423810 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.424022 kubelet[2695]: E0314 00:36:58.423998 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.424439 kubelet[2695]: E0314 00:36:58.424409 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.424550 kubelet[2695]: W0314 00:36:58.424531 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.424660 kubelet[2695]: E0314 00:36:58.424640 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.425193 kubelet[2695]: E0314 00:36:58.425173 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.425298 kubelet[2695]: W0314 00:36:58.425279 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.425420 kubelet[2695]: E0314 00:36:58.425400 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.425851 kubelet[2695]: E0314 00:36:58.425827 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.425951 kubelet[2695]: W0314 00:36:58.425932 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.426050 kubelet[2695]: E0314 00:36:58.426033 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.435548 containerd[1504]: time="2026-03-14T00:36:58.434188523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55d8dfb58-vnk8j,Uid:96ecbb94-06bb-4fa4-b986-26255c847a35,Namespace:calico-system,Attempt:0,}"
Mar 14 00:36:58.436156 kubelet[2695]: E0314 00:36:58.435999 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.436156 kubelet[2695]: W0314 00:36:58.436036 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.436156 kubelet[2695]: E0314 00:36:58.436063 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.479983 kubelet[2695]: E0314 00:36:58.478357 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.479983 kubelet[2695]: W0314 00:36:58.478439 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.479983 kubelet[2695]: E0314 00:36:58.478608 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.503407 containerd[1504]: time="2026-03-14T00:36:58.500590262Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:58.503407 containerd[1504]: time="2026-03-14T00:36:58.501730898Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:58.503407 containerd[1504]: time="2026-03-14T00:36:58.501763016Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:58.503407 containerd[1504]: time="2026-03-14T00:36:58.502555049Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:58.523617 kubelet[2695]: E0314 00:36:58.523561 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.523617 kubelet[2695]: W0314 00:36:58.523603 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.523861 kubelet[2695]: E0314 00:36:58.523627 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.524973 kubelet[2695]: E0314 00:36:58.524661 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.524973 kubelet[2695]: W0314 00:36:58.524683 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.524973 kubelet[2695]: E0314 00:36:58.524700 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.526225 kubelet[2695]: E0314 00:36:58.526197 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.526225 kubelet[2695]: W0314 00:36:58.526218 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.526353 kubelet[2695]: E0314 00:36:58.526235 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.529130 kubelet[2695]: E0314 00:36:58.529078 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.529398 kubelet[2695]: W0314 00:36:58.529367 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.529558 kubelet[2695]: E0314 00:36:58.529522 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.530072 kubelet[2695]: E0314 00:36:58.530040 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.530222 kubelet[2695]: W0314 00:36:58.530201 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.530319 kubelet[2695]: E0314 00:36:58.530300 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.531343 kubelet[2695]: E0314 00:36:58.531170 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.531343 kubelet[2695]: W0314 00:36:58.531189 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.531343 kubelet[2695]: E0314 00:36:58.531205 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.531938 kubelet[2695]: E0314 00:36:58.531757 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.531938 kubelet[2695]: W0314 00:36:58.531778 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.531938 kubelet[2695]: E0314 00:36:58.531802 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.532402 kubelet[2695]: E0314 00:36:58.532236 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.532402 kubelet[2695]: W0314 00:36:58.532255 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.532402 kubelet[2695]: E0314 00:36:58.532271 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.532679 kubelet[2695]: E0314 00:36:58.532661 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.532782 kubelet[2695]: W0314 00:36:58.532764 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.532897 kubelet[2695]: E0314 00:36:58.532877 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.533367 kubelet[2695]: E0314 00:36:58.533348 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.533731 kubelet[2695]: W0314 00:36:58.533458 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.533731 kubelet[2695]: E0314 00:36:58.533505 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.534209 kubelet[2695]: E0314 00:36:58.533951 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.534209 kubelet[2695]: W0314 00:36:58.533968 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.534209 kubelet[2695]: E0314 00:36:58.533985 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.534454 kubelet[2695]: E0314 00:36:58.534435 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.534630 kubelet[2695]: W0314 00:36:58.534610 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.534744 kubelet[2695]: E0314 00:36:58.534725 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.535189 kubelet[2695]: E0314 00:36:58.535169 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.535293 kubelet[2695]: W0314 00:36:58.535274 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.535414 kubelet[2695]: E0314 00:36:58.535394 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.535975 kubelet[2695]: E0314 00:36:58.535784 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.535975 kubelet[2695]: W0314 00:36:58.535801 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.535975 kubelet[2695]: E0314 00:36:58.535816 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.536207 kubelet[2695]: E0314 00:36:58.536190 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.536299 kubelet[2695]: W0314 00:36:58.536280 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.536389 kubelet[2695]: E0314 00:36:58.536371 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.536742 systemd[1]: Started cri-containerd-56a0d5886ff6f922c89785828d342f79ab4d526afef6ee00c2df8585c6c2a1b3.scope - libcontainer container 56a0d5886ff6f922c89785828d342f79ab4d526afef6ee00c2df8585c6c2a1b3.
Mar 14 00:36:58.539437 kubelet[2695]: E0314 00:36:58.538980 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.539437 kubelet[2695]: W0314 00:36:58.538999 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.539437 kubelet[2695]: E0314 00:36:58.539017 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.539437 kubelet[2695]: E0314 00:36:58.539329 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.539437 kubelet[2695]: W0314 00:36:58.539343 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.539437 kubelet[2695]: E0314 00:36:58.539358 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.540483 kubelet[2695]: E0314 00:36:58.540286 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.540483 kubelet[2695]: W0314 00:36:58.540304 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.540483 kubelet[2695]: E0314 00:36:58.540321 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.540955 kubelet[2695]: E0314 00:36:58.540803 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.540955 kubelet[2695]: W0314 00:36:58.540821 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.540955 kubelet[2695]: E0314 00:36:58.540837 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.541201 kubelet[2695]: E0314 00:36:58.541183 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.541448 kubelet[2695]: W0314 00:36:58.541274 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.541448 kubelet[2695]: E0314 00:36:58.541297 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.541657 kubelet[2695]: E0314 00:36:58.541639 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.541797 kubelet[2695]: W0314 00:36:58.541770 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.541903 kubelet[2695]: E0314 00:36:58.541883 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.545759 kubelet[2695]: E0314 00:36:58.545252 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.545759 kubelet[2695]: W0314 00:36:58.545273 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.545759 kubelet[2695]: E0314 00:36:58.545290 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.545759 kubelet[2695]: E0314 00:36:58.545648 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.545759 kubelet[2695]: W0314 00:36:58.545661 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.545759 kubelet[2695]: E0314 00:36:58.545674 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.546507 kubelet[2695]: E0314 00:36:58.546276 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.546507 kubelet[2695]: W0314 00:36:58.546295 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.546507 kubelet[2695]: E0314 00:36:58.546310 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.547275 kubelet[2695]: E0314 00:36:58.547256 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.547976 kubelet[2695]: W0314 00:36:58.547371 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.547976 kubelet[2695]: E0314 00:36:58.547408 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.565371 kubelet[2695]: E0314 00:36:58.565325 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:36:58.565371 kubelet[2695]: W0314 00:36:58.565365 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:36:58.565699 kubelet[2695]: E0314 00:36:58.565406 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:36:58.580919 containerd[1504]: time="2026-03-14T00:36:58.580373063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fzvrp,Uid:dd7d6a1c-5df4-4679-888c-6084a5a4037c,Namespace:calico-system,Attempt:0,}"
Mar 14 00:36:58.636327 containerd[1504]: time="2026-03-14T00:36:58.636124303Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 14 00:36:58.638624 containerd[1504]: time="2026-03-14T00:36:58.638180347Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 14 00:36:58.638624 containerd[1504]: time="2026-03-14T00:36:58.638303550Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:58.638624 containerd[1504]: time="2026-03-14T00:36:58.638490551Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 14 00:36:58.652955 containerd[1504]: time="2026-03-14T00:36:58.652866506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55d8dfb58-vnk8j,Uid:96ecbb94-06bb-4fa4-b986-26255c847a35,Namespace:calico-system,Attempt:0,} returns sandbox id \"56a0d5886ff6f922c89785828d342f79ab4d526afef6ee00c2df8585c6c2a1b3\""
Mar 14 00:36:58.656673 containerd[1504]: time="2026-03-14T00:36:58.656602195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 14 00:36:58.680673 systemd[1]: Started cri-containerd-59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc.scope - libcontainer container 59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc.
Mar 14 00:36:58.728054 containerd[1504]: time="2026-03-14T00:36:58.727931168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fzvrp,Uid:dd7d6a1c-5df4-4679-888c-6084a5a4037c,Namespace:calico-system,Attempt:0,} returns sandbox id \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\""
Mar 14 00:36:59.644188 kubelet[2695]: E0314 00:36:59.644067 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:00.142894 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2251767439.mount: Deactivated successfully.
Mar 14 00:37:01.644334 kubelet[2695]: E0314 00:37:01.644252 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:01.850082 containerd[1504]: time="2026-03-14T00:37:01.849989414Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:01.851840 containerd[1504]: time="2026-03-14T00:37:01.851474164Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Mar 14 00:37:01.852902 containerd[1504]: time="2026-03-14T00:37:01.852525208Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:01.857381 containerd[1504]: time="2026-03-14T00:37:01.857289685Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:01.860900 containerd[1504]: time="2026-03-14T00:37:01.860672147Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.203993373s"
Mar 14 00:37:01.860900 containerd[1504]: time="2026-03-14T00:37:01.860724376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Mar 14 00:37:01.865157 containerd[1504]: time="2026-03-14T00:37:01.862268582Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 14 00:37:01.888958 containerd[1504]: time="2026-03-14T00:37:01.888815087Z" level=info msg="CreateContainer within sandbox \"56a0d5886ff6f922c89785828d342f79ab4d526afef6ee00c2df8585c6c2a1b3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 14 00:37:01.916194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount931241525.mount: Deactivated successfully.
Mar 14 00:37:01.918594 containerd[1504]: time="2026-03-14T00:37:01.918535712Z" level=info msg="CreateContainer within sandbox \"56a0d5886ff6f922c89785828d342f79ab4d526afef6ee00c2df8585c6c2a1b3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"00e739d1bad90ad81e6496ab64ec6480c6d70e6e1e5b7bf498e120b15a126a77\""
Mar 14 00:37:01.920784 containerd[1504]: time="2026-03-14T00:37:01.920469999Z" level=info msg="StartContainer for \"00e739d1bad90ad81e6496ab64ec6480c6d70e6e1e5b7bf498e120b15a126a77\""
Mar 14 00:37:02.026819 systemd[1]: Started cri-containerd-00e739d1bad90ad81e6496ab64ec6480c6d70e6e1e5b7bf498e120b15a126a77.scope - libcontainer container 00e739d1bad90ad81e6496ab64ec6480c6d70e6e1e5b7bf498e120b15a126a77.
Mar 14 00:37:02.103678 containerd[1504]: time="2026-03-14T00:37:02.103561184Z" level=info msg="StartContainer for \"00e739d1bad90ad81e6496ab64ec6480c6d70e6e1e5b7bf498e120b15a126a77\" returns successfully"
Mar 14 00:37:02.776856 kubelet[2695]: I0314 00:37:02.776744 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-55d8dfb58-vnk8j" podStartSLOduration=1.5706131939999999 podStartE2EDuration="4.77670382s" podCreationTimestamp="2026-03-14 00:36:58 +0000 UTC" firstStartedPulling="2026-03-14 00:36:58.655758603 +0000 UTC m=+20.269330544" lastFinishedPulling="2026-03-14 00:37:01.861849236 +0000 UTC m=+23.475421170" observedRunningTime="2026-03-14 00:37:02.775195582 +0000 UTC m=+24.388767551" watchObservedRunningTime="2026-03-14 00:37:02.77670382 +0000 UTC m=+24.390275766"
Mar 14 00:37:02.843908 kubelet[2695]: E0314 00:37:02.843848 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:37:02.843908 kubelet[2695]: W0314 00:37:02.843886 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:37:02.844201 kubelet[2695]: E0314 00:37:02.843927 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the three kubelet FlexVolume lines above repeat, identical except for timestamps, roughly thirty more times through Mar 14 00:37:02.890; repeats elided]
Mar 14 00:37:03.644970 kubelet[2695]: E0314 00:37:03.643563 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:03.699566 containerd[1504]: time="2026-03-14T00:37:03.699435821Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:03.700734 containerd[1504]: time="2026-03-14T00:37:03.700688205Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250"
Mar 14 00:37:03.701977 containerd[1504]: time="2026-03-14T00:37:03.701415049Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:03.705925 containerd[1504]: time="2026-03-14T00:37:03.705181541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:03.708308 containerd[1504]: time="2026-03-14T00:37:03.707529682Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.845210299s"
Mar 14 00:37:03.708308 containerd[1504]: time="2026-03-14T00:37:03.707576269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\""
Mar 14 00:37:03.716707 containerd[1504]: time="2026-03-14T00:37:03.716595267Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 14 00:37:03.734189 containerd[1504]: time="2026-03-14T00:37:03.734141473Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05\""
Mar 14 00:37:03.736413 containerd[1504]: time="2026-03-14T00:37:03.736118665Z" level=info msg="StartContainer for \"df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05\""
Mar 14 00:37:03.767487 kubelet[2695]: I0314 00:37:03.767347 2695 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 14 00:37:03.787670 systemd[1]: Started cri-containerd-df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05.scope - libcontainer container df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05.
Mar 14 00:37:03.836001 containerd[1504]: time="2026-03-14T00:37:03.835940635Z" level=info msg="StartContainer for \"df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05\" returns successfully"
Mar 14 00:37:03.857747 kubelet[2695]: E0314 00:37:03.857693 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 14 00:37:03.857747 kubelet[2695]: W0314 00:37:03.857732 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 14 00:37:03.859928 kubelet[2695]: E0314 00:37:03.857770 2695 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 14 00:37:03.860277 systemd[1]: cri-containerd-df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05.scope: Deactivated successfully.
Mar 14 00:37:04.016067 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05-rootfs.mount: Deactivated successfully.
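The recurring kubelet driver-call errors above come from its FlexVolume probe: the kubelet executes the driver binary (here /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds) with the `init` argument and parses its stdout as JSON, so a missing binary followed by an empty-output JSON parse produces exactly the error pair logged. A rough Python sketch of that call path (the function name and return shape are illustrative, not kubelet's actual code):

```python
import json
import subprocess

def call_flexvolume_driver(executable, args):
    """Invoke a FlexVolume driver and parse its JSON reply.

    Mirrors the two failure modes in the log: a missing binary
    ("executable file not found") leaves empty output, and parsing
    empty output fails ("unexpected end of JSON input").
    """
    try:
        out = subprocess.run([executable, *args], capture_output=True,
                             text=True, timeout=30).stdout
    except FileNotFoundError:
        out = ""  # kubelet logs the lookup failure and continues with ""
    try:
        return json.loads(out)
    except json.JSONDecodeError as e:
        return {"status": "Failure", "message": str(e)}

result = call_flexvolume_driver(
    "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds",
    ["init"])
```

On a node without that driver installed, the sketch ends in the same "Failure" state the kubelet reports; the real kubelet then skips the plugin directory, which is why the messages are warnings rather than fatal errors.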
Mar 14 00:37:04.058199 containerd[1504]: time="2026-03-14T00:37:04.015769715Z" level=info msg="shim disconnected" id=df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05 namespace=k8s.io
Mar 14 00:37:04.058529 containerd[1504]: time="2026-03-14T00:37:04.058213761Z" level=warning msg="cleaning up after shim disconnected" id=df2a871cb18202aca456d796feae9c9fac82107f76c1e2ac3c5179bfbec39b05 namespace=k8s.io
Mar 14 00:37:04.058529 containerd[1504]: time="2026-03-14T00:37:04.058246550Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 14 00:37:04.776492 containerd[1504]: time="2026-03-14T00:37:04.775534988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Mar 14 00:37:05.643790 kubelet[2695]: E0314 00:37:05.643422 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:07.643798 kubelet[2695]: E0314 00:37:07.643326 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:09.644292 kubelet[2695]: E0314 00:37:09.644209 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:11.644579 kubelet[2695]: E0314 00:37:11.644508 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:13.644355 kubelet[2695]: E0314 00:37:13.644287 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:15.592692 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3089416952.mount: Deactivated successfully.
Mar 14 00:37:15.643577 kubelet[2695]: E0314 00:37:15.643449 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:15.655950 containerd[1504]: time="2026-03-14T00:37:15.652019277Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564"
Mar 14 00:37:15.656511 containerd[1504]: time="2026-03-14T00:37:15.648660678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:15.658832 containerd[1504]: time="2026-03-14T00:37:15.658798124Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:15.669163 containerd[1504]: time="2026-03-14T00:37:15.669092453Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:15.671070 containerd[1504]: time="2026-03-14T00:37:15.670122707Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 10.892896345s"
Mar 14 00:37:15.671070 containerd[1504]: time="2026-03-14T00:37:15.670185192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\""
Mar 14 00:37:15.677865 containerd[1504]: time="2026-03-14T00:37:15.677831026Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 14 00:37:15.700326 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2098211714.mount: Deactivated successfully.
Mar 14 00:37:15.715860 containerd[1504]: time="2026-03-14T00:37:15.715810934Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753\""
Mar 14 00:37:15.719436 containerd[1504]: time="2026-03-14T00:37:15.717780533Z" level=info msg="StartContainer for \"5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753\""
Mar 14 00:37:15.801731 systemd[1]: Started cri-containerd-5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753.scope - libcontainer container 5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753.
Mar 14 00:37:15.861305 containerd[1504]: time="2026-03-14T00:37:15.860898616Z" level=info msg="StartContainer for \"5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753\" returns successfully"
Mar 14 00:37:15.949989 systemd[1]: cri-containerd-5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753.scope: Deactivated successfully.
Mar 14 00:37:15.988491 containerd[1504]: time="2026-03-14T00:37:15.988300595Z" level=info msg="shim disconnected" id=5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753 namespace=k8s.io
Mar 14 00:37:15.988491 containerd[1504]: time="2026-03-14T00:37:15.988416855Z" level=warning msg="cleaning up after shim disconnected" id=5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753 namespace=k8s.io
Mar 14 00:37:15.988491 containerd[1504]: time="2026-03-14T00:37:15.988432898Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 14 00:37:16.590879 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5a17247358e296f48acd54310a271dfd7cc3699a1bb3c0c7922b917017133753-rootfs.mount: Deactivated successfully.
Mar 14 00:37:16.842042 containerd[1504]: time="2026-03-14T00:37:16.841888790Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 14 00:37:17.644215 kubelet[2695]: E0314 00:37:17.644083 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:19.644436 kubelet[2695]: E0314 00:37:19.643718 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:21.006189 kubelet[2695]: I0314 00:37:21.005448 2695 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 14 00:37:21.644040 kubelet[2695]: E0314 00:37:21.643397 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:21.700102 containerd[1504]: time="2026-03-14T00:37:21.700045227Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:21.701529 containerd[1504]: time="2026-03-14T00:37:21.701488439Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671"
Mar 14 00:37:21.703309 containerd[1504]: time="2026-03-14T00:37:21.703233173Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:21.706567 containerd[1504]: time="2026-03-14T00:37:21.706440815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 14 00:37:21.708432 containerd[1504]: time="2026-03-14T00:37:21.708286358Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.866335164s"
Mar 14 00:37:21.708432 containerd[1504]: time="2026-03-14T00:37:21.708327498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\""
Mar 14 00:37:21.715082 containerd[1504]: time="2026-03-14T00:37:21.714893316Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 14 00:37:21.732816 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1409598100.mount: Deactivated successfully.
Mar 14 00:37:21.734852 containerd[1504]: time="2026-03-14T00:37:21.734809254Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6\""
Mar 14 00:37:21.735966 containerd[1504]: time="2026-03-14T00:37:21.735918231Z" level=info msg="StartContainer for \"b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6\""
Mar 14 00:37:21.783417 systemd[1]: run-containerd-runc-k8s.io-b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6-runc.xrZMgz.mount: Deactivated successfully.
Mar 14 00:37:21.793701 systemd[1]: Started cri-containerd-b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6.scope - libcontainer container b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6.
Mar 14 00:37:21.840012 containerd[1504]: time="2026-03-14T00:37:21.839953468Z" level=info msg="StartContainer for \"b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6\" returns successfully"
Mar 14 00:37:22.964201 systemd[1]: cri-containerd-b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6.scope: Deactivated successfully.
Mar 14 00:37:22.997703 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6-rootfs.mount: Deactivated successfully.
Mar 14 00:37:23.034364 containerd[1504]: time="2026-03-14T00:37:23.034211778Z" level=info msg="shim disconnected" id=b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6 namespace=k8s.io
Mar 14 00:37:23.034364 containerd[1504]: time="2026-03-14T00:37:23.034327946Z" level=warning msg="cleaning up after shim disconnected" id=b20c4f3a3ceab14d9fd4a381101aa5f43b6a5b35457de9c351dc735445a57fa6 namespace=k8s.io
Mar 14 00:37:23.034364 containerd[1504]: time="2026-03-14T00:37:23.034345517Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 14 00:37:23.037715 kubelet[2695]: I0314 00:37:23.035638 2695 kubelet_node_status.go:427] "Fast updating node status as it just became ready"
Mar 14 00:37:23.180869 systemd[1]: Created slice kubepods-besteffort-pod2820677f_9f42_42f1_ade4_b7694612f299.slice - libcontainer container kubepods-besteffort-pod2820677f_9f42_42f1_ade4_b7694612f299.slice.
Mar 14 00:37:23.184948 systemd[1]: Created slice kubepods-besteffort-pod2ed4c211_c53b_438a_87ab_ddc36cd894a9.slice - libcontainer container kubepods-besteffort-pod2ed4c211_c53b_438a_87ab_ddc36cd894a9.slice.
Mar 14 00:37:23.186147 systemd[1]: Created slice kubepods-burstable-pod08b4124a_e53b_471c_876a_69a445df63cd.slice - libcontainer container kubepods-burstable-pod08b4124a_e53b_471c_876a_69a445df63cd.slice.
Mar 14 00:37:23.188790 systemd[1]: Created slice kubepods-burstable-pod0c8a2ac7_fa2e_44fe_9348_bf4cf1020019.slice - libcontainer container kubepods-burstable-pod0c8a2ac7_fa2e_44fe_9348_bf4cf1020019.slice.
Mar 14 00:37:23.206332 systemd[1]: Created slice kubepods-besteffort-podf4805231_80ff_447c_a875_0751f3c2712b.slice - libcontainer container kubepods-besteffort-podf4805231_80ff_447c_a875_0751f3c2712b.slice.
Mar 14 00:37:23.221986 systemd[1]: Created slice kubepods-besteffort-pod6ea7102e_c24f_4833_a40a_4027abd4e9fc.slice - libcontainer container kubepods-besteffort-pod6ea7102e_c24f_4833_a40a_4027abd4e9fc.slice.
Mar 14 00:37:23.228307 systemd[1]: Created slice kubepods-besteffort-podc2f12561_b5d4_4d53_afc2_18ef12c655af.slice - libcontainer container kubepods-besteffort-podc2f12561_b5d4_4d53_afc2_18ef12c655af.slice.
Mar 14 00:37:23.251505 kubelet[2695]: I0314 00:37:23.251108 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blm9c\" (UniqueName: \"kubernetes.io/projected/0c8a2ac7-fa2e-44fe-9348-bf4cf1020019-kube-api-access-blm9c\") pod \"coredns-7d764666f9-zs4lg\" (UID: \"0c8a2ac7-fa2e-44fe-9348-bf4cf1020019\") " pod="kube-system/coredns-7d764666f9-zs4lg"
Mar 14 00:37:23.252417 kubelet[2695]: I0314 00:37:23.252239 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ea7102e-c24f-4833-a40a-4027abd4e9fc-tigera-ca-bundle\") pod \"calico-kube-controllers-5f9c794d4f-2qjv6\" (UID: \"6ea7102e-c24f-4833-a40a-4027abd4e9fc\") " pod="calico-system/calico-kube-controllers-5f9c794d4f-2qjv6"
Mar 14 00:37:23.252417 kubelet[2695]: I0314 00:37:23.252322 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2f12561-b5d4-4d53-afc2-18ef12c655af-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-n69fb\" (UID: \"c2f12561-b5d4-4d53-afc2-18ef12c655af\") " pod="calico-system/goldmane-9f7667bb8-n69fb"
Mar 14 00:37:23.252417 kubelet[2695]: I0314 00:37:23.252406 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f12561-b5d4-4d53-afc2-18ef12c655af-config\") pod \"goldmane-9f7667bb8-n69fb\" (UID: \"c2f12561-b5d4-4d53-afc2-18ef12c655af\") " pod="calico-system/goldmane-9f7667bb8-n69fb"
Mar 14 00:37:23.252627 kubelet[2695]: I0314 00:37:23.252439 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-nginx-config\") pod \"whisker-6f59cc5c64-pb8rc\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " pod="calico-system/whisker-6f59cc5c64-pb8rc"
Mar 14 00:37:23.252627 kubelet[2695]: I0314 00:37:23.252509 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxt9r\" (UniqueName: \"kubernetes.io/projected/08b4124a-e53b-471c-876a-69a445df63cd-kube-api-access-vxt9r\") pod \"coredns-7d764666f9-5sb67\" (UID: \"08b4124a-e53b-471c-876a-69a445df63cd\") " pod="kube-system/coredns-7d764666f9-5sb67"
Mar 14 00:37:23.252627 kubelet[2695]: I0314 00:37:23.252539 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm55d\" (UniqueName: \"kubernetes.io/projected/f4805231-80ff-447c-a875-0751f3c2712b-kube-api-access-rm55d\") pod \"calico-apiserver-689dfd58b9-glmpr\" (UID: \"f4805231-80ff-447c-a875-0751f3c2712b\") " pod="calico-system/calico-apiserver-689dfd58b9-glmpr"
Mar 14 00:37:23.252627 kubelet[2695]: I0314 00:37:23.252587 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc9b4\" (UniqueName: \"kubernetes.io/projected/2820677f-9f42-42f1-ade4-b7694612f299-kube-api-access-zc9b4\") pod \"calico-apiserver-689dfd58b9-98c4z\" (UID: \"2820677f-9f42-42f1-ade4-b7694612f299\") " pod="calico-system/calico-apiserver-689dfd58b9-98c4z"
Mar 14 00:37:23.252857 kubelet[2695]: I0314 00:37:23.252624 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-backend-key-pair\") pod \"whisker-6f59cc5c64-pb8rc\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " pod="calico-system/whisker-6f59cc5c64-pb8rc"
Mar 14 00:37:23.252857 kubelet[2695]: I0314 00:37:23.252672 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr4nk\" (UniqueName: \"kubernetes.io/projected/6ea7102e-c24f-4833-a40a-4027abd4e9fc-kube-api-access-dr4nk\") pod \"calico-kube-controllers-5f9c794d4f-2qjv6\" (UID: \"6ea7102e-c24f-4833-a40a-4027abd4e9fc\") " pod="calico-system/calico-kube-controllers-5f9c794d4f-2qjv6"
Mar 14 00:37:23.252857 kubelet[2695]: I0314 00:37:23.252700 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c2f12561-b5d4-4d53-afc2-18ef12c655af-goldmane-key-pair\") pod \"goldmane-9f7667bb8-n69fb\" (UID: \"c2f12561-b5d4-4d53-afc2-18ef12c655af\") " pod="calico-system/goldmane-9f7667bb8-n69fb"
Mar 14 00:37:23.252857 kubelet[2695]: I0314 00:37:23.252764 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24t4l\" (UniqueName: \"kubernetes.io/projected/c2f12561-b5d4-4d53-afc2-18ef12c655af-kube-api-access-24t4l\") pod \"goldmane-9f7667bb8-n69fb\" (UID: \"c2f12561-b5d4-4d53-afc2-18ef12c655af\") " pod="calico-system/goldmane-9f7667bb8-n69fb"
Mar 14 00:37:23.252857 kubelet[2695]: I0314 00:37:23.252846 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-ca-bundle\") pod \"whisker-6f59cc5c64-pb8rc\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " pod="calico-system/whisker-6f59cc5c64-pb8rc"
Mar 14 00:37:23.253072 kubelet[2695]: I0314 00:37:23.252881 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2820677f-9f42-42f1-ade4-b7694612f299-calico-apiserver-certs\") pod \"calico-apiserver-689dfd58b9-98c4z\" (UID: \"2820677f-9f42-42f1-ade4-b7694612f299\") " pod="calico-system/calico-apiserver-689dfd58b9-98c4z"
Mar 14 00:37:23.253072 kubelet[2695]: I0314 00:37:23.252934 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfqv\" (UniqueName: \"kubernetes.io/projected/2ed4c211-c53b-438a-87ab-ddc36cd894a9-kube-api-access-5kfqv\") pod \"whisker-6f59cc5c64-pb8rc\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " pod="calico-system/whisker-6f59cc5c64-pb8rc"
Mar 14 00:37:23.253072 kubelet[2695]: I0314 00:37:23.252968 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c8a2ac7-fa2e-44fe-9348-bf4cf1020019-config-volume\") pod \"coredns-7d764666f9-zs4lg\" (UID: \"0c8a2ac7-fa2e-44fe-9348-bf4cf1020019\") " pod="kube-system/coredns-7d764666f9-zs4lg"
Mar 14 00:37:23.253072 kubelet[2695]: I0314 00:37:23.253051 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f4805231-80ff-447c-a875-0751f3c2712b-calico-apiserver-certs\") pod \"calico-apiserver-689dfd58b9-glmpr\" (UID: \"f4805231-80ff-447c-a875-0751f3c2712b\") " pod="calico-system/calico-apiserver-689dfd58b9-glmpr"
Mar 14 00:37:23.253295 kubelet[2695]: I0314 00:37:23.253176 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08b4124a-e53b-471c-876a-69a445df63cd-config-volume\") pod \"coredns-7d764666f9-5sb67\" (UID: \"08b4124a-e53b-471c-876a-69a445df63cd\") " pod="kube-system/coredns-7d764666f9-5sb67"
Mar 14 00:37:23.506595 containerd[1504]: time="2026-03-14T00:37:23.506216216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-98c4z,Uid:2820677f-9f42-42f1-ade4-b7694612f299,Namespace:calico-system,Attempt:0,}"
Mar 14 00:37:23.509133 containerd[1504]: time="2026-03-14T00:37:23.508989396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f59cc5c64-pb8rc,Uid:2ed4c211-c53b-438a-87ab-ddc36cd894a9,Namespace:calico-system,Attempt:0,}"
Mar 14 00:37:23.511680 containerd[1504]: time="2026-03-14T00:37:23.511558551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-5sb67,Uid:08b4124a-e53b-471c-876a-69a445df63cd,Namespace:kube-system,Attempt:0,}"
Mar 14 00:37:23.513659 containerd[1504]: time="2026-03-14T00:37:23.513438741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zs4lg,Uid:0c8a2ac7-fa2e-44fe-9348-bf4cf1020019,Namespace:kube-system,Attempt:0,}"
Mar 14 00:37:23.520103 containerd[1504]: time="2026-03-14T00:37:23.519860338Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-glmpr,Uid:f4805231-80ff-447c-a875-0751f3c2712b,Namespace:calico-system,Attempt:0,}"
Mar 14 00:37:23.528372 containerd[1504]: time="2026-03-14T00:37:23.528224611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9c794d4f-2qjv6,Uid:6ea7102e-c24f-4833-a40a-4027abd4e9fc,Namespace:calico-system,Attempt:0,}"
Mar 14 00:37:23.535770 containerd[1504]: time="2026-03-14T00:37:23.535388916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-n69fb,Uid:c2f12561-b5d4-4d53-afc2-18ef12c655af,Namespace:calico-system,Attempt:0,}"
Mar 14 00:37:23.663561 systemd[1]: Created slice kubepods-besteffort-pod85de6037_ff24_4a8c_95b3_a77667104b2e.slice - libcontainer container kubepods-besteffort-pod85de6037_ff24_4a8c_95b3_a77667104b2e.slice.
Mar 14 00:37:23.710446 containerd[1504]: time="2026-03-14T00:37:23.706304121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psbtg,Uid:85de6037-ff24-4a8c-95b3-a77667104b2e,Namespace:calico-system,Attempt:0,}"
Mar 14 00:37:23.978910 containerd[1504]: time="2026-03-14T00:37:23.978855970Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 14 00:37:24.072835 containerd[1504]: time="2026-03-14T00:37:24.072740225Z" level=info msg="CreateContainer within sandbox \"59f55397b6e4aad4f1c3e44c248b957b429ad5fe7e9f0f2d2f30dc24519359cc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a74e524a33317ff9cf7c964e89148d5a4eda0317390e3369c9a3d90653bf5e59\""
Mar 14 00:37:24.076059 containerd[1504]: time="2026-03-14T00:37:24.074626184Z" level=info msg="StartContainer for \"a74e524a33317ff9cf7c964e89148d5a4eda0317390e3369c9a3d90653bf5e59\""
Mar 14 00:37:24.155194 containerd[1504]: time="2026-03-14T00:37:24.155064578Z" level=error msg="Failed to destroy network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.161583 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4-shm.mount: Deactivated successfully.
Mar 14 00:37:24.166184 containerd[1504]: time="2026-03-14T00:37:24.166031646Z" level=error msg="encountered an error cleaning up failed sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.166372 containerd[1504]: time="2026-03-14T00:37:24.166194695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f59cc5c64-pb8rc,Uid:2ed4c211-c53b-438a-87ab-ddc36cd894a9,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.191268 kubelet[2695]: E0314 00:37:24.189679 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.192038 kubelet[2695]: E0314 00:37:24.191358 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f59cc5c64-pb8rc"
Mar 14 00:37:24.192038 kubelet[2695]: E0314 00:37:24.191397 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f59cc5c64-pb8rc"
Mar 14 00:37:24.192038 kubelet[2695]: E0314 00:37:24.191553 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6f59cc5c64-pb8rc_calico-system(2ed4c211-c53b-438a-87ab-ddc36cd894a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6f59cc5c64-pb8rc_calico-system(2ed4c211-c53b-438a-87ab-ddc36cd894a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6f59cc5c64-pb8rc" podUID="2ed4c211-c53b-438a-87ab-ddc36cd894a9"
Mar 14 00:37:24.275978 containerd[1504]: time="2026-03-14T00:37:24.266115997Z" level=error msg="Failed to destroy network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.275978 containerd[1504]: time="2026-03-14T00:37:24.266676139Z" level=error msg="encountered an error cleaning up failed sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.275978 containerd[1504]: time="2026-03-14T00:37:24.266760040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-5sb67,Uid:08b4124a-e53b-471c-876a-69a445df63cd,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.275978 containerd[1504]: time="2026-03-14T00:37:24.270754579Z" level=error msg="Failed to destroy network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.280549 kubelet[2695]: E0314 00:37:24.269281 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.280549 kubelet[2695]: E0314 00:37:24.269384 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-5sb67"
Mar 14 00:37:24.280549 kubelet[2695]: E0314 00:37:24.269439 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-5sb67"
Mar 14 00:37:24.274911 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be-shm.mount: Deactivated successfully.
Mar 14 00:37:24.288793 containerd[1504]: time="2026-03-14T00:37:24.280566343Z" level=error msg="encountered an error cleaning up failed sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.288793 containerd[1504]: time="2026-03-14T00:37:24.280656946Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psbtg,Uid:85de6037-ff24-4a8c-95b3-a77667104b2e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.291571 kubelet[2695]: E0314 00:37:24.271615 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-5sb67_kube-system(08b4124a-e53b-471c-876a-69a445df63cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-5sb67_kube-system(08b4124a-e53b-471c-876a-69a445df63cd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-5sb67" podUID="08b4124a-e53b-471c-876a-69a445df63cd"
Mar 14 00:37:24.291571 kubelet[2695]: E0314 00:37:24.281180 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.291571 kubelet[2695]: E0314 00:37:24.281289 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:37:24.292557 kubelet[2695]: E0314 00:37:24.281316 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-psbtg"
Mar 14 00:37:24.292557 kubelet[2695]: E0314 00:37:24.281388 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-psbtg_calico-system(85de6037-ff24-4a8c-95b3-a77667104b2e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-psbtg_calico-system(85de6037-ff24-4a8c-95b3-a77667104b2e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-psbtg" podUID="85de6037-ff24-4a8c-95b3-a77667104b2e"
Mar 14 00:37:24.303948 containerd[1504]: time="2026-03-14T00:37:24.303885457Z" level=error msg="Failed to destroy network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.304591 containerd[1504]: time="2026-03-14T00:37:24.304516539Z" level=error msg="encountered an error cleaning up failed sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.304693 containerd[1504]: time="2026-03-14T00:37:24.304655643Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-98c4z,Uid:2820677f-9f42-42f1-ade4-b7694612f299,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.304965 containerd[1504]: time="2026-03-14T00:37:24.304915608Z" level=error msg="Failed to destroy network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.305229 kubelet[2695]: E0314 00:37:24.305172 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.305302 kubelet[2695]: E0314 00:37:24.305257 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-689dfd58b9-98c4z"
Mar 14 00:37:24.305302 kubelet[2695]: E0314 00:37:24.305286 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-689dfd58b9-98c4z"
Mar 14 00:37:24.305994 kubelet[2695]: E0314 00:37:24.305364 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-689dfd58b9-98c4z_calico-system(2820677f-9f42-42f1-ade4-b7694612f299)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-689dfd58b9-98c4z_calico-system(2820677f-9f42-42f1-ade4-b7694612f299)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-689dfd58b9-98c4z" podUID="2820677f-9f42-42f1-ade4-b7694612f299"
Mar 14 00:37:24.306300 containerd[1504]: time="2026-03-14T00:37:24.305729047Z" level=error msg="Failed to destroy network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.308010 containerd[1504]: time="2026-03-14T00:37:24.307971963Z" level=error msg="encountered an error cleaning up failed sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 14 00:37:24.308560 containerd[1504]: time="2026-03-14T00:37:24.308437936Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-glmpr,Uid:f4805231-80ff-447c-a875-0751f3c2712b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.308920 containerd[1504]: time="2026-03-14T00:37:24.308729988Z" level=error msg="encountered an error cleaning up failed sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.309164 containerd[1504]: time="2026-03-14T00:37:24.309126780Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-n69fb,Uid:c2f12561-b5d4-4d53-afc2-18ef12c655af,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.310063 kubelet[2695]: E0314 00:37:24.309641 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.310063 kubelet[2695]: E0314 00:37:24.309738 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-9f7667bb8-n69fb" Mar 14 00:37:24.310063 kubelet[2695]: E0314 00:37:24.309781 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-n69fb" Mar 14 00:37:24.310241 kubelet[2695]: E0314 00:37:24.309867 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-n69fb_calico-system(c2f12561-b5d4-4d53-afc2-18ef12c655af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-n69fb_calico-system(c2f12561-b5d4-4d53-afc2-18ef12c655af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-n69fb" podUID="c2f12561-b5d4-4d53-afc2-18ef12c655af" Mar 14 00:37:24.310241 kubelet[2695]: E0314 00:37:24.309663 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.310241 kubelet[2695]: E0314 00:37:24.309942 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-689dfd58b9-glmpr" Mar 14 00:37:24.310410 kubelet[2695]: E0314 00:37:24.309964 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-689dfd58b9-glmpr" Mar 14 00:37:24.310410 kubelet[2695]: E0314 00:37:24.310005 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-689dfd58b9-glmpr_calico-system(f4805231-80ff-447c-a875-0751f3c2712b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-689dfd58b9-glmpr_calico-system(f4805231-80ff-447c-a875-0751f3c2712b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-689dfd58b9-glmpr" podUID="f4805231-80ff-447c-a875-0751f3c2712b" Mar 14 00:37:24.333056 containerd[1504]: time="2026-03-14T00:37:24.332753219Z" level=error msg="Failed to destroy network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 14 00:37:24.334898 containerd[1504]: time="2026-03-14T00:37:24.334846971Z" level=error msg="Failed to destroy network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.335875 containerd[1504]: time="2026-03-14T00:37:24.335822364Z" level=error msg="encountered an error cleaning up failed sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.335966 containerd[1504]: time="2026-03-14T00:37:24.335895322Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zs4lg,Uid:0c8a2ac7-fa2e-44fe-9348-bf4cf1020019,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.336185 kubelet[2695]: E0314 00:37:24.336118 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.336276 kubelet[2695]: E0314 00:37:24.336200 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-zs4lg" Mar 14 00:37:24.336276 kubelet[2695]: E0314 00:37:24.336239 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-zs4lg" Mar 14 00:37:24.336956 kubelet[2695]: E0314 00:37:24.336312 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-zs4lg_kube-system(0c8a2ac7-fa2e-44fe-9348-bf4cf1020019)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-zs4lg_kube-system(0c8a2ac7-fa2e-44fe-9348-bf4cf1020019)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-zs4lg" podUID="0c8a2ac7-fa2e-44fe-9348-bf4cf1020019" Mar 14 00:37:24.337284 containerd[1504]: time="2026-03-14T00:37:24.337141239Z" level=error msg="encountered an error cleaning up failed sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.337284 containerd[1504]: time="2026-03-14T00:37:24.337219857Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9c794d4f-2qjv6,Uid:6ea7102e-c24f-4833-a40a-4027abd4e9fc,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.337684 systemd[1]: Started cri-containerd-a74e524a33317ff9cf7c964e89148d5a4eda0317390e3369c9a3d90653bf5e59.scope - libcontainer container a74e524a33317ff9cf7c964e89148d5a4eda0317390e3369c9a3d90653bf5e59. Mar 14 00:37:24.338163 kubelet[2695]: E0314 00:37:24.337789 2695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 14 00:37:24.338163 kubelet[2695]: E0314 00:37:24.337839 2695 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f9c794d4f-2qjv6" Mar 14 00:37:24.338163 kubelet[2695]: E0314 00:37:24.337869 2695 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f9c794d4f-2qjv6" Mar 14 00:37:24.338339 kubelet[2695]: E0314 00:37:24.337922 2695 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5f9c794d4f-2qjv6_calico-system(6ea7102e-c24f-4833-a40a-4027abd4e9fc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5f9c794d4f-2qjv6_calico-system(6ea7102e-c24f-4833-a40a-4027abd4e9fc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5f9c794d4f-2qjv6" podUID="6ea7102e-c24f-4833-a40a-4027abd4e9fc" Mar 14 00:37:24.404231 containerd[1504]: time="2026-03-14T00:37:24.404129405Z" level=info msg="StartContainer for \"a74e524a33317ff9cf7c964e89148d5a4eda0317390e3369c9a3d90653bf5e59\" returns successfully" Mar 14 00:37:24.939109 kubelet[2695]: I0314 00:37:24.925240 2695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:24.942405 kubelet[2695]: I0314 00:37:24.942374 2695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:24.945563 kubelet[2695]: I0314 00:37:24.945531 2695 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:24.996275 kubelet[2695]: I0314 00:37:24.994186 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-fzvrp" podStartSLOduration=1.78508471 podStartE2EDuration="26.994165494s" podCreationTimestamp="2026-03-14 00:36:58 +0000 UTC" firstStartedPulling="2026-03-14 00:36:58.730712672 +0000 UTC m=+20.344284612" lastFinishedPulling="2026-03-14 00:37:23.939793455 +0000 UTC m=+45.553365396" observedRunningTime="2026-03-14 00:37:24.99205592 +0000 UTC m=+46.605627901" watchObservedRunningTime="2026-03-14 00:37:24.994165494 +0000 UTC m=+46.607737440" Mar 14 00:37:25.001373 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a-shm.mount: Deactivated successfully. Mar 14 00:37:25.001598 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0-shm.mount: Deactivated successfully. Mar 14 00:37:25.001734 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23-shm.mount: Deactivated successfully. Mar 14 00:37:25.002750 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0-shm.mount: Deactivated successfully. Mar 14 00:37:25.003155 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a-shm.mount: Deactivated successfully. Mar 14 00:37:25.003296 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5-shm.mount: Deactivated successfully. 
Mar 14 00:37:25.034797 containerd[1504]: time="2026-03-14T00:37:25.034500285Z" level=info msg="StopPodSandbox for \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\"" Mar 14 00:37:25.037584 containerd[1504]: time="2026-03-14T00:37:25.036951106Z" level=info msg="Ensure that sandbox 5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5 in task-service has been cleanup successfully" Mar 14 00:37:25.039410 containerd[1504]: time="2026-03-14T00:37:25.038671913Z" level=info msg="StopPodSandbox for \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\"" Mar 14 00:37:25.039410 containerd[1504]: time="2026-03-14T00:37:25.038976630Z" level=info msg="Ensure that sandbox 90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a in task-service has been cleanup successfully" Mar 14 00:37:25.050703 containerd[1504]: time="2026-03-14T00:37:25.050599134Z" level=info msg="StopPodSandbox for \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\"" Mar 14 00:37:25.067037 kubelet[2695]: I0314 00:37:25.066382 2695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:25.075869 containerd[1504]: time="2026-03-14T00:37:25.073447482Z" level=info msg="Ensure that sandbox ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a in task-service has been cleanup successfully" Mar 14 00:37:25.075869 containerd[1504]: time="2026-03-14T00:37:25.075625340Z" level=info msg="StopPodSandbox for \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\"" Mar 14 00:37:25.076751 containerd[1504]: time="2026-03-14T00:37:25.076721266Z" level=info msg="Ensure that sandbox 4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23 in task-service has been cleanup successfully" Mar 14 00:37:25.106178 kubelet[2695]: I0314 00:37:25.106126 2695 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:25.116943 containerd[1504]: time="2026-03-14T00:37:25.116895599Z" level=info msg="StopPodSandbox for \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\"" Mar 14 00:37:25.117839 containerd[1504]: time="2026-03-14T00:37:25.117784717Z" level=info msg="Ensure that sandbox 3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0 in task-service has been cleanup successfully" Mar 14 00:37:25.119444 kubelet[2695]: I0314 00:37:25.119405 2695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:25.122067 containerd[1504]: time="2026-03-14T00:37:25.122017561Z" level=info msg="StopPodSandbox for \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\"" Mar 14 00:37:25.122326 containerd[1504]: time="2026-03-14T00:37:25.122287748Z" level=info msg="Ensure that sandbox 6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0 in task-service has been cleanup successfully" Mar 14 00:37:25.135038 kubelet[2695]: I0314 00:37:25.134193 2695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:25.135448 containerd[1504]: time="2026-03-14T00:37:25.135389822Z" level=info msg="StopPodSandbox for \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\"" Mar 14 00:37:25.136324 containerd[1504]: time="2026-03-14T00:37:25.136293423Z" level=info msg="Ensure that sandbox 1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be in task-service has been cleanup successfully" Mar 14 00:37:25.146615 kubelet[2695]: I0314 00:37:25.146571 2695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:25.148543 containerd[1504]: 
time="2026-03-14T00:37:25.147441654Z" level=info msg="StopPodSandbox for \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\"" Mar 14 00:37:25.149064 containerd[1504]: time="2026-03-14T00:37:25.148991091Z" level=info msg="Ensure that sandbox 602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4 in task-service has been cleanup successfully" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.607 [INFO][3830] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.608 [INFO][3830] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" iface="eth0" netns="/var/run/netns/cni-27c7d8bc-5901-3e3e-30d7-573cecf84e7d" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.608 [INFO][3830] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" iface="eth0" netns="/var/run/netns/cni-27c7d8bc-5901-3e3e-30d7-573cecf84e7d" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.609 [INFO][3830] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" iface="eth0" netns="/var/run/netns/cni-27c7d8bc-5901-3e3e-30d7-573cecf84e7d" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.609 [INFO][3830] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.609 [INFO][3830] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.829 [INFO][3939] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.831 [INFO][3939] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.832 [INFO][3939] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.886 [WARNING][3939] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.886 [INFO][3939] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.893 [INFO][3939] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:25.922397 containerd[1504]: 2026-03-14 00:37:25.905 [INFO][3830] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:25.928969 containerd[1504]: time="2026-03-14T00:37:25.924760067Z" level=info msg="TearDown network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\" successfully" Mar 14 00:37:25.928969 containerd[1504]: time="2026-03-14T00:37:25.925090980Z" level=info msg="StopPodSandbox for \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\" returns successfully" Mar 14 00:37:25.932006 systemd[1]: run-netns-cni\x2d27c7d8bc\x2d5901\x2d3e3e\x2d30d7\x2d573cecf84e7d.mount: Deactivated successfully. 
Mar 14 00:37:25.944571 containerd[1504]: time="2026-03-14T00:37:25.943668798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-n69fb,Uid:c2f12561-b5d4-4d53-afc2-18ef12c655af,Namespace:calico-system,Attempt:1,}" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.607 [INFO][3816] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.608 [INFO][3816] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" iface="eth0" netns="/var/run/netns/cni-90766107-ecf0-58d3-c3f3-712e9bb37c9f" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.609 [INFO][3816] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" iface="eth0" netns="/var/run/netns/cni-90766107-ecf0-58d3-c3f3-712e9bb37c9f" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.610 [INFO][3816] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" iface="eth0" netns="/var/run/netns/cni-90766107-ecf0-58d3-c3f3-712e9bb37c9f" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.610 [INFO][3816] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.610 [INFO][3816] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.829 [INFO][3941] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.831 [INFO][3941] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.894 [INFO][3941] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.931 [WARNING][3941] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.931 [INFO][3941] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.942 [INFO][3941] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:25.966477 containerd[1504]: 2026-03-14 00:37:25.957 [INFO][3816] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:25.968157 containerd[1504]: time="2026-03-14T00:37:25.967604822Z" level=info msg="TearDown network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\" successfully" Mar 14 00:37:25.968157 containerd[1504]: time="2026-03-14T00:37:25.967645759Z" level=info msg="StopPodSandbox for \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\" returns successfully" Mar 14 00:37:25.982253 systemd[1]: run-netns-cni\x2d90766107\x2decf0\x2d58d3\x2dc3f3\x2d712e9bb37c9f.mount: Deactivated successfully. 
Mar 14 00:37:26.001211 containerd[1504]: time="2026-03-14T00:37:26.000207899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zs4lg,Uid:0c8a2ac7-fa2e-44fe-9348-bf4cf1020019,Namespace:kube-system,Attempt:1,}" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.667 [INFO][3863] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.669 [INFO][3863] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" iface="eth0" netns="/var/run/netns/cni-08ac806d-fd8d-7aad-43ea-32ad17a53ae0" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.671 [INFO][3863] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" iface="eth0" netns="/var/run/netns/cni-08ac806d-fd8d-7aad-43ea-32ad17a53ae0" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.673 [INFO][3863] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" iface="eth0" netns="/var/run/netns/cni-08ac806d-fd8d-7aad-43ea-32ad17a53ae0" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.673 [INFO][3863] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.673 [INFO][3863] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.831 [INFO][3952] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.835 [INFO][3952] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:25.944 [INFO][3952] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:26.010 [WARNING][3952] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:26.010 [INFO][3952] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:26.015 [INFO][3952] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.051581 containerd[1504]: 2026-03-14 00:37:26.037 [INFO][3863] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:26.061607 containerd[1504]: time="2026-03-14T00:37:26.051574585Z" level=info msg="TearDown network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\" successfully" Mar 14 00:37:26.061607 containerd[1504]: time="2026-03-14T00:37:26.051632268Z" level=info msg="StopPodSandbox for \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\" returns successfully" Mar 14 00:37:26.061200 systemd[1]: run-netns-cni\x2d08ac806d\x2dfd8d\x2d7aad\x2d43ea\x2d32ad17a53ae0.mount: Deactivated successfully. 
Mar 14 00:37:26.068444 containerd[1504]: time="2026-03-14T00:37:26.068389631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psbtg,Uid:85de6037-ff24-4a8c-95b3-a77667104b2e,Namespace:calico-system,Attempt:1,}" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.601 [INFO][3880] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.601 [INFO][3880] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" iface="eth0" netns="/var/run/netns/cni-c0ef4b9f-b0ad-ea4d-2569-5daafea0a79a" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.602 [INFO][3880] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" iface="eth0" netns="/var/run/netns/cni-c0ef4b9f-b0ad-ea4d-2569-5daafea0a79a" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.603 [INFO][3880] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" iface="eth0" netns="/var/run/netns/cni-c0ef4b9f-b0ad-ea4d-2569-5daafea0a79a" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.603 [INFO][3880] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.603 [INFO][3880] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.834 [INFO][3937] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:25.836 [INFO][3937] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:26.015 [INFO][3937] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:26.072 [WARNING][3937] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:26.072 [INFO][3937] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:26.093 [INFO][3937] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.150336 containerd[1504]: 2026-03-14 00:37:26.126 [INFO][3880] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:26.160789 systemd[1]: run-netns-cni\x2dc0ef4b9f\x2db0ad\x2dea4d\x2d2569\x2d5daafea0a79a.mount: Deactivated successfully. 
Mar 14 00:37:26.167776 containerd[1504]: time="2026-03-14T00:37:26.167718292Z" level=info msg="TearDown network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\" successfully" Mar 14 00:37:26.167776 containerd[1504]: time="2026-03-14T00:37:26.167761493Z" level=info msg="StopPodSandbox for \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\" returns successfully" Mar 14 00:37:26.172938 containerd[1504]: time="2026-03-14T00:37:26.172836600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-glmpr,Uid:f4805231-80ff-447c-a875-0751f3c2712b,Namespace:calico-system,Attempt:1,}" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:25.825 [INFO][3864] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:25.827 [INFO][3864] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" iface="eth0" netns="/var/run/netns/cni-0d32ccb4-373d-5210-d3b5-e8e069598eca" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:25.828 [INFO][3864] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" iface="eth0" netns="/var/run/netns/cni-0d32ccb4-373d-5210-d3b5-e8e069598eca" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:25.832 [INFO][3864] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" iface="eth0" netns="/var/run/netns/cni-0d32ccb4-373d-5210-d3b5-e8e069598eca" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:25.832 [INFO][3864] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:25.832 [INFO][3864] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.082 [INFO][3973] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.082 [INFO][3973] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.097 [INFO][3973] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.146 [WARNING][3973] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.146 [INFO][3973] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.169 [INFO][3973] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.248645 containerd[1504]: 2026-03-14 00:37:26.213 [INFO][3864] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:26.249593 containerd[1504]: time="2026-03-14T00:37:26.248871679Z" level=info msg="TearDown network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\" successfully" Mar 14 00:37:26.249593 containerd[1504]: time="2026-03-14T00:37:26.248909975Z" level=info msg="StopPodSandbox for \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\" returns successfully" Mar 14 00:37:26.253513 systemd[1]: run-netns-cni\x2d0d32ccb4\x2d373d\x2d5210\x2dd3b5\x2de8e069598eca.mount: Deactivated successfully. 
Mar 14 00:37:26.258725 containerd[1504]: time="2026-03-14T00:37:26.256869121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9c794d4f-2qjv6,Uid:6ea7102e-c24f-4833-a40a-4027abd4e9fc,Namespace:calico-system,Attempt:1,}" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:25.796 [INFO][3815] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:25.797 [INFO][3815] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" iface="eth0" netns="/var/run/netns/cni-7484aa7c-3844-34e8-cbf0-ec29de5dead4" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:25.799 [INFO][3815] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" iface="eth0" netns="/var/run/netns/cni-7484aa7c-3844-34e8-cbf0-ec29de5dead4" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:25.806 [INFO][3815] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" iface="eth0" netns="/var/run/netns/cni-7484aa7c-3844-34e8-cbf0-ec29de5dead4" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:25.806 [INFO][3815] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:25.806 [INFO][3815] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.238 [INFO][3966] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.238 [INFO][3966] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.238 [INFO][3966] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.294 [WARNING][3966] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.294 [INFO][3966] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.296 [INFO][3966] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.319559 containerd[1504]: 2026-03-14 00:37:26.314 [INFO][3815] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:26.319559 containerd[1504]: time="2026-03-14T00:37:26.318490514Z" level=info msg="TearDown network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\" successfully" Mar 14 00:37:26.319559 containerd[1504]: time="2026-03-14T00:37:26.318758001Z" level=info msg="StopPodSandbox for \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\" returns successfully" Mar 14 00:37:26.334001 containerd[1504]: time="2026-03-14T00:37:26.333063324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-98c4z,Uid:2820677f-9f42-42f1-ade4-b7694612f299,Namespace:calico-system,Attempt:1,}" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:25.851 [INFO][3901] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:25.855 [INFO][3901] cni-plugin/dataplane_linux.go 559: Deleting workload's 
device in netns. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" iface="eth0" netns="/var/run/netns/cni-464480f3-bb9a-7ad4-d2a1-d18fae268ea6" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:25.855 [INFO][3901] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" iface="eth0" netns="/var/run/netns/cni-464480f3-bb9a-7ad4-d2a1-d18fae268ea6" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:25.857 [INFO][3901] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" iface="eth0" netns="/var/run/netns/cni-464480f3-bb9a-7ad4-d2a1-d18fae268ea6" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:25.859 [INFO][3901] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:25.860 [INFO][3901] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.306 [INFO][3980] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.307 [INFO][3980] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.307 [INFO][3980] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.327 [WARNING][3980] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.328 [INFO][3980] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.334 [INFO][3980] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.346966 containerd[1504]: 2026-03-14 00:37:26.342 [INFO][3901] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:26.350018 containerd[1504]: time="2026-03-14T00:37:26.346963022Z" level=info msg="TearDown network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\" successfully" Mar 14 00:37:26.350018 containerd[1504]: time="2026-03-14T00:37:26.347033342Z" level=info msg="StopPodSandbox for \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\" returns successfully" Mar 14 00:37:26.350128 containerd[1504]: time="2026-03-14T00:37:26.350036017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-5sb67,Uid:08b4124a-e53b-471c-876a-69a445df63cd,Namespace:kube-system,Attempt:1,}" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:25.845 [INFO][3900] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:25.851 [INFO][3900] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" iface="eth0" netns="/var/run/netns/cni-e538fddc-1dd9-2dc0-c88a-e6d4e0552f6b" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:25.853 [INFO][3900] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" iface="eth0" netns="/var/run/netns/cni-e538fddc-1dd9-2dc0-c88a-e6d4e0552f6b" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:25.857 [INFO][3900] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" iface="eth0" netns="/var/run/netns/cni-e538fddc-1dd9-2dc0-c88a-e6d4e0552f6b" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:25.858 [INFO][3900] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:25.858 [INFO][3900] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.389 [INFO][3979] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.389 [INFO][3979] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.389 [INFO][3979] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.414 [WARNING][3979] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.414 [INFO][3979] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.422 [INFO][3979] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.436996 containerd[1504]: 2026-03-14 00:37:26.429 [INFO][3900] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:26.439909 containerd[1504]: time="2026-03-14T00:37:26.439828258Z" level=info msg="TearDown network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\" successfully" Mar 14 00:37:26.439909 containerd[1504]: time="2026-03-14T00:37:26.439882932Z" level=info msg="StopPodSandbox for \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\" returns successfully" Mar 14 00:37:26.602984 kubelet[2695]: I0314 00:37:26.602558 2695 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/2ed4c211-c53b-438a-87ab-ddc36cd894a9-kube-api-access-5kfqv\" (UniqueName: \"kubernetes.io/projected/2ed4c211-c53b-438a-87ab-ddc36cd894a9-kube-api-access-5kfqv\") pod \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " Mar 14 00:37:26.608747 kubelet[2695]: I0314 00:37:26.608713 2695 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume 
\"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-nginx-config\" (UniqueName: \"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-nginx-config\") pod \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " Mar 14 00:37:26.608947 kubelet[2695]: I0314 00:37:26.608920 2695 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-backend-key-pair\") pod \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " Mar 14 00:37:26.609081 kubelet[2695]: I0314 00:37:26.609058 2695 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-ca-bundle\") pod \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\" (UID: \"2ed4c211-c53b-438a-87ab-ddc36cd894a9\") " Mar 14 00:37:26.632944 kubelet[2695]: I0314 00:37:26.627866 2695 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-nginx-config" pod "2ed4c211-c53b-438a-87ab-ddc36cd894a9" (UID: "2ed4c211-c53b-438a-87ab-ddc36cd894a9"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 14 00:37:26.632944 kubelet[2695]: I0314 00:37:26.631683 2695 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed4c211-c53b-438a-87ab-ddc36cd894a9-kube-api-access-5kfqv" pod "2ed4c211-c53b-438a-87ab-ddc36cd894a9" (UID: "2ed4c211-c53b-438a-87ab-ddc36cd894a9"). InnerVolumeSpecName "kube-api-access-5kfqv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 14 00:37:26.632944 kubelet[2695]: I0314 00:37:26.631715 2695 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-ca-bundle" pod "2ed4c211-c53b-438a-87ab-ddc36cd894a9" (UID: "2ed4c211-c53b-438a-87ab-ddc36cd894a9"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 14 00:37:26.636580 kubelet[2695]: I0314 00:37:26.636228 2695 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-backend-key-pair" pod "2ed4c211-c53b-438a-87ab-ddc36cd894a9" (UID: "2ed4c211-c53b-438a-87ab-ddc36cd894a9"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 14 00:37:26.715581 kubelet[2695]: I0314 00:37:26.715341 2695 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5kfqv\" (UniqueName: \"kubernetes.io/projected/2ed4c211-c53b-438a-87ab-ddc36cd894a9-kube-api-access-5kfqv\") on node \"srv-zkxct.gb1.brightbox.com\" DevicePath \"\"" Mar 14 00:37:26.715581 kubelet[2695]: I0314 00:37:26.715396 2695 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-nginx-config\") on node \"srv-zkxct.gb1.brightbox.com\" DevicePath \"\"" Mar 14 00:37:26.715581 kubelet[2695]: I0314 00:37:26.715415 2695 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-backend-key-pair\") on node \"srv-zkxct.gb1.brightbox.com\" DevicePath \"\"" Mar 14 00:37:26.715581 kubelet[2695]: I0314 00:37:26.715430 2695 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2ed4c211-c53b-438a-87ab-ddc36cd894a9-whisker-ca-bundle\") on node \"srv-zkxct.gb1.brightbox.com\" DevicePath \"\"" Mar 14 00:37:26.757384 systemd-networkd[1436]: calieb565f66cf3: Link UP Mar 14 00:37:26.758887 systemd-networkd[1436]: calieb565f66cf3: Gained carrier Mar 14 00:37:26.787174 systemd[1]: Removed slice kubepods-besteffort-pod2ed4c211_c53b_438a_87ab_ddc36cd894a9.slice - libcontainer container kubepods-besteffort-pod2ed4c211_c53b_438a_87ab_ddc36cd894a9.slice. Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.179 [ERROR][4007] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.284 [INFO][4007] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0 csi-node-driver- calico-system 85de6037-ff24-4a8c-95b3-a77667104b2e 938 0 2026-03-14 00:36:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com csi-node-driver-psbtg eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calieb565f66cf3 [] [] }} ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.284 [INFO][4007] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" 
Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.553 [INFO][4033] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" HandleID="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.587 [INFO][4033] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" HandleID="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000365c10), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"csi-node-driver-psbtg", "timestamp":"2026-03-14 00:37:26.553834441 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003a22c0)} Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.587 [INFO][4033] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.587 [INFO][4033] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.587 [INFO][4033] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.611 [INFO][4033] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.638 [INFO][4033] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.654 [INFO][4033] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.660 [INFO][4033] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.667 [INFO][4033] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.667 [INFO][4033] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.672 [INFO][4033] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.683 [INFO][4033] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.700 [INFO][4033] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.1/26] block=192.168.26.0/26 handle="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.700 [INFO][4033] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.1/26] handle="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.700 [INFO][4033] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:26.837998 containerd[1504]: 2026-03-14 00:37:26.700 [INFO][4033] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.1/26] IPv6=[] ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" HandleID="k8s-pod-network.dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.841960 containerd[1504]: 2026-03-14 00:37:26.711 [INFO][4007] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"85de6037-ff24-4a8c-95b3-a77667104b2e", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-psbtg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.26.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb565f66cf3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:26.841960 containerd[1504]: 2026-03-14 00:37:26.712 [INFO][4007] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.1/32] ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.841960 containerd[1504]: 2026-03-14 00:37:26.712 [INFO][4007] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb565f66cf3 ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.841960 containerd[1504]: 2026-03-14 00:37:26.741 [INFO][4007] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 
14 00:37:26.841960 containerd[1504]: 2026-03-14 00:37:26.742 [INFO][4007] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"85de6037-ff24-4a8c-95b3-a77667104b2e", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b", Pod:"csi-node-driver-psbtg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.26.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb565f66cf3", MAC:"6a:45:9b:7c:37:f6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:26.841960 
containerd[1504]: 2026-03-14 00:37:26.829 [INFO][4007] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b" Namespace="calico-system" Pod="csi-node-driver-psbtg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:26.995317 systemd-networkd[1436]: cali7a2a8d1e826: Link UP Mar 14 00:37:26.999342 systemd-networkd[1436]: cali7a2a8d1e826: Gained carrier Mar 14 00:37:27.075994 systemd[1]: run-netns-cni\x2d464480f3\x2dbb9a\x2d7ad4\x2dd2a1\x2dd18fae268ea6.mount: Deactivated successfully. Mar 14 00:37:27.076196 systemd[1]: run-netns-cni\x2d7484aa7c\x2d3844\x2d34e8\x2dcbf0\x2dec29de5dead4.mount: Deactivated successfully. Mar 14 00:37:27.076307 systemd[1]: run-netns-cni\x2de538fddc\x2d1dd9\x2d2dc0\x2dc88a\x2de6d4e0552f6b.mount: Deactivated successfully. Mar 14 00:37:27.076416 systemd[1]: var-lib-kubelet-pods-2ed4c211\x2dc53b\x2d438a\x2d87ab\x2dddc36cd894a9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5kfqv.mount: Deactivated successfully. Mar 14 00:37:27.077662 systemd[1]: var-lib-kubelet-pods-2ed4c211\x2dc53b\x2d438a\x2d87ab\x2dddc36cd894a9-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.405 [ERROR][3991] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.441 [INFO][3991] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0 goldmane-9f7667bb8- calico-system c2f12561-b5d4-4d53-afc2-18ef12c655af 935 0 2026-03-14 00:36:57 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com goldmane-9f7667bb8-n69fb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7a2a8d1e826 [] [] }} ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.441 [INFO][3991] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.822 [INFO][4078] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" HandleID="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.094712 containerd[1504]: 
2026-03-14 00:37:26.872 [INFO][4078] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" HandleID="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003d9830), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"goldmane-9f7667bb8-n69fb", "timestamp":"2026-03-14 00:37:26.822030166 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000343a20)} Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.874 [INFO][4078] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.874 [INFO][4078] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.874 [INFO][4078] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.882 [INFO][4078] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.897 [INFO][4078] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.909 [INFO][4078] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.913 [INFO][4078] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.922 [INFO][4078] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.923 [INFO][4078] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.934 [INFO][4078] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.951 [INFO][4078] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.967 [INFO][4078] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.2/26] block=192.168.26.0/26 handle="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.968 [INFO][4078] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.2/26] handle="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.968 [INFO][4078] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:27.094712 containerd[1504]: 2026-03-14 00:37:26.968 [INFO][4078] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.2/26] IPv6=[] ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" HandleID="k8s-pod-network.ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.095804 containerd[1504]: 2026-03-14 00:37:26.975 [INFO][3991] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"c2f12561-b5d4-4d53-afc2-18ef12c655af", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-9f7667bb8-n69fb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.26.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a2a8d1e826", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.095804 containerd[1504]: 2026-03-14 00:37:26.975 [INFO][3991] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.2/32] ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.095804 containerd[1504]: 2026-03-14 00:37:26.976 [INFO][3991] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a2a8d1e826 ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.095804 containerd[1504]: 2026-03-14 00:37:27.000 [INFO][3991] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.095804 containerd[1504]: 2026-03-14 00:37:27.033 [INFO][3991] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"c2f12561-b5d4-4d53-afc2-18ef12c655af", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f", Pod:"goldmane-9f7667bb8-n69fb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.26.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a2a8d1e826", MAC:"6a:0a:3f:79:eb:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.095804 containerd[1504]: 2026-03-14 00:37:27.073 [INFO][3991] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f" Namespace="calico-system" Pod="goldmane-9f7667bb8-n69fb" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:27.158876 systemd-networkd[1436]: cali5e05db2c5c5: Link UP Mar 14 00:37:27.160896 systemd-networkd[1436]: cali5e05db2c5c5: Gained carrier Mar 14 00:37:27.184404 containerd[1504]: time="2026-03-14T00:37:27.183879501Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:27.184404 containerd[1504]: time="2026-03-14T00:37:27.184002549Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:27.184404 containerd[1504]: time="2026-03-14T00:37:27.184025538Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.184404 containerd[1504]: time="2026-03-14T00:37:27.184233282Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.213406 containerd[1504]: time="2026-03-14T00:37:27.212046833Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:27.213406 containerd[1504]: time="2026-03-14T00:37:27.212155461Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:27.213406 containerd[1504]: time="2026-03-14T00:37:27.212191545Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.213733 containerd[1504]: time="2026-03-14T00:37:27.212352056Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:26.493 [ERROR][4025] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:26.546 [INFO][4025] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0 coredns-7d764666f9- kube-system 0c8a2ac7-fa2e-44fe-9348-bf4cf1020019 936 0 2026-03-14 00:36:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com coredns-7d764666f9-zs4lg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5e05db2c5c5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:26.548 [INFO][4025] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.010 [INFO][4098] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" 
HandleID="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.050 [INFO][4098] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" HandleID="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fea0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"coredns-7d764666f9-zs4lg", "timestamp":"2026-03-14 00:37:27.010295974 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000f0dc0)} Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.050 [INFO][4098] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.051 [INFO][4098] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.051 [INFO][4098] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.067 [INFO][4098] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.088 [INFO][4098] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.097 [INFO][4098] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.101 [INFO][4098] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.105 [INFO][4098] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.105 [INFO][4098] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.112 [INFO][4098] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.125 [INFO][4098] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.139 [INFO][4098] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.3/26] block=192.168.26.0/26 handle="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.139 [INFO][4098] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.3/26] handle="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.139 [INFO][4098] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:27.248718 containerd[1504]: 2026-03-14 00:37:27.139 [INFO][4098] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.3/26] IPv6=[] ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" HandleID="k8s-pod-network.31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.257097 containerd[1504]: 2026-03-14 00:37:27.149 [INFO][4025] cni-plugin/k8s.go 418: Populated endpoint ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"0c8a2ac7-fa2e-44fe-9348-bf4cf1020019", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7d764666f9-zs4lg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e05db2c5c5", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.257097 containerd[1504]: 2026-03-14 00:37:27.150 [INFO][4025] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.3/32] ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.257097 containerd[1504]: 2026-03-14 00:37:27.150 [INFO][4025] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e05db2c5c5 
ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.257097 containerd[1504]: 2026-03-14 00:37:27.162 [INFO][4025] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.257097 containerd[1504]: 2026-03-14 00:37:27.165 [INFO][4025] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"0c8a2ac7-fa2e-44fe-9348-bf4cf1020019", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", 
ContainerID:"31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e", Pod:"coredns-7d764666f9-zs4lg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e05db2c5c5", MAC:"76:e1:69:ed:44:96", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.259345 containerd[1504]: 2026-03-14 00:37:27.218 [INFO][4025] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e" Namespace="kube-system" Pod="coredns-7d764666f9-zs4lg" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:27.258806 systemd[1]: Started cri-containerd-dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b.scope - libcontainer container dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b. 
Mar 14 00:37:27.360608 systemd[1]: Started cri-containerd-ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f.scope - libcontainer container ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f. Mar 14 00:37:27.371409 systemd-networkd[1436]: cali3d7db044870: Link UP Mar 14 00:37:27.373607 systemd-networkd[1436]: cali3d7db044870: Gained carrier Mar 14 00:37:27.423506 containerd[1504]: time="2026-03-14T00:37:27.421268641Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:27.423506 containerd[1504]: time="2026-03-14T00:37:27.421378270Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:27.423506 containerd[1504]: time="2026-03-14T00:37:27.421693886Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.423506 containerd[1504]: time="2026-03-14T00:37:27.421871509Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:26.602 [ERROR][4054] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:26.684 [INFO][4054] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0 calico-apiserver-689dfd58b9- calico-system f4805231-80ff-447c-a875-0751f3c2712b 937 0 2026-03-14 00:36:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:689dfd58b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com calico-apiserver-689dfd58b9-glmpr eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali3d7db044870 [] [] }} ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:26.686 [INFO][4054] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:26.990 [INFO][4128] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" 
HandleID="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.044 [INFO][4128] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" HandleID="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277d80), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"calico-apiserver-689dfd58b9-glmpr", "timestamp":"2026-03-14 00:37:26.990493253 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004b2dc0)} Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.057 [INFO][4128] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.139 [INFO][4128] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.140 [INFO][4128] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.160 [INFO][4128] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.190 [INFO][4128] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.245 [INFO][4128] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.261 [INFO][4128] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.273 [INFO][4128] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.273 [INFO][4128] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.287 [INFO][4128] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728 Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.302 [INFO][4128] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.324 [INFO][4128] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.4/26] block=192.168.26.0/26 handle="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.325 [INFO][4128] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.4/26] handle="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.326 [INFO][4128] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:27.424480 containerd[1504]: 2026-03-14 00:37:27.326 [INFO][4128] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.4/26] IPv6=[] ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" HandleID="k8s-pod-network.94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.426273 containerd[1504]: 2026-03-14 00:37:27.346 [INFO][4054] cni-plugin/k8s.go 418: Populated endpoint ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"f4805231-80ff-447c-a875-0751f3c2712b", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-689dfd58b9-glmpr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3d7db044870", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.426273 containerd[1504]: 2026-03-14 00:37:27.346 [INFO][4054] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.4/32] ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.426273 containerd[1504]: 2026-03-14 00:37:27.346 [INFO][4054] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d7db044870 ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.426273 containerd[1504]: 2026-03-14 00:37:27.375 [INFO][4054] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" 
Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.426273 containerd[1504]: 2026-03-14 00:37:27.378 [INFO][4054] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"f4805231-80ff-447c-a875-0751f3c2712b", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728", Pod:"calico-apiserver-689dfd58b9-glmpr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3d7db044870", 
MAC:"92:29:39:95:07:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.426273 containerd[1504]: 2026-03-14 00:37:27.413 [INFO][4054] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-glmpr" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:27.487986 systemd[1]: Started cri-containerd-31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e.scope - libcontainer container 31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e. Mar 14 00:37:27.502224 systemd[1]: Created slice kubepods-besteffort-pod2e0baf4e_1c7d_44d3_bda2_e8128927a0e7.slice - libcontainer container kubepods-besteffort-pod2e0baf4e_1c7d_44d3_bda2_e8128927a0e7.slice. Mar 14 00:37:27.516645 containerd[1504]: time="2026-03-14T00:37:27.515776799Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:27.516645 containerd[1504]: time="2026-03-14T00:37:27.515966198Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:27.516645 containerd[1504]: time="2026-03-14T00:37:27.515995140Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.517055 containerd[1504]: time="2026-03-14T00:37:27.516943605Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.542816 systemd-networkd[1436]: cali1bee082f6aa: Link UP Mar 14 00:37:27.547301 systemd-networkd[1436]: cali1bee082f6aa: Gained carrier Mar 14 00:37:27.592673 systemd[1]: Started cri-containerd-94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728.scope - libcontainer container 94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728. Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:26.503 [ERROR][4037] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:26.582 [INFO][4037] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0 calico-kube-controllers-5f9c794d4f- calico-system 6ea7102e-c24f-4833-a40a-4027abd4e9fc 940 0 2026-03-14 00:36:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5f9c794d4f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com calico-kube-controllers-5f9c794d4f-2qjv6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1bee082f6aa [] [] }} ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:26.583 [INFO][4037] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.012 [INFO][4105] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" HandleID="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.075 [INFO][4105] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" HandleID="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123eb0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"calico-kube-controllers-5f9c794d4f-2qjv6", "timestamp":"2026-03-14 00:37:27.01200193 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00048f080)} Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.075 [INFO][4105] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.326 [INFO][4105] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.327 [INFO][4105] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.336 [INFO][4105] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.355 [INFO][4105] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.392 [INFO][4105] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.399 [INFO][4105] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.417 [INFO][4105] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.418 [INFO][4105] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.434 [INFO][4105] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06 Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.464 [INFO][4105] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.526 [INFO][4105] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.5/26] block=192.168.26.0/26 handle="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.527 [INFO][4105] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.5/26] handle="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.527 [INFO][4105] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:27.600347 containerd[1504]: 2026-03-14 00:37:27.527 [INFO][4105] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.5/26] IPv6=[] ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" HandleID="k8s-pod-network.fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.602968 containerd[1504]: 2026-03-14 00:37:27.535 [INFO][4037] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0", GenerateName:"calico-kube-controllers-5f9c794d4f-", Namespace:"calico-system", SelfLink:"", UID:"6ea7102e-c24f-4833-a40a-4027abd4e9fc", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9c794d4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-5f9c794d4f-2qjv6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.26.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1bee082f6aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.602968 containerd[1504]: 2026-03-14 00:37:27.536 [INFO][4037] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.5/32] ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.602968 containerd[1504]: 2026-03-14 00:37:27.536 [INFO][4037] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1bee082f6aa ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.602968 containerd[1504]: 2026-03-14 00:37:27.548 [INFO][4037] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.602968 containerd[1504]: 2026-03-14 00:37:27.552 [INFO][4037] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0", GenerateName:"calico-kube-controllers-5f9c794d4f-", Namespace:"calico-system", SelfLink:"", UID:"6ea7102e-c24f-4833-a40a-4027abd4e9fc", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9c794d4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06", Pod:"calico-kube-controllers-5f9c794d4f-2qjv6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.26.5/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1bee082f6aa", MAC:"a6:fd:05:20:dd:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.602968 containerd[1504]: 2026-03-14 00:37:27.593 [INFO][4037] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06" Namespace="calico-system" Pod="calico-kube-controllers-5f9c794d4f-2qjv6" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:27.625189 kubelet[2695]: I0314 00:37:27.625139 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmh52\" (UniqueName: \"kubernetes.io/projected/2e0baf4e-1c7d-44d3-bda2-e8128927a0e7-kube-api-access-dmh52\") pod \"whisker-b56c87cb9-82slf\" (UID: \"2e0baf4e-1c7d-44d3-bda2-e8128927a0e7\") " pod="calico-system/whisker-b56c87cb9-82slf" Mar 14 00:37:27.625189 kubelet[2695]: I0314 00:37:27.625203 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/2e0baf4e-1c7d-44d3-bda2-e8128927a0e7-nginx-config\") pod \"whisker-b56c87cb9-82slf\" (UID: \"2e0baf4e-1c7d-44d3-bda2-e8128927a0e7\") " pod="calico-system/whisker-b56c87cb9-82slf" Mar 14 00:37:27.627723 kubelet[2695]: I0314 00:37:27.626504 2695 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e0baf4e-1c7d-44d3-bda2-e8128927a0e7-whisker-ca-bundle\") pod \"whisker-b56c87cb9-82slf\" (UID: \"2e0baf4e-1c7d-44d3-bda2-e8128927a0e7\") " pod="calico-system/whisker-b56c87cb9-82slf" Mar 14 00:37:27.627723 kubelet[2695]: I0314 00:37:27.627429 2695 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2e0baf4e-1c7d-44d3-bda2-e8128927a0e7-whisker-backend-key-pair\") pod \"whisker-b56c87cb9-82slf\" (UID: \"2e0baf4e-1c7d-44d3-bda2-e8128927a0e7\") " pod="calico-system/whisker-b56c87cb9-82slf" Mar 14 00:37:27.668732 containerd[1504]: time="2026-03-14T00:37:27.667117937Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:27.670212 containerd[1504]: time="2026-03-14T00:37:27.669737657Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:27.671387 containerd[1504]: time="2026-03-14T00:37:27.669993387Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.673624 systemd-networkd[1436]: califcef44ce592: Link UP Mar 14 00:37:27.674989 systemd-networkd[1436]: califcef44ce592: Gained carrier Mar 14 00:37:27.681038 containerd[1504]: time="2026-03-14T00:37:27.673342650Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:26.847 [ERROR][4081] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:26.916 [INFO][4081] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0 coredns-7d764666f9- kube-system 08b4124a-e53b-471c-876a-69a445df63cd 941 0 2026-03-14 00:36:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com coredns-7d764666f9-5sb67 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califcef44ce592 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:26.917 [INFO][4081] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.200 [INFO][4156] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" 
HandleID="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.259 [INFO][4156] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" HandleID="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000100b60), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"coredns-7d764666f9-5sb67", "timestamp":"2026-03-14 00:37:27.20074 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00037a000)} Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.259 [INFO][4156] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.530 [INFO][4156] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.530 [INFO][4156] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.555 [INFO][4156] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.595 [INFO][4156] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.610 [INFO][4156] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.618 [INFO][4156] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.625 [INFO][4156] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.625 [INFO][4156] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.630 [INFO][4156] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026 Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.642 [INFO][4156] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.654 [INFO][4156] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.6/26] block=192.168.26.0/26 handle="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.654 [INFO][4156] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.6/26] handle="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.654 [INFO][4156] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:27.751335 containerd[1504]: 2026-03-14 00:37:27.654 [INFO][4156] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.6/26] IPv6=[] ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" HandleID="k8s-pod-network.345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.754692 containerd[1504]: 2026-03-14 00:37:27.663 [INFO][4081] cni-plugin/k8s.go 418: Populated endpoint ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"08b4124a-e53b-471c-876a-69a445df63cd", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7d764666f9-5sb67", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califcef44ce592", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.754692 containerd[1504]: 2026-03-14 00:37:27.664 [INFO][4081] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.6/32] ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.754692 containerd[1504]: 2026-03-14 00:37:27.664 [INFO][4081] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califcef44ce592 
ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.754692 containerd[1504]: 2026-03-14 00:37:27.681 [INFO][4081] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.754692 containerd[1504]: 2026-03-14 00:37:27.687 [INFO][4081] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"08b4124a-e53b-471c-876a-69a445df63cd", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", 
ContainerID:"345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026", Pod:"coredns-7d764666f9-5sb67", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califcef44ce592", MAC:"f6:61:7d:8c:21:ae", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.755110 containerd[1504]: 2026-03-14 00:37:27.732 [INFO][4081] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026" Namespace="kube-system" Pod="coredns-7d764666f9-5sb67" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:27.772133 systemd[1]: Started cri-containerd-fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06.scope - libcontainer container fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06. 
Mar 14 00:37:27.820764 containerd[1504]: time="2026-03-14T00:37:27.820292754Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b56c87cb9-82slf,Uid:2e0baf4e-1c7d-44d3-bda2-e8128927a0e7,Namespace:calico-system,Attempt:0,}" Mar 14 00:37:27.820751 systemd-networkd[1436]: caliddc623e6622: Link UP Mar 14 00:37:27.821132 systemd-networkd[1436]: caliddc623e6622: Gained carrier Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:26.735 [ERROR][4067] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:26.892 [INFO][4067] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0 calico-apiserver-689dfd58b9- calico-system 2820677f-9f42-42f1-ade4-b7694612f299 939 0 2026-03-14 00:36:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:689dfd58b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com calico-apiserver-689dfd58b9-98c4z eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] caliddc623e6622 [] [] }} ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:26.892 [INFO][4067] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" 
WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.237 [INFO][4152] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" HandleID="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.318 [INFO][4152] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" HandleID="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004141e0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"calico-apiserver-689dfd58b9-98c4z", "timestamp":"2026-03-14 00:37:27.237863695 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000e8420)} Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.319 [INFO][4152] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.655 [INFO][4152] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.658 [INFO][4152] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.663 [INFO][4152] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.706 [INFO][4152] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.720 [INFO][4152] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.725 [INFO][4152] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.736 [INFO][4152] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.739 [INFO][4152] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.744 [INFO][4152] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.756 [INFO][4152] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.798 [INFO][4152] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.7/26] block=192.168.26.0/26 handle="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.798 [INFO][4152] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.7/26] handle="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.798 [INFO][4152] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:27.875441 containerd[1504]: 2026-03-14 00:37:27.798 [INFO][4152] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.7/26] IPv6=[] ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" HandleID="k8s-pod-network.503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.880117 containerd[1504]: 2026-03-14 00:37:27.807 [INFO][4067] cni-plugin/k8s.go 418: Populated endpoint ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"2820677f-9f42-42f1-ade4-b7694612f299", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-689dfd58b9-98c4z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliddc623e6622", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.880117 containerd[1504]: 2026-03-14 00:37:27.808 [INFO][4067] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.7/32] ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.880117 containerd[1504]: 2026-03-14 00:37:27.808 [INFO][4067] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliddc623e6622 ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.880117 containerd[1504]: 2026-03-14 00:37:27.823 [INFO][4067] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" 
Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.880117 containerd[1504]: 2026-03-14 00:37:27.825 [INFO][4067] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"2820677f-9f42-42f1-ade4-b7694612f299", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad", Pod:"calico-apiserver-689dfd58b9-98c4z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliddc623e6622", 
MAC:"f2:32:2b:3c:08:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:27.880117 containerd[1504]: 2026-03-14 00:37:27.842 [INFO][4067] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad" Namespace="calico-system" Pod="calico-apiserver-689dfd58b9-98c4z" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:27.962690 containerd[1504]: time="2026-03-14T00:37:27.959966635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zs4lg,Uid:0c8a2ac7-fa2e-44fe-9348-bf4cf1020019,Namespace:kube-system,Attempt:1,} returns sandbox id \"31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e\"" Mar 14 00:37:28.037153 containerd[1504]: time="2026-03-14T00:37:28.037090994Z" level=info msg="CreateContainer within sandbox \"31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 14 00:37:28.045579 containerd[1504]: time="2026-03-14T00:37:28.043929258Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:28.045579 containerd[1504]: time="2026-03-14T00:37:28.044020144Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:28.045579 containerd[1504]: time="2026-03-14T00:37:28.044049839Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:28.045579 containerd[1504]: time="2026-03-14T00:37:28.044283462Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:28.081941 containerd[1504]: time="2026-03-14T00:37:28.081872296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f9c794d4f-2qjv6,Uid:6ea7102e-c24f-4833-a40a-4027abd4e9fc,Namespace:calico-system,Attempt:1,} returns sandbox id \"fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06\"" Mar 14 00:37:28.082387 containerd[1504]: time="2026-03-14T00:37:28.082353006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psbtg,Uid:85de6037-ff24-4a8c-95b3-a77667104b2e,Namespace:calico-system,Attempt:1,} returns sandbox id \"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b\"" Mar 14 00:37:28.132962 containerd[1504]: time="2026-03-14T00:37:28.132907955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 14 00:37:28.158717 systemd[1]: Started cri-containerd-345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026.scope - libcontainer container 345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026. Mar 14 00:37:28.211339 containerd[1504]: time="2026-03-14T00:37:28.209701889Z" level=info msg="CreateContainer within sandbox \"31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9a644f838e2f14520b2565ae070e2f02205f5c5ae4ac9741f641a0912ae40d47\"" Mar 14 00:37:28.214152 containerd[1504]: time="2026-03-14T00:37:28.213602618Z" level=info msg="StartContainer for \"9a644f838e2f14520b2565ae070e2f02205f5c5ae4ac9741f641a0912ae40d47\"" Mar 14 00:37:28.233413 containerd[1504]: time="2026-03-14T00:37:28.229462010Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:28.233413 containerd[1504]: time="2026-03-14T00:37:28.231174396Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:28.233413 containerd[1504]: time="2026-03-14T00:37:28.231196441Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:28.233413 containerd[1504]: time="2026-03-14T00:37:28.231394738Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:28.260930 systemd-networkd[1436]: calieb565f66cf3: Gained IPv6LL Mar 14 00:37:28.358691 systemd[1]: Started cri-containerd-9a644f838e2f14520b2565ae070e2f02205f5c5ae4ac9741f641a0912ae40d47.scope - libcontainer container 9a644f838e2f14520b2565ae070e2f02205f5c5ae4ac9741f641a0912ae40d47. Mar 14 00:37:28.433939 containerd[1504]: time="2026-03-14T00:37:28.433258055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-glmpr,Uid:f4805231-80ff-447c-a875-0751f3c2712b,Namespace:calico-system,Attempt:1,} returns sandbox id \"94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728\"" Mar 14 00:37:28.434770 systemd[1]: Started cri-containerd-503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad.scope - libcontainer container 503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad. 
Mar 14 00:37:28.509537 containerd[1504]: time="2026-03-14T00:37:28.508208067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-n69fb,Uid:c2f12561-b5d4-4d53-afc2-18ef12c655af,Namespace:calico-system,Attempt:1,} returns sandbox id \"ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f\"" Mar 14 00:37:28.525480 containerd[1504]: time="2026-03-14T00:37:28.523336737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-5sb67,Uid:08b4124a-e53b-471c-876a-69a445df63cd,Namespace:kube-system,Attempt:1,} returns sandbox id \"345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026\"" Mar 14 00:37:28.597278 containerd[1504]: time="2026-03-14T00:37:28.596975707Z" level=info msg="CreateContainer within sandbox \"345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 14 00:37:28.642175 containerd[1504]: time="2026-03-14T00:37:28.641662694Z" level=info msg="StartContainer for \"9a644f838e2f14520b2565ae070e2f02205f5c5ae4ac9741f641a0912ae40d47\" returns successfully" Mar 14 00:37:28.644735 systemd-networkd[1436]: cali5e05db2c5c5: Gained IPv6LL Mar 14 00:37:28.666985 kubelet[2695]: I0314 00:37:28.666631 2695 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="2ed4c211-c53b-438a-87ab-ddc36cd894a9" path="/var/lib/kubelet/pods/2ed4c211-c53b-438a-87ab-ddc36cd894a9/volumes" Mar 14 00:37:28.668984 containerd[1504]: time="2026-03-14T00:37:28.668760225Z" level=info msg="CreateContainer within sandbox \"345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1e3a87b41fb99b7e74136646353d779076f006576139ed514771b932e085192e\"" Mar 14 00:37:28.688154 containerd[1504]: time="2026-03-14T00:37:28.687713199Z" level=info msg="StartContainer for \"1e3a87b41fb99b7e74136646353d779076f006576139ed514771b932e085192e\"" Mar 14 00:37:28.733045 containerd[1504]: 
time="2026-03-14T00:37:28.732942635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689dfd58b9-98c4z,Uid:2820677f-9f42-42f1-ade4-b7694612f299,Namespace:calico-system,Attempt:1,} returns sandbox id \"503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad\"" Mar 14 00:37:28.807844 systemd-networkd[1436]: cali904cdce9a8f: Link UP Mar 14 00:37:28.810249 systemd-networkd[1436]: cali904cdce9a8f: Gained carrier Mar 14 00:37:28.837124 systemd-networkd[1436]: cali7a2a8d1e826: Gained IPv6LL Mar 14 00:37:28.842682 systemd[1]: Started cri-containerd-1e3a87b41fb99b7e74136646353d779076f006576139ed514771b932e085192e.scope - libcontainer container 1e3a87b41fb99b7e74136646353d779076f006576139ed514771b932e085192e. Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.178 [ERROR][4446] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.243 [INFO][4446] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0 whisker-b56c87cb9- calico-system 2e0baf4e-1c7d-44d3-bda2-e8128927a0e7 978 0 2026-03-14 00:37:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:b56c87cb9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-zkxct.gb1.brightbox.com whisker-b56c87cb9-82slf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali904cdce9a8f [] [] }} ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 
00:37:28.243 [INFO][4446] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.627 [INFO][4544] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" HandleID="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.683 [INFO][4544] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" HandleID="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00036e6e0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zkxct.gb1.brightbox.com", "pod":"whisker-b56c87cb9-82slf", "timestamp":"2026-03-14 00:37:28.627174354 +0000 UTC"}, Hostname:"srv-zkxct.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002c0160)} Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.683 [INFO][4544] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.687 [INFO][4544] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.688 [INFO][4544] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zkxct.gb1.brightbox.com' Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.707 [INFO][4544] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.720 [INFO][4544] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.738 [INFO][4544] ipam/ipam.go 526: Trying affinity for 192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.744 [INFO][4544] ipam/ipam.go 160: Attempting to load block cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.749 [INFO][4544] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.26.0/26 host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.749 [INFO][4544] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.26.0/26 handle="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.754 [INFO][4544] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7 Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.768 [INFO][4544] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.26.0/26 handle="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.781 [INFO][4544] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.26.8/26] block=192.168.26.0/26 handle="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.781 [INFO][4544] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.26.8/26] handle="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" host="srv-zkxct.gb1.brightbox.com" Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.781 [INFO][4544] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:28.881094 containerd[1504]: 2026-03-14 00:37:28.781 [INFO][4544] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.26.8/26] IPv6=[] ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" HandleID="k8s-pod-network.4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.885425 containerd[1504]: 2026-03-14 00:37:28.792 [INFO][4446] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0", GenerateName:"whisker-b56c87cb9-", Namespace:"calico-system", SelfLink:"", UID:"2e0baf4e-1c7d-44d3-bda2-e8128927a0e7", ResourceVersion:"978", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 37, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b56c87cb9", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"", Pod:"whisker-b56c87cb9-82slf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.26.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali904cdce9a8f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:28.885425 containerd[1504]: 2026-03-14 00:37:28.792 [INFO][4446] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.26.8/32] ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.885425 containerd[1504]: 2026-03-14 00:37:28.792 [INFO][4446] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali904cdce9a8f ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.885425 containerd[1504]: 2026-03-14 00:37:28.811 [INFO][4446] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.885425 containerd[1504]: 2026-03-14 00:37:28.818 [INFO][4446] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0", GenerateName:"whisker-b56c87cb9-", Namespace:"calico-system", SelfLink:"", UID:"2e0baf4e-1c7d-44d3-bda2-e8128927a0e7", ResourceVersion:"978", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 37, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"b56c87cb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7", Pod:"whisker-b56c87cb9-82slf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.26.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali904cdce9a8f", MAC:"7a:d0:e2:64:6a:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:28.885425 containerd[1504]: 2026-03-14 00:37:28.873 [INFO][4446] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7" 
Namespace="calico-system" Pod="whisker-b56c87cb9-82slf" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--b56c87cb9--82slf-eth0" Mar 14 00:37:28.942937 containerd[1504]: time="2026-03-14T00:37:28.940859171Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 14 00:37:28.942937 containerd[1504]: time="2026-03-14T00:37:28.940951439Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 14 00:37:28.942937 containerd[1504]: time="2026-03-14T00:37:28.940974843Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:28.942937 containerd[1504]: time="2026-03-14T00:37:28.941923432Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 14 00:37:29.002677 systemd[1]: Started cri-containerd-4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7.scope - libcontainer container 4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7. Mar 14 00:37:29.052875 containerd[1504]: time="2026-03-14T00:37:29.052777282Z" level=info msg="StartContainer for \"1e3a87b41fb99b7e74136646353d779076f006576139ed514771b932e085192e\" returns successfully" Mar 14 00:37:29.064438 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2790729487.mount: Deactivated successfully. 
Mar 14 00:37:29.156640 systemd-networkd[1436]: caliddc623e6622: Gained IPv6LL Mar 14 00:37:29.157154 systemd-networkd[1436]: cali3d7db044870: Gained IPv6LL Mar 14 00:37:29.350003 systemd-networkd[1436]: cali1bee082f6aa: Gained IPv6LL Mar 14 00:37:29.397370 kubelet[2695]: I0314 00:37:29.396885 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-zs4lg" podStartSLOduration=44.39686651 podStartE2EDuration="44.39686651s" podCreationTimestamp="2026-03-14 00:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 00:37:29.395280207 +0000 UTC m=+51.008852155" watchObservedRunningTime="2026-03-14 00:37:29.39686651 +0000 UTC m=+51.010438467" Mar 14 00:37:29.420860 kubelet[2695]: I0314 00:37:29.420357 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-5sb67" podStartSLOduration=44.420338897 podStartE2EDuration="44.420338897s" podCreationTimestamp="2026-03-14 00:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-14 00:37:29.42000203 +0000 UTC m=+51.033573978" watchObservedRunningTime="2026-03-14 00:37:29.420338897 +0000 UTC m=+51.033910844" Mar 14 00:37:29.516949 containerd[1504]: time="2026-03-14T00:37:29.516384714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-b56c87cb9-82slf,Uid:2e0baf4e-1c7d-44d3-bda2-e8128927a0e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7\"" Mar 14 00:37:29.732773 systemd-networkd[1436]: califcef44ce592: Gained IPv6LL Mar 14 00:37:30.022812 kernel: calico-node[4462]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 14 00:37:30.630777 systemd-networkd[1436]: cali904cdce9a8f: Gained IPv6LL Mar 14 00:37:31.151623 systemd-networkd[1436]: 
vxlan.calico: Link UP Mar 14 00:37:31.151638 systemd-networkd[1436]: vxlan.calico: Gained carrier Mar 14 00:37:32.422415 systemd-networkd[1436]: vxlan.calico: Gained IPv6LL Mar 14 00:37:33.127989 containerd[1504]: time="2026-03-14T00:37:33.127728716Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 14 00:37:33.131272 containerd[1504]: time="2026-03-14T00:37:33.131231212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:33.141837 containerd[1504]: time="2026-03-14T00:37:33.141780761Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:33.143680 containerd[1504]: time="2026-03-14T00:37:33.143644039Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 5.010489519s" Mar 14 00:37:33.143821 containerd[1504]: time="2026-03-14T00:37:33.143783018Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 14 00:37:33.144383 containerd[1504]: time="2026-03-14T00:37:33.144325805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:33.156819 containerd[1504]: time="2026-03-14T00:37:33.156772879Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 14 00:37:33.198982 containerd[1504]: time="2026-03-14T00:37:33.198821231Z" level=info msg="CreateContainer within sandbox \"fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 14 00:37:33.255753 containerd[1504]: time="2026-03-14T00:37:33.255548557Z" level=info msg="CreateContainer within sandbox \"fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a\"" Mar 14 00:37:33.256804 containerd[1504]: time="2026-03-14T00:37:33.256765058Z" level=info msg="StartContainer for \"e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a\"" Mar 14 00:37:33.412951 systemd[1]: Started cri-containerd-e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a.scope - libcontainer container e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a. 
Mar 14 00:37:33.500600 containerd[1504]: time="2026-03-14T00:37:33.500542005Z" level=info msg="StartContainer for \"e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a\" returns successfully" Mar 14 00:37:34.482532 kubelet[2695]: I0314 00:37:34.482344 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5f9c794d4f-2qjv6" podStartSLOduration=31.418046807 podStartE2EDuration="36.482316329s" podCreationTimestamp="2026-03-14 00:36:58 +0000 UTC" firstStartedPulling="2026-03-14 00:37:28.089619439 +0000 UTC m=+49.703191380" lastFinishedPulling="2026-03-14 00:37:33.153888969 +0000 UTC m=+54.767460902" observedRunningTime="2026-03-14 00:37:34.481012716 +0000 UTC m=+56.094584679" watchObservedRunningTime="2026-03-14 00:37:34.482316329 +0000 UTC m=+56.095888283" Mar 14 00:37:34.940564 containerd[1504]: time="2026-03-14T00:37:34.940483770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:34.942017 containerd[1504]: time="2026-03-14T00:37:34.941834914Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 14 00:37:34.943498 containerd[1504]: time="2026-03-14T00:37:34.942768200Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:34.946372 containerd[1504]: time="2026-03-14T00:37:34.946299731Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:34.947649 containerd[1504]: time="2026-03-14T00:37:34.947544891Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id 
\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.79071562s" Mar 14 00:37:34.947649 containerd[1504]: time="2026-03-14T00:37:34.947607803Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 14 00:37:34.949837 containerd[1504]: time="2026-03-14T00:37:34.949773141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 14 00:37:34.956854 containerd[1504]: time="2026-03-14T00:37:34.956798526Z" level=info msg="CreateContainer within sandbox \"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 14 00:37:34.980262 containerd[1504]: time="2026-03-14T00:37:34.979601816Z" level=info msg="CreateContainer within sandbox \"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"770c16361b92b513c9335ec056912060c75f6db332906ccde6f4839415e632cd\"" Mar 14 00:37:34.980712 containerd[1504]: time="2026-03-14T00:37:34.980681146Z" level=info msg="StartContainer for \"770c16361b92b513c9335ec056912060c75f6db332906ccde6f4839415e632cd\"" Mar 14 00:37:35.049749 systemd[1]: Started cri-containerd-770c16361b92b513c9335ec056912060c75f6db332906ccde6f4839415e632cd.scope - libcontainer container 770c16361b92b513c9335ec056912060c75f6db332906ccde6f4839415e632cd. 
Mar 14 00:37:35.101029 containerd[1504]: time="2026-03-14T00:37:35.100977288Z" level=info msg="StartContainer for \"770c16361b92b513c9335ec056912060c75f6db332906ccde6f4839415e632cd\" returns successfully" Mar 14 00:37:38.919887 containerd[1504]: time="2026-03-14T00:37:38.919588007Z" level=info msg="StopPodSandbox for \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\"" Mar 14 00:37:39.259215 containerd[1504]: time="2026-03-14T00:37:39.259059483Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:39.265958 containerd[1504]: time="2026-03-14T00:37:39.265856620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 14 00:37:39.272421 containerd[1504]: time="2026-03-14T00:37:39.271922679Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:39.286648 containerd[1504]: time="2026-03-14T00:37:39.286573138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:39.288424 containerd[1504]: time="2026-03-14T00:37:39.288342946Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 4.338482992s" Mar 14 00:37:39.288424 containerd[1504]: time="2026-03-14T00:37:39.288397602Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference 
\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 14 00:37:39.331892 containerd[1504]: time="2026-03-14T00:37:39.331369175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 14 00:37:39.343657 containerd[1504]: time="2026-03-14T00:37:39.343554268Z" level=info msg="CreateContainer within sandbox \"94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 14 00:37:39.375418 containerd[1504]: time="2026-03-14T00:37:39.375293923Z" level=info msg="CreateContainer within sandbox \"94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5c26a4c9121d10ca4fed7490d935f0df93b567a541553bfc7a5b7bd767f42e62\"" Mar 14 00:37:39.384219 containerd[1504]: time="2026-03-14T00:37:39.380852707Z" level=info msg="StartContainer for \"5c26a4c9121d10ca4fed7490d935f0df93b567a541553bfc7a5b7bd767f42e62\"" Mar 14 00:37:39.383886 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2560337428.mount: Deactivated successfully. Mar 14 00:37:39.489385 systemd[1]: Started cri-containerd-5c26a4c9121d10ca4fed7490d935f0df93b567a541553bfc7a5b7bd767f42e62.scope - libcontainer container 5c26a4c9121d10ca4fed7490d935f0df93b567a541553bfc7a5b7bd767f42e62. Mar 14 00:37:39.620517 containerd[1504]: time="2026-03-14T00:37:39.620305611Z" level=info msg="StartContainer for \"5c26a4c9121d10ca4fed7490d935f0df93b567a541553bfc7a5b7bd767f42e62\" returns successfully" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.365 [WARNING][5018] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"c2f12561-b5d4-4d53-afc2-18ef12c655af", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f", Pod:"goldmane-9f7667bb8-n69fb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.26.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a2a8d1e826", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.369 [INFO][5018] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.370 [INFO][5018] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" iface="eth0" netns="" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.370 [INFO][5018] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.370 [INFO][5018] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.765 [INFO][5029] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.771 [INFO][5029] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.772 [INFO][5029] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.809 [WARNING][5029] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.809 [INFO][5029] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.811 [INFO][5029] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:39.820531 containerd[1504]: 2026-03-14 00:37:39.814 [INFO][5018] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:39.820531 containerd[1504]: time="2026-03-14T00:37:39.819115854Z" level=info msg="TearDown network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\" successfully" Mar 14 00:37:39.820531 containerd[1504]: time="2026-03-14T00:37:39.819150624Z" level=info msg="StopPodSandbox for \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\" returns successfully" Mar 14 00:37:39.980862 containerd[1504]: time="2026-03-14T00:37:39.980773855Z" level=info msg="RemovePodSandbox for \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\"" Mar 14 00:37:39.988529 containerd[1504]: time="2026-03-14T00:37:39.986842920Z" level=info msg="Forcibly stopping sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\"" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.056 [WARNING][5084] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"c2f12561-b5d4-4d53-afc2-18ef12c655af", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f", Pod:"goldmane-9f7667bb8-n69fb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.26.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a2a8d1e826", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.057 [INFO][5084] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.057 [INFO][5084] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" iface="eth0" netns="" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.057 [INFO][5084] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.057 [INFO][5084] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.096 [INFO][5092] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.096 [INFO][5092] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.097 [INFO][5092] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.108 [WARNING][5092] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.108 [INFO][5092] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" HandleID="k8s-pod-network.4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Workload="srv--zkxct.gb1.brightbox.com-k8s-goldmane--9f7667bb8--n69fb-eth0" Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.111 [INFO][5092] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:40.127430 containerd[1504]: 2026-03-14 00:37:40.119 [INFO][5084] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23" Mar 14 00:37:40.128398 containerd[1504]: time="2026-03-14T00:37:40.128297275Z" level=info msg="TearDown network for sandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\" successfully" Mar 14 00:37:40.169888 containerd[1504]: time="2026-03-14T00:37:40.169839426Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:40.193433 containerd[1504]: time="2026-03-14T00:37:40.193399741Z" level=info msg="RemovePodSandbox \"4d31733769b69d10c752e51c7573ba4c5c49d08f707fba5c0d0feedc87f7bc23\" returns successfully" Mar 14 00:37:40.194643 containerd[1504]: time="2026-03-14T00:37:40.194220755Z" level=info msg="StopPodSandbox for \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\"" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.281 [WARNING][5106] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"08b4124a-e53b-471c-876a-69a445df63cd", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026", Pod:"coredns-7d764666f9-5sb67", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califcef44ce592", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.282 [INFO][5106] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.282 [INFO][5106] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" iface="eth0" netns="" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.283 [INFO][5106] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.283 [INFO][5106] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.334 [INFO][5113] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.334 [INFO][5113] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.335 [INFO][5113] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.350 [WARNING][5113] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.351 [INFO][5113] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.353 [INFO][5113] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:40.359008 containerd[1504]: 2026-03-14 00:37:40.355 [INFO][5106] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.359008 containerd[1504]: time="2026-03-14T00:37:40.357977128Z" level=info msg="TearDown network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\" successfully" Mar 14 00:37:40.359008 containerd[1504]: time="2026-03-14T00:37:40.358016087Z" level=info msg="StopPodSandbox for \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\" returns successfully" Mar 14 00:37:40.360416 containerd[1504]: time="2026-03-14T00:37:40.359050942Z" level=info msg="RemovePodSandbox for \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\"" Mar 14 00:37:40.360416 containerd[1504]: time="2026-03-14T00:37:40.359279022Z" level=info msg="Forcibly stopping sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\"" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.437 [WARNING][5127] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"08b4124a-e53b-471c-876a-69a445df63cd", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"345bebd5041c0ae76c20e66225ddd669275df022de8f70cc341d0b5f0be67026", Pod:"coredns-7d764666f9-5sb67", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califcef44ce592", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.438 [INFO][5127] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.438 [INFO][5127] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" iface="eth0" netns="" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.438 [INFO][5127] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.438 [INFO][5127] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.495 [INFO][5135] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.495 [INFO][5135] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.495 [INFO][5135] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.511 [WARNING][5135] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.511 [INFO][5135] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" HandleID="k8s-pod-network.1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--5sb67-eth0" Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.515 [INFO][5135] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:40.522577 containerd[1504]: 2026-03-14 00:37:40.520 [INFO][5127] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be" Mar 14 00:37:40.525274 containerd[1504]: time="2026-03-14T00:37:40.522638707Z" level=info msg="TearDown network for sandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\" successfully" Mar 14 00:37:40.534942 containerd[1504]: time="2026-03-14T00:37:40.534875310Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:40.535120 containerd[1504]: time="2026-03-14T00:37:40.534967429Z" level=info msg="RemovePodSandbox \"1e5b4a14a2bb44962576109fc75775d9271299657de694bdb623fb4f801fa4be\" returns successfully" Mar 14 00:37:40.535941 containerd[1504]: time="2026-03-14T00:37:40.535685246Z" level=info msg="StopPodSandbox for \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\"" Mar 14 00:37:40.694913 kubelet[2695]: I0314 00:37:40.687914 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-689dfd58b9-glmpr" podStartSLOduration=32.882102852 podStartE2EDuration="43.683782968s" podCreationTimestamp="2026-03-14 00:36:57 +0000 UTC" firstStartedPulling="2026-03-14 00:37:28.526480592 +0000 UTC m=+50.140052538" lastFinishedPulling="2026-03-14 00:37:39.328160718 +0000 UTC m=+60.941732654" observedRunningTime="2026-03-14 00:37:40.681569648 +0000 UTC m=+62.295141615" watchObservedRunningTime="2026-03-14 00:37:40.683782968 +0000 UTC m=+62.297354938" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.649 [WARNING][5150] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"85de6037-ff24-4a8c-95b3-a77667104b2e", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b", Pod:"csi-node-driver-psbtg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.26.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb565f66cf3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.650 [INFO][5150] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.650 [INFO][5150] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" iface="eth0" netns="" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.650 [INFO][5150] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.650 [INFO][5150] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.805 [INFO][5157] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.805 [INFO][5157] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.805 [INFO][5157] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.827 [WARNING][5157] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.827 [INFO][5157] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.832 [INFO][5157] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:40.846150 containerd[1504]: 2026-03-14 00:37:40.840 [INFO][5150] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:40.846150 containerd[1504]: time="2026-03-14T00:37:40.845915082Z" level=info msg="TearDown network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\" successfully" Mar 14 00:37:40.846150 containerd[1504]: time="2026-03-14T00:37:40.845956392Z" level=info msg="StopPodSandbox for \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\" returns successfully" Mar 14 00:37:40.851611 containerd[1504]: time="2026-03-14T00:37:40.849592993Z" level=info msg="RemovePodSandbox for \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\"" Mar 14 00:37:40.851611 containerd[1504]: time="2026-03-14T00:37:40.849765012Z" level=info msg="Forcibly stopping sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\"" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:40.982 [WARNING][5173] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"85de6037-ff24-4a8c-95b3-a77667104b2e", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b", Pod:"csi-node-driver-psbtg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.26.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb565f66cf3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:40.983 [INFO][5173] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:40.983 [INFO][5173] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" iface="eth0" netns="" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:40.983 [INFO][5173] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:40.983 [INFO][5173] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.041 [INFO][5180] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.041 [INFO][5180] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.041 [INFO][5180] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.056 [WARNING][5180] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.056 [INFO][5180] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" HandleID="k8s-pod-network.ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Workload="srv--zkxct.gb1.brightbox.com-k8s-csi--node--driver--psbtg-eth0" Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.060 [INFO][5180] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:41.067671 containerd[1504]: 2026-03-14 00:37:41.062 [INFO][5173] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a" Mar 14 00:37:41.067671 containerd[1504]: time="2026-03-14T00:37:41.067564275Z" level=info msg="TearDown network for sandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\" successfully" Mar 14 00:37:41.074373 containerd[1504]: time="2026-03-14T00:37:41.074018570Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:41.074373 containerd[1504]: time="2026-03-14T00:37:41.074107857Z" level=info msg="RemovePodSandbox \"ff8d76d2c8a33f2c644d6a81d88f71d2fac6c765c44cda026b1d6d2da35e833a\" returns successfully" Mar 14 00:37:41.076160 containerd[1504]: time="2026-03-14T00:37:41.075869448Z" level=info msg="StopPodSandbox for \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\"" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.181 [WARNING][5195] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0", GenerateName:"calico-kube-controllers-5f9c794d4f-", Namespace:"calico-system", SelfLink:"", UID:"6ea7102e-c24f-4833-a40a-4027abd4e9fc", ResourceVersion:"1034", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9c794d4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06", Pod:"calico-kube-controllers-5f9c794d4f-2qjv6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.26.5/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1bee082f6aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.181 [INFO][5195] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.181 [INFO][5195] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" iface="eth0" netns="" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.181 [INFO][5195] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.182 [INFO][5195] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.320 [INFO][5203] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.321 [INFO][5203] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.321 [INFO][5203] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.339 [WARNING][5203] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.339 [INFO][5203] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.342 [INFO][5203] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:41.355698 containerd[1504]: 2026-03-14 00:37:41.350 [INFO][5195] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.355698 containerd[1504]: time="2026-03-14T00:37:41.355222994Z" level=info msg="TearDown network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\" successfully" Mar 14 00:37:41.355698 containerd[1504]: time="2026-03-14T00:37:41.355256219Z" level=info msg="StopPodSandbox for \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\" returns successfully" Mar 14 00:37:41.358277 containerd[1504]: time="2026-03-14T00:37:41.357565514Z" level=info msg="RemovePodSandbox for \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\"" Mar 14 00:37:41.358277 containerd[1504]: time="2026-03-14T00:37:41.357610421Z" level=info msg="Forcibly stopping sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\"" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.427 [WARNING][5218] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0", GenerateName:"calico-kube-controllers-5f9c794d4f-", Namespace:"calico-system", SelfLink:"", UID:"6ea7102e-c24f-4833-a40a-4027abd4e9fc", ResourceVersion:"1034", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f9c794d4f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"fdca01bdcb9ff99d53c684f7bc4f87e2fa9c582374c706d2401fd7e0b82ede06", Pod:"calico-kube-controllers-5f9c794d4f-2qjv6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.26.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1bee082f6aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.427 [INFO][5218] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.428 [INFO][5218] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" iface="eth0" netns="" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.428 [INFO][5218] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.428 [INFO][5218] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.496 [INFO][5226] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.497 [INFO][5226] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.497 [INFO][5226] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.507 [WARNING][5226] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.507 [INFO][5226] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" HandleID="k8s-pod-network.3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--kube--controllers--5f9c794d4f--2qjv6-eth0" Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.509 [INFO][5226] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:41.514557 containerd[1504]: 2026-03-14 00:37:41.511 [INFO][5218] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0" Mar 14 00:37:41.517900 containerd[1504]: time="2026-03-14T00:37:41.515114777Z" level=info msg="TearDown network for sandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\" successfully" Mar 14 00:37:41.520118 containerd[1504]: time="2026-03-14T00:37:41.519913096Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:41.520118 containerd[1504]: time="2026-03-14T00:37:41.519998401Z" level=info msg="RemovePodSandbox \"3cc86b5a99f4e6872da2a68c3751b2e2b681e8801c3c2e07d9dccf76a70219e0\" returns successfully" Mar 14 00:37:41.521146 containerd[1504]: time="2026-03-14T00:37:41.521045204Z" level=info msg="StopPodSandbox for \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\"" Mar 14 00:37:41.667762 kubelet[2695]: I0314 00:37:41.667077 2695 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.646 [WARNING][5240] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"2820677f-9f42-42f1-ade4-b7694612f299", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad", Pod:"calico-apiserver-689dfd58b9-98c4z", 
Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliddc623e6622", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.648 [INFO][5240] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.648 [INFO][5240] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" iface="eth0" netns="" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.648 [INFO][5240] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.650 [INFO][5240] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.708 [INFO][5248] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.708 [INFO][5248] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.708 [INFO][5248] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.725 [WARNING][5248] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.726 [INFO][5248] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.729 [INFO][5248] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:41.735194 containerd[1504]: 2026-03-14 00:37:41.732 [INFO][5240] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.736672 containerd[1504]: time="2026-03-14T00:37:41.735265066Z" level=info msg="TearDown network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\" successfully" Mar 14 00:37:41.736672 containerd[1504]: time="2026-03-14T00:37:41.735301396Z" level=info msg="StopPodSandbox for \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\" returns successfully" Mar 14 00:37:41.737403 containerd[1504]: time="2026-03-14T00:37:41.736968067Z" level=info msg="RemovePodSandbox for \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\"" Mar 14 00:37:41.737403 containerd[1504]: time="2026-03-14T00:37:41.737016595Z" level=info msg="Forcibly stopping sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\"" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.805 [WARNING][5262] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"2820677f-9f42-42f1-ade4-b7694612f299", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad", Pod:"calico-apiserver-689dfd58b9-98c4z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliddc623e6622", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.806 [INFO][5262] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.806 [INFO][5262] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" iface="eth0" netns="" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.806 [INFO][5262] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.806 [INFO][5262] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.851 [INFO][5269] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.851 [INFO][5269] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.851 [INFO][5269] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.865 [WARNING][5269] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.865 [INFO][5269] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" HandleID="k8s-pod-network.5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--98c4z-eth0" Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.867 [INFO][5269] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:41.873499 containerd[1504]: 2026-03-14 00:37:41.870 [INFO][5262] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5" Mar 14 00:37:41.875037 containerd[1504]: time="2026-03-14T00:37:41.873516146Z" level=info msg="TearDown network for sandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\" successfully" Mar 14 00:37:41.881855 containerd[1504]: time="2026-03-14T00:37:41.881194366Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:41.881855 containerd[1504]: time="2026-03-14T00:37:41.881261095Z" level=info msg="RemovePodSandbox \"5e8d9bab1b65386960b2c177f4eab92d89337cae793de27ab04c0195262badc5\" returns successfully" Mar 14 00:37:41.882057 containerd[1504]: time="2026-03-14T00:37:41.882021006Z" level=info msg="StopPodSandbox for \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\"" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:41.953 [WARNING][5283] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"0c8a2ac7-fa2e-44fe-9348-bf4cf1020019", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e", Pod:"coredns-7d764666f9-zs4lg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e05db2c5c5", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:41.953 [INFO][5283] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:41.954 [INFO][5283] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" iface="eth0" netns="" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:41.954 [INFO][5283] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:41.954 [INFO][5283] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.006 [INFO][5290] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.006 [INFO][5290] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.006 [INFO][5290] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.019 [WARNING][5290] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.019 [INFO][5290] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.022 [INFO][5290] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:42.026653 containerd[1504]: 2026-03-14 00:37:42.024 [INFO][5283] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.026653 containerd[1504]: time="2026-03-14T00:37:42.026547717Z" level=info msg="TearDown network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\" successfully" Mar 14 00:37:42.026653 containerd[1504]: time="2026-03-14T00:37:42.026648687Z" level=info msg="StopPodSandbox for \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\" returns successfully" Mar 14 00:37:42.030166 containerd[1504]: time="2026-03-14T00:37:42.028837503Z" level=info msg="RemovePodSandbox for \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\"" Mar 14 00:37:42.030166 containerd[1504]: time="2026-03-14T00:37:42.028877444Z" level=info msg="Forcibly stopping sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\"" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.101 [WARNING][5304] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"0c8a2ac7-fa2e-44fe-9348-bf4cf1020019", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"31c737e63105dbfd065a2ede6ca91597c9655c4707c4c96a8114a5972b44f86e", Pod:"coredns-7d764666f9-zs4lg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.26.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e05db2c5c5", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.102 [INFO][5304] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.102 [INFO][5304] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" iface="eth0" netns="" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.102 [INFO][5304] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.102 [INFO][5304] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.145 [INFO][5318] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.146 [INFO][5318] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.146 [INFO][5318] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.159 [WARNING][5318] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.159 [INFO][5318] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" HandleID="k8s-pod-network.90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Workload="srv--zkxct.gb1.brightbox.com-k8s-coredns--7d764666f9--zs4lg-eth0" Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.162 [INFO][5318] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:42.166501 containerd[1504]: 2026-03-14 00:37:42.164 [INFO][5304] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a" Mar 14 00:37:42.166501 containerd[1504]: time="2026-03-14T00:37:42.166299779Z" level=info msg="TearDown network for sandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\" successfully" Mar 14 00:37:42.176125 containerd[1504]: time="2026-03-14T00:37:42.176082552Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:42.176321 containerd[1504]: time="2026-03-14T00:37:42.176293290Z" level=info msg="RemovePodSandbox \"90bc9b66c288cac2bd6830908d094f67e950197c2a3e541be8f36d0a172d935a\" returns successfully" Mar 14 00:37:42.189472 containerd[1504]: time="2026-03-14T00:37:42.189421576Z" level=info msg="StopPodSandbox for \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\"" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.256 [WARNING][5338] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.256 [INFO][5338] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.256 [INFO][5338] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" iface="eth0" netns="" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.256 [INFO][5338] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.256 [INFO][5338] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.298 [INFO][5345] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.298 [INFO][5345] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.298 [INFO][5345] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.309 [WARNING][5345] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.310 [INFO][5345] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.312 [INFO][5345] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:42.317850 containerd[1504]: 2026-03-14 00:37:42.315 [INFO][5338] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.321126 containerd[1504]: time="2026-03-14T00:37:42.319546338Z" level=info msg="TearDown network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\" successfully" Mar 14 00:37:42.321126 containerd[1504]: time="2026-03-14T00:37:42.319649064Z" level=info msg="StopPodSandbox for \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\" returns successfully" Mar 14 00:37:42.323262 containerd[1504]: time="2026-03-14T00:37:42.322908129Z" level=info msg="RemovePodSandbox for \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\"" Mar 14 00:37:42.323262 containerd[1504]: time="2026-03-14T00:37:42.322962727Z" level=info msg="Forcibly stopping sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\"" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.387 [WARNING][5359] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" WorkloadEndpoint="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.387 [INFO][5359] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.387 [INFO][5359] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" iface="eth0" netns="" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.387 [INFO][5359] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.387 [INFO][5359] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.428 [INFO][5366] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.428 [INFO][5366] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.428 [INFO][5366] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.441 [WARNING][5366] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.441 [INFO][5366] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" HandleID="k8s-pod-network.602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Workload="srv--zkxct.gb1.brightbox.com-k8s-whisker--6f59cc5c64--pb8rc-eth0" Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.443 [INFO][5366] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:42.450042 containerd[1504]: 2026-03-14 00:37:42.446 [INFO][5359] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4" Mar 14 00:37:42.450834 containerd[1504]: time="2026-03-14T00:37:42.450099807Z" level=info msg="TearDown network for sandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\" successfully" Mar 14 00:37:42.457927 containerd[1504]: time="2026-03-14T00:37:42.457676870Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 14 00:37:42.457927 containerd[1504]: time="2026-03-14T00:37:42.457750693Z" level=info msg="RemovePodSandbox \"602e7bea67ed2b3e3fc4e478bf72fee60beb66df364c1f34a302a0731b8256e4\" returns successfully" Mar 14 00:37:42.458692 containerd[1504]: time="2026-03-14T00:37:42.458663732Z" level=info msg="StopPodSandbox for \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\"" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.523 [WARNING][5380] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"f4805231-80ff-447c-a875-0751f3c2712b", ResourceVersion:"1057", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728", Pod:"calico-apiserver-689dfd58b9-glmpr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3d7db044870", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.523 [INFO][5380] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.523 [INFO][5380] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" iface="eth0" netns="" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.523 [INFO][5380] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.524 [INFO][5380] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.569 [INFO][5387] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.571 [INFO][5387] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.571 [INFO][5387] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.595 [WARNING][5387] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.595 [INFO][5387] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.598 [INFO][5387] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:42.626668 containerd[1504]: 2026-03-14 00:37:42.604 [INFO][5380] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.626668 containerd[1504]: time="2026-03-14T00:37:42.625595027Z" level=info msg="TearDown network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\" successfully" Mar 14 00:37:42.626668 containerd[1504]: time="2026-03-14T00:37:42.625633824Z" level=info msg="StopPodSandbox for \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\" returns successfully" Mar 14 00:37:42.629563 containerd[1504]: time="2026-03-14T00:37:42.628725767Z" level=info msg="RemovePodSandbox for \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\"" Mar 14 00:37:42.629563 containerd[1504]: time="2026-03-14T00:37:42.628769376Z" level=info msg="Forcibly stopping sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\"" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.741 [WARNING][5402] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0", GenerateName:"calico-apiserver-689dfd58b9-", Namespace:"calico-system", SelfLink:"", UID:"f4805231-80ff-447c-a875-0751f3c2712b", ResourceVersion:"1057", Generation:0, CreationTimestamp:time.Date(2026, time.March, 14, 0, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689dfd58b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zkxct.gb1.brightbox.com", ContainerID:"94d83644d1f383913666838c8266f37aaad9fa49e7a0b07978819c92221ac728", Pod:"calico-apiserver-689dfd58b9-glmpr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.26.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali3d7db044870", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.741 [INFO][5402] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.741 [INFO][5402] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" iface="eth0" netns="" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.741 [INFO][5402] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.741 [INFO][5402] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.802 [INFO][5409] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.802 [INFO][5409] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.802 [INFO][5409] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.815 [WARNING][5409] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.815 [INFO][5409] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" HandleID="k8s-pod-network.6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Workload="srv--zkxct.gb1.brightbox.com-k8s-calico--apiserver--689dfd58b9--glmpr-eth0" Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.817 [INFO][5409] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 14 00:37:42.833626 containerd[1504]: 2026-03-14 00:37:42.820 [INFO][5402] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0" Mar 14 00:37:42.834817 containerd[1504]: time="2026-03-14T00:37:42.834586147Z" level=info msg="TearDown network for sandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\" successfully" Mar 14 00:37:42.869823 containerd[1504]: time="2026-03-14T00:37:42.869756788Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 14 00:37:42.870240 containerd[1504]: time="2026-03-14T00:37:42.870055056Z" level=info msg="RemovePodSandbox \"6ae03941c9879b95dfba08abf0c6757f03205527bf79a442b25534bccbc6e2d0\" returns successfully" Mar 14 00:37:43.654112 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3966734005.mount: Deactivated successfully. 
Mar 14 00:37:44.443067 containerd[1504]: time="2026-03-14T00:37:44.442851305Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:44.463542 containerd[1504]: time="2026-03-14T00:37:44.454275729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 14 00:37:44.478073 containerd[1504]: time="2026-03-14T00:37:44.477989778Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:44.481503 containerd[1504]: time="2026-03-14T00:37:44.481320901Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:44.491499 containerd[1504]: time="2026-03-14T00:37:44.491403879Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 5.159974621s" Mar 14 00:37:44.491499 containerd[1504]: time="2026-03-14T00:37:44.491482411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 14 00:37:44.546932 containerd[1504]: time="2026-03-14T00:37:44.546863062Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 14 00:37:44.840178 containerd[1504]: time="2026-03-14T00:37:44.838500658Z" level=info msg="CreateContainer within sandbox 
\"ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 14 00:37:44.975788 containerd[1504]: time="2026-03-14T00:37:44.975696132Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:44.977687 containerd[1504]: time="2026-03-14T00:37:44.977605309Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 14 00:37:44.991457 containerd[1504]: time="2026-03-14T00:37:44.990471261Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 443.540642ms" Mar 14 00:37:44.991457 containerd[1504]: time="2026-03-14T00:37:44.990586219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 14 00:37:44.995178 containerd[1504]: time="2026-03-14T00:37:44.994784679Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 14 00:37:45.001598 containerd[1504]: time="2026-03-14T00:37:45.001561163Z" level=info msg="CreateContainer within sandbox \"503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 14 00:37:45.010543 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1361546347.mount: Deactivated successfully. 
Mar 14 00:37:45.058497 containerd[1504]: time="2026-03-14T00:37:45.058366024Z" level=info msg="CreateContainer within sandbox \"503ff4f790081d7e83113bb4c2d8e9c0ef5d41300f78a19abe2855e239d3d7ad\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c8957ff84b2245ff78e152baf1a3b4b5d581e1e6f1c3e62dfa18301bda1c585c\"" Mar 14 00:37:45.065167 containerd[1504]: time="2026-03-14T00:37:45.064096504Z" level=info msg="StartContainer for \"c8957ff84b2245ff78e152baf1a3b4b5d581e1e6f1c3e62dfa18301bda1c585c\"" Mar 14 00:37:45.067711 containerd[1504]: time="2026-03-14T00:37:45.067677326Z" level=info msg="CreateContainer within sandbox \"ba9882937cd07a472879bb52247cb86eaf65bd87efe89a212d5d80165b19842f\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"ca9f4589eb406112603859869cf7d63331da66b87171620fe294a2bef1b93666\"" Mar 14 00:37:45.073076 containerd[1504]: time="2026-03-14T00:37:45.073046451Z" level=info msg="StartContainer for \"ca9f4589eb406112603859869cf7d63331da66b87171620fe294a2bef1b93666\"" Mar 14 00:37:45.410951 systemd[1]: Started cri-containerd-c8957ff84b2245ff78e152baf1a3b4b5d581e1e6f1c3e62dfa18301bda1c585c.scope - libcontainer container c8957ff84b2245ff78e152baf1a3b4b5d581e1e6f1c3e62dfa18301bda1c585c. Mar 14 00:37:45.427305 systemd[1]: Started cri-containerd-ca9f4589eb406112603859869cf7d63331da66b87171620fe294a2bef1b93666.scope - libcontainer container ca9f4589eb406112603859869cf7d63331da66b87171620fe294a2bef1b93666. 
Mar 14 00:37:45.561711 containerd[1504]: time="2026-03-14T00:37:45.561617547Z" level=info msg="StartContainer for \"c8957ff84b2245ff78e152baf1a3b4b5d581e1e6f1c3e62dfa18301bda1c585c\" returns successfully" Mar 14 00:37:45.595417 containerd[1504]: time="2026-03-14T00:37:45.595363338Z" level=info msg="StartContainer for \"ca9f4589eb406112603859869cf7d63331da66b87171620fe294a2bef1b93666\" returns successfully" Mar 14 00:37:45.901551 kubelet[2695]: I0314 00:37:45.887364 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-n69fb" podStartSLOduration=32.860006512 podStartE2EDuration="48.862033186s" podCreationTimestamp="2026-03-14 00:36:57 +0000 UTC" firstStartedPulling="2026-03-14 00:37:28.544652017 +0000 UTC m=+50.158223958" lastFinishedPulling="2026-03-14 00:37:44.54667868 +0000 UTC m=+66.160250632" observedRunningTime="2026-03-14 00:37:45.861787401 +0000 UTC m=+67.475359337" watchObservedRunningTime="2026-03-14 00:37:45.862033186 +0000 UTC m=+67.475605132" Mar 14 00:37:46.298418 kubelet[2695]: I0314 00:37:46.298319 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-689dfd58b9-98c4z" podStartSLOduration=33.044699282 podStartE2EDuration="49.298295189s" podCreationTimestamp="2026-03-14 00:36:57 +0000 UTC" firstStartedPulling="2026-03-14 00:37:28.738561328 +0000 UTC m=+50.352133265" lastFinishedPulling="2026-03-14 00:37:44.992157239 +0000 UTC m=+66.605729172" observedRunningTime="2026-03-14 00:37:45.980886929 +0000 UTC m=+67.594458883" watchObservedRunningTime="2026-03-14 00:37:46.298295189 +0000 UTC m=+67.911867187" Mar 14 00:37:46.973165 containerd[1504]: time="2026-03-14T00:37:46.973098401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:46.975034 containerd[1504]: time="2026-03-14T00:37:46.974956197Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 14 00:37:46.976190 containerd[1504]: time="2026-03-14T00:37:46.976047325Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:46.981164 containerd[1504]: time="2026-03-14T00:37:46.981119948Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:46.985823 containerd[1504]: time="2026-03-14T00:37:46.985654254Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.990788062s" Mar 14 00:37:46.986713 containerd[1504]: time="2026-03-14T00:37:46.985977410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 14 00:37:46.990268 containerd[1504]: time="2026-03-14T00:37:46.989562874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 14 00:37:46.994229 containerd[1504]: time="2026-03-14T00:37:46.994184660Z" level=info msg="CreateContainer within sandbox \"4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 14 00:37:47.022865 containerd[1504]: time="2026-03-14T00:37:47.022780544Z" level=info msg="CreateContainer within sandbox \"4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id 
\"85bb9a1c094ddadd675800b409278fd5a569f6d2bac3feea08b28d84e779d3a4\"" Mar 14 00:37:47.024715 containerd[1504]: time="2026-03-14T00:37:47.024364264Z" level=info msg="StartContainer for \"85bb9a1c094ddadd675800b409278fd5a569f6d2bac3feea08b28d84e779d3a4\"" Mar 14 00:37:47.177705 systemd[1]: Started cri-containerd-85bb9a1c094ddadd675800b409278fd5a569f6d2bac3feea08b28d84e779d3a4.scope - libcontainer container 85bb9a1c094ddadd675800b409278fd5a569f6d2bac3feea08b28d84e779d3a4. Mar 14 00:37:47.317086 containerd[1504]: time="2026-03-14T00:37:47.316836975Z" level=info msg="StartContainer for \"85bb9a1c094ddadd675800b409278fd5a569f6d2bac3feea08b28d84e779d3a4\" returns successfully" Mar 14 00:37:49.396501 containerd[1504]: time="2026-03-14T00:37:49.395525228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:49.397156 containerd[1504]: time="2026-03-14T00:37:49.396621477Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 14 00:37:49.398485 containerd[1504]: time="2026-03-14T00:37:49.397758068Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:49.402141 containerd[1504]: time="2026-03-14T00:37:49.400806617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:49.402141 containerd[1504]: time="2026-03-14T00:37:49.402049520Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", 
repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 2.410641075s" Mar 14 00:37:49.402141 containerd[1504]: time="2026-03-14T00:37:49.402088395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 14 00:37:49.418840 containerd[1504]: time="2026-03-14T00:37:49.418643683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 14 00:37:49.447285 containerd[1504]: time="2026-03-14T00:37:49.447166014Z" level=info msg="CreateContainer within sandbox \"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 14 00:37:49.472657 containerd[1504]: time="2026-03-14T00:37:49.472604988Z" level=info msg="CreateContainer within sandbox \"dffa42a365c4c23901da1a85ec03b04abd64a42d54268ab74627e04c65da040b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3349fdd776279db389be7c0c03cef527fd6aa151c97af93eb37f2052216d2d90\"" Mar 14 00:37:49.476606 containerd[1504]: time="2026-03-14T00:37:49.474355020Z" level=info msg="StartContainer for \"3349fdd776279db389be7c0c03cef527fd6aa151c97af93eb37f2052216d2d90\"" Mar 14 00:37:49.527584 systemd[1]: run-containerd-runc-k8s.io-3349fdd776279db389be7c0c03cef527fd6aa151c97af93eb37f2052216d2d90-runc.Z6KaHs.mount: Deactivated successfully. Mar 14 00:37:49.541699 systemd[1]: Started cri-containerd-3349fdd776279db389be7c0c03cef527fd6aa151c97af93eb37f2052216d2d90.scope - libcontainer container 3349fdd776279db389be7c0c03cef527fd6aa151c97af93eb37f2052216d2d90. 
Mar 14 00:37:49.574883 kubelet[2695]: I0314 00:37:49.574806 2695 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 14 00:37:49.643810 containerd[1504]: time="2026-03-14T00:37:49.643587351Z" level=info msg="StartContainer for \"3349fdd776279db389be7c0c03cef527fd6aa151c97af93eb37f2052216d2d90\" returns successfully" Mar 14 00:37:49.879769 kubelet[2695]: I0314 00:37:49.879678 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-psbtg" podStartSLOduration=30.561979949 podStartE2EDuration="51.879661104s" podCreationTimestamp="2026-03-14 00:36:58 +0000 UTC" firstStartedPulling="2026-03-14 00:37:28.090203482 +0000 UTC m=+49.703775416" lastFinishedPulling="2026-03-14 00:37:49.40788462 +0000 UTC m=+71.021456571" observedRunningTime="2026-03-14 00:37:49.876983477 +0000 UTC m=+71.490555444" watchObservedRunningTime="2026-03-14 00:37:49.879661104 +0000 UTC m=+71.493233052" Mar 14 00:37:50.090957 kubelet[2695]: I0314 00:37:50.089582 2695 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 14 00:37:50.090957 kubelet[2695]: I0314 00:37:50.090846 2695 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 14 00:37:51.460202 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3071431681.mount: Deactivated successfully. 
Mar 14 00:37:51.479005 containerd[1504]: time="2026-03-14T00:37:51.478741593Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:51.480720 containerd[1504]: time="2026-03-14T00:37:51.480673524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 14 00:37:51.482989 containerd[1504]: time="2026-03-14T00:37:51.481530964Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:51.484607 containerd[1504]: time="2026-03-14T00:37:51.484495039Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 14 00:37:51.485947 containerd[1504]: time="2026-03-14T00:37:51.485705285Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.067000172s" Mar 14 00:37:51.485947 containerd[1504]: time="2026-03-14T00:37:51.485758261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 14 00:37:51.492260 containerd[1504]: time="2026-03-14T00:37:51.491970301Z" level=info msg="CreateContainer within sandbox \"4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 14 00:37:51.507486 
containerd[1504]: time="2026-03-14T00:37:51.506812613Z" level=info msg="CreateContainer within sandbox \"4f7a068311d0fff4c90861c0a7b892e9cbfbfb9fc94d3a1decb8d08bbdb531b7\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1daae29564dde12e3d531c55327b9683c487936bfd7826d970607fa738dba87d\"" Mar 14 00:37:51.510953 containerd[1504]: time="2026-03-14T00:37:51.510785630Z" level=info msg="StartContainer for \"1daae29564dde12e3d531c55327b9683c487936bfd7826d970607fa738dba87d\"" Mar 14 00:37:51.571708 systemd[1]: Started cri-containerd-1daae29564dde12e3d531c55327b9683c487936bfd7826d970607fa738dba87d.scope - libcontainer container 1daae29564dde12e3d531c55327b9683c487936bfd7826d970607fa738dba87d. Mar 14 00:37:51.637992 containerd[1504]: time="2026-03-14T00:37:51.637930102Z" level=info msg="StartContainer for \"1daae29564dde12e3d531c55327b9683c487936bfd7826d970607fa738dba87d\" returns successfully" Mar 14 00:37:51.891862 kubelet[2695]: I0314 00:37:51.890995 2695 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-b56c87cb9-82slf" podStartSLOduration=2.9269322 podStartE2EDuration="24.890963831s" podCreationTimestamp="2026-03-14 00:37:27 +0000 UTC" firstStartedPulling="2026-03-14 00:37:29.523387206 +0000 UTC m=+51.136959139" lastFinishedPulling="2026-03-14 00:37:51.487418829 +0000 UTC m=+73.100990770" observedRunningTime="2026-03-14 00:37:51.888509691 +0000 UTC m=+73.502081643" watchObservedRunningTime="2026-03-14 00:37:51.890963831 +0000 UTC m=+73.504535782" Mar 14 00:38:02.979060 systemd[1]: Started sshd@11-10.230.50.222:22-20.161.92.111:57400.service - OpenSSH per-connection server daemon (20.161.92.111:57400). 
Mar 14 00:38:03.653407 sshd[5726]: Accepted publickey for core from 20.161.92.111 port 57400 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ Mar 14 00:38:03.657951 sshd[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 14 00:38:03.678179 systemd-logind[1491]: New session 12 of user core. Mar 14 00:38:03.680800 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 14 00:38:04.797920 systemd[1]: run-containerd-runc-k8s.io-e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a-runc.62ejRA.mount: Deactivated successfully. Mar 14 00:38:05.047638 sshd[5726]: pam_unix(sshd:session): session closed for user core Mar 14 00:38:05.061617 systemd-logind[1491]: Session 12 logged out. Waiting for processes to exit. Mar 14 00:38:05.062435 systemd[1]: sshd@11-10.230.50.222:22-20.161.92.111:57400.service: Deactivated successfully. Mar 14 00:38:05.069804 systemd[1]: session-12.scope: Deactivated successfully. Mar 14 00:38:05.075698 systemd-logind[1491]: Removed session 12. Mar 14 00:38:10.157781 systemd[1]: Started sshd@12-10.230.50.222:22-20.161.92.111:35078.service - OpenSSH per-connection server daemon (20.161.92.111:35078). Mar 14 00:38:10.764745 sshd[5768]: Accepted publickey for core from 20.161.92.111 port 35078 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ Mar 14 00:38:10.767249 sshd[5768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 14 00:38:10.776129 systemd-logind[1491]: New session 13 of user core. Mar 14 00:38:10.784741 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 14 00:38:11.284044 sshd[5768]: pam_unix(sshd:session): session closed for user core Mar 14 00:38:11.289758 systemd[1]: sshd@12-10.230.50.222:22-20.161.92.111:35078.service: Deactivated successfully. Mar 14 00:38:11.293728 systemd[1]: session-13.scope: Deactivated successfully. Mar 14 00:38:11.295598 systemd-logind[1491]: Session 13 logged out. 
Waiting for processes to exit.
Mar 14 00:38:11.296925 systemd-logind[1491]: Removed session 13.
Mar 14 00:38:16.399820 systemd[1]: Started sshd@13-10.230.50.222:22-20.161.92.111:35088.service - OpenSSH per-connection server daemon (20.161.92.111:35088).
Mar 14 00:38:17.056882 sshd[5850]: Accepted publickey for core from 20.161.92.111 port 35088 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:17.060348 sshd[5850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:17.068993 systemd-logind[1491]: New session 14 of user core.
Mar 14 00:38:17.073721 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 14 00:38:17.633886 sshd[5850]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:17.640631 systemd[1]: sshd@13-10.230.50.222:22-20.161.92.111:35088.service: Deactivated successfully.
Mar 14 00:38:17.645393 systemd[1]: session-14.scope: Deactivated successfully.
Mar 14 00:38:17.646988 systemd-logind[1491]: Session 14 logged out. Waiting for processes to exit.
Mar 14 00:38:17.648407 systemd-logind[1491]: Removed session 14.
Mar 14 00:38:22.737865 systemd[1]: Started sshd@14-10.230.50.222:22-20.161.92.111:52950.service - OpenSSH per-connection server daemon (20.161.92.111:52950).
Mar 14 00:38:23.300373 sshd[5866]: Accepted publickey for core from 20.161.92.111 port 52950 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:23.303180 sshd[5866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:23.310268 systemd-logind[1491]: New session 15 of user core.
Mar 14 00:38:23.315748 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 14 00:38:23.807679 sshd[5866]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:23.813567 systemd-logind[1491]: Session 15 logged out. Waiting for processes to exit.
Mar 14 00:38:23.814951 systemd[1]: sshd@14-10.230.50.222:22-20.161.92.111:52950.service: Deactivated successfully.
Mar 14 00:38:23.819346 systemd[1]: session-15.scope: Deactivated successfully.
Mar 14 00:38:23.821644 systemd-logind[1491]: Removed session 15.
Mar 14 00:38:28.919213 systemd[1]: Started sshd@15-10.230.50.222:22-20.161.92.111:52956.service - OpenSSH per-connection server daemon (20.161.92.111:52956).
Mar 14 00:38:29.520385 sshd[5901]: Accepted publickey for core from 20.161.92.111 port 52956 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:29.523019 sshd[5901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:29.530675 systemd-logind[1491]: New session 16 of user core.
Mar 14 00:38:29.536750 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 14 00:38:30.059627 sshd[5901]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:30.064150 systemd[1]: sshd@15-10.230.50.222:22-20.161.92.111:52956.service: Deactivated successfully.
Mar 14 00:38:30.068602 systemd[1]: session-16.scope: Deactivated successfully.
Mar 14 00:38:30.070773 systemd-logind[1491]: Session 16 logged out. Waiting for processes to exit.
Mar 14 00:38:30.072384 systemd-logind[1491]: Removed session 16.
Mar 14 00:38:34.486959 systemd[1]: run-containerd-runc-k8s.io-e67e418b58f291c7d2b9c60f6b696594e733a0dd54cc58995c2e97ba0781a97a-runc.pQsJpF.mount: Deactivated successfully.
Mar 14 00:38:35.167862 systemd[1]: Started sshd@16-10.230.50.222:22-20.161.92.111:57704.service - OpenSSH per-connection server daemon (20.161.92.111:57704).
Mar 14 00:38:35.744387 sshd[5952]: Accepted publickey for core from 20.161.92.111 port 57704 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:35.746962 sshd[5952]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:35.754248 systemd-logind[1491]: New session 17 of user core.
Mar 14 00:38:35.759659 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 14 00:38:36.308720 sshd[5952]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:36.314796 systemd[1]: sshd@16-10.230.50.222:22-20.161.92.111:57704.service: Deactivated successfully.
Mar 14 00:38:36.317297 systemd[1]: session-17.scope: Deactivated successfully.
Mar 14 00:38:36.319263 systemd-logind[1491]: Session 17 logged out. Waiting for processes to exit.
Mar 14 00:38:36.320836 systemd-logind[1491]: Removed session 17.
Mar 14 00:38:36.415822 systemd[1]: Started sshd@17-10.230.50.222:22-20.161.92.111:57708.service - OpenSSH per-connection server daemon (20.161.92.111:57708).
Mar 14 00:38:36.978565 sshd[5967]: Accepted publickey for core from 20.161.92.111 port 57708 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:36.981923 sshd[5967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:36.989538 systemd-logind[1491]: New session 18 of user core.
Mar 14 00:38:36.995663 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 14 00:38:37.606728 sshd[5967]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:37.616242 systemd[1]: sshd@17-10.230.50.222:22-20.161.92.111:57708.service: Deactivated successfully.
Mar 14 00:38:37.618823 systemd[1]: session-18.scope: Deactivated successfully.
Mar 14 00:38:37.620555 systemd-logind[1491]: Session 18 logged out. Waiting for processes to exit.
Mar 14 00:38:37.622160 systemd-logind[1491]: Removed session 18.
Mar 14 00:38:37.705998 systemd[1]: Started sshd@18-10.230.50.222:22-20.161.92.111:57722.service - OpenSSH per-connection server daemon (20.161.92.111:57722).
Mar 14 00:38:38.384488 sshd[5978]: Accepted publickey for core from 20.161.92.111 port 57722 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:38.386481 sshd[5978]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:38.395216 systemd-logind[1491]: New session 19 of user core.
Mar 14 00:38:38.403732 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 14 00:38:38.970717 sshd[5978]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:38.979548 systemd-logind[1491]: Session 19 logged out. Waiting for processes to exit.
Mar 14 00:38:38.981171 systemd[1]: sshd@18-10.230.50.222:22-20.161.92.111:57722.service: Deactivated successfully.
Mar 14 00:38:38.988053 systemd[1]: session-19.scope: Deactivated successfully.
Mar 14 00:38:38.991367 systemd-logind[1491]: Removed session 19.
Mar 14 00:38:44.074839 systemd[1]: Started sshd@19-10.230.50.222:22-20.161.92.111:40488.service - OpenSSH per-connection server daemon (20.161.92.111:40488).
Mar 14 00:38:44.655473 sshd[5992]: Accepted publickey for core from 20.161.92.111 port 40488 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:44.658942 sshd[5992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:44.665895 systemd-logind[1491]: New session 20 of user core.
Mar 14 00:38:44.674710 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 14 00:38:45.203375 sshd[5992]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:45.210838 systemd-logind[1491]: Session 20 logged out. Waiting for processes to exit.
Mar 14 00:38:45.211573 systemd[1]: sshd@19-10.230.50.222:22-20.161.92.111:40488.service: Deactivated successfully.
Mar 14 00:38:45.215938 systemd[1]: session-20.scope: Deactivated successfully.
Mar 14 00:38:45.219130 systemd-logind[1491]: Removed session 20.
Mar 14 00:38:45.319959 systemd[1]: Started sshd@20-10.230.50.222:22-20.161.92.111:40496.service - OpenSSH per-connection server daemon (20.161.92.111:40496).
Mar 14 00:38:45.880016 systemd[1]: run-containerd-runc-k8s.io-ca9f4589eb406112603859869cf7d63331da66b87171620fe294a2bef1b93666-runc.G3oW5k.mount: Deactivated successfully.
Mar 14 00:38:45.930533 sshd[6005]: Accepted publickey for core from 20.161.92.111 port 40496 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:45.933726 sshd[6005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:45.942883 systemd-logind[1491]: New session 21 of user core.
Mar 14 00:38:45.950120 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 14 00:38:46.836080 sshd[6005]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:46.844409 systemd[1]: sshd@20-10.230.50.222:22-20.161.92.111:40496.service: Deactivated successfully.
Mar 14 00:38:46.848192 systemd[1]: session-21.scope: Deactivated successfully.
Mar 14 00:38:46.851886 systemd-logind[1491]: Session 21 logged out. Waiting for processes to exit.
Mar 14 00:38:46.853806 systemd-logind[1491]: Removed session 21.
Mar 14 00:38:46.942196 systemd[1]: Started sshd@21-10.230.50.222:22-20.161.92.111:40500.service - OpenSSH per-connection server daemon (20.161.92.111:40500).
Mar 14 00:38:47.556547 sshd[6040]: Accepted publickey for core from 20.161.92.111 port 40500 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:47.559986 sshd[6040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:47.571005 systemd-logind[1491]: New session 22 of user core.
Mar 14 00:38:47.578671 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 14 00:38:48.892698 sshd[6040]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:48.900563 systemd-logind[1491]: Session 22 logged out. Waiting for processes to exit.
Mar 14 00:38:48.900947 systemd[1]: sshd@21-10.230.50.222:22-20.161.92.111:40500.service: Deactivated successfully.
Mar 14 00:38:48.904798 systemd[1]: session-22.scope: Deactivated successfully.
Mar 14 00:38:48.907670 systemd-logind[1491]: Removed session 22.
Mar 14 00:38:48.989874 systemd[1]: Started sshd@22-10.230.50.222:22-20.161.92.111:40508.service - OpenSSH per-connection server daemon (20.161.92.111:40508).
Mar 14 00:38:49.574759 sshd[6067]: Accepted publickey for core from 20.161.92.111 port 40508 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:49.579635 sshd[6067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:49.586539 systemd-logind[1491]: New session 23 of user core.
Mar 14 00:38:49.591808 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 14 00:38:50.649900 sshd[6067]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:50.659365 systemd[1]: sshd@22-10.230.50.222:22-20.161.92.111:40508.service: Deactivated successfully.
Mar 14 00:38:50.662053 systemd[1]: session-23.scope: Deactivated successfully.
Mar 14 00:38:50.663167 systemd-logind[1491]: Session 23 logged out. Waiting for processes to exit.
Mar 14 00:38:50.665344 systemd-logind[1491]: Removed session 23.
Mar 14 00:38:50.752873 systemd[1]: Started sshd@23-10.230.50.222:22-20.161.92.111:58226.service - OpenSSH per-connection server daemon (20.161.92.111:58226).
Mar 14 00:38:51.347499 sshd[6080]: Accepted publickey for core from 20.161.92.111 port 58226 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:51.349714 sshd[6080]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:51.360256 systemd-logind[1491]: New session 24 of user core.
Mar 14 00:38:51.367249 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 14 00:38:51.860666 sshd[6080]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:51.869190 systemd[1]: sshd@23-10.230.50.222:22-20.161.92.111:58226.service: Deactivated successfully.
Mar 14 00:38:51.872623 systemd[1]: session-24.scope: Deactivated successfully.
Mar 14 00:38:51.873943 systemd-logind[1491]: Session 24 logged out. Waiting for processes to exit.
Mar 14 00:38:51.876527 systemd-logind[1491]: Removed session 24.
Mar 14 00:38:56.967942 systemd[1]: Started sshd@24-10.230.50.222:22-20.161.92.111:58228.service - OpenSSH per-connection server daemon (20.161.92.111:58228).
Mar 14 00:38:57.601499 sshd[6126]: Accepted publickey for core from 20.161.92.111 port 58228 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:38:57.604672 sshd[6126]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:38:57.613141 systemd-logind[1491]: New session 25 of user core.
Mar 14 00:38:57.618770 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 14 00:38:58.190915 sshd[6126]: pam_unix(sshd:session): session closed for user core
Mar 14 00:38:58.201003 systemd[1]: sshd@24-10.230.50.222:22-20.161.92.111:58228.service: Deactivated successfully.
Mar 14 00:38:58.206203 systemd[1]: session-25.scope: Deactivated successfully.
Mar 14 00:38:58.207785 systemd-logind[1491]: Session 25 logged out. Waiting for processes to exit.
Mar 14 00:38:58.210384 systemd-logind[1491]: Removed session 25.
Mar 14 00:39:03.299852 systemd[1]: Started sshd@25-10.230.50.222:22-20.161.92.111:40858.service - OpenSSH per-connection server daemon (20.161.92.111:40858).
Mar 14 00:39:03.868651 sshd[6140]: Accepted publickey for core from 20.161.92.111 port 40858 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:39:03.872690 sshd[6140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:39:03.881822 systemd-logind[1491]: New session 26 of user core.
Mar 14 00:39:03.892860 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 14 00:39:04.405892 sshd[6140]: pam_unix(sshd:session): session closed for user core
Mar 14 00:39:04.410688 systemd[1]: sshd@25-10.230.50.222:22-20.161.92.111:40858.service: Deactivated successfully.
Mar 14 00:39:04.416162 systemd[1]: session-26.scope: Deactivated successfully.
Mar 14 00:39:04.419181 systemd-logind[1491]: Session 26 logged out. Waiting for processes to exit.
Mar 14 00:39:04.421516 systemd-logind[1491]: Removed session 26.
Mar 14 00:39:09.509969 systemd[1]: Started sshd@26-10.230.50.222:22-20.161.92.111:40864.service - OpenSSH per-connection server daemon (20.161.92.111:40864).
Mar 14 00:39:10.126594 sshd[6207]: Accepted publickey for core from 20.161.92.111 port 40864 ssh2: RSA SHA256:G3DxPtudQCSC+zb3xt9jRLB1yvq/SeDG59+4Mc6l5RQ
Mar 14 00:39:10.128353 sshd[6207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 14 00:39:10.136575 systemd-logind[1491]: New session 27 of user core.
Mar 14 00:39:10.141711 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 14 00:39:10.640188 sshd[6207]: pam_unix(sshd:session): session closed for user core
Mar 14 00:39:10.648442 systemd[1]: sshd@26-10.230.50.222:22-20.161.92.111:40864.service: Deactivated successfully.
Mar 14 00:39:10.653917 systemd[1]: session-27.scope: Deactivated successfully.
Mar 14 00:39:10.657125 systemd-logind[1491]: Session 27 logged out. Waiting for processes to exit.
Mar 14 00:39:10.661016 systemd-logind[1491]: Removed session 27.