Mar 2 12:51:57.933582 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 2 10:28:24 -00 2026
Mar 2 12:51:57.933620 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=82731586f036a8515942386c762f58de23efa7b4e7ecf4198e267e112154cbc2
Mar 2 12:51:57.933634 kernel: BIOS-provided physical RAM map:
Mar 2 12:51:57.933645 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 2 12:51:57.933659 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 2 12:51:57.933670 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 2 12:51:57.933681 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Mar 2 12:51:57.933703 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Mar 2 12:51:57.933714 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 2 12:51:57.933725 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 2 12:51:57.933735 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 2 12:51:57.933745 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 2 12:51:57.933756 kernel: NX (Execute Disable) protection: active
Mar 2 12:51:57.933771 kernel: APIC: Static calls initialized
Mar 2 12:51:57.933784 kernel: SMBIOS 2.8 present.
Mar 2 12:51:57.933796 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.16.0-3.module_el8.7.0+3346+68867adb 04/01/2014
Mar 2 12:51:57.933816 kernel: DMI: Memory slots populated: 1/1
Mar 2 12:51:57.933828 kernel: Hypervisor detected: KVM
Mar 2 12:51:57.933839 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 2 12:51:57.933856 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 2 12:51:57.933867 kernel: kvm-clock: using sched offset of 7896582720 cycles
Mar 2 12:51:57.933879 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 2 12:51:57.933890 kernel: tsc: Detected 2799.998 MHz processor
Mar 2 12:51:57.933901 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 2 12:51:57.933913 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 2 12:51:57.933924 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 2 12:51:57.933935 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 2 12:51:57.933947 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 2 12:51:57.933963 kernel: Using GB pages for direct mapping
Mar 2 12:51:57.933974 kernel: ACPI: Early table checksum verification disabled
Mar 2 12:51:57.933985 kernel: ACPI: RSDP 0x00000000000F59E0 000014 (v00 BOCHS )
Mar 2 12:51:57.933996 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934007 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934018 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934029 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Mar 2 12:51:57.934040 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934055 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934077 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934089 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 2 12:51:57.934100 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Mar 2 12:51:57.934118 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Mar 2 12:51:57.934129 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Mar 2 12:51:57.934141 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Mar 2 12:51:57.934157 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Mar 2 12:51:57.934169 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Mar 2 12:51:57.934180 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Mar 2 12:51:57.934192 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 2 12:51:57.934203 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 2 12:51:57.934215 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Mar 2 12:51:57.934227 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff]
Mar 2 12:51:57.934238 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff]
Mar 2 12:51:57.934255 kernel: Zone ranges:
Mar 2 12:51:57.934266 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 2 12:51:57.934278 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Mar 2 12:51:57.934290 kernel: Normal empty
Mar 2 12:51:57.934301 kernel: Device empty
Mar 2 12:51:57.934312 kernel: Movable zone start for each node
Mar 2 12:51:57.934324 kernel: Early memory node ranges
Mar 2 12:51:57.934335 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 2 12:51:57.934347 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Mar 2 12:51:57.934371 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Mar 2 12:51:57.934399 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 2 12:51:57.934421 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 2 12:51:57.934434 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Mar 2 12:51:57.934446 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 2 12:51:57.934461 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 2 12:51:57.934474 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 2 12:51:57.934486 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 2 12:51:57.934498 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 2 12:51:57.934509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 2 12:51:57.934527 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 2 12:51:57.934539 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 2 12:51:57.934559 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 2 12:51:57.934572 kernel: TSC deadline timer available
Mar 2 12:51:57.934584 kernel: CPU topo: Max. logical packages: 16
Mar 2 12:51:57.934595 kernel: CPU topo: Max. logical dies: 16
Mar 2 12:51:57.934607 kernel: CPU topo: Max. dies per package: 1
Mar 2 12:51:57.934618 kernel: CPU topo: Max. threads per core: 1
Mar 2 12:51:57.934629 kernel: CPU topo: Num. cores per package: 1
Mar 2 12:51:57.934646 kernel: CPU topo: Num. threads per package: 1
Mar 2 12:51:57.934658 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs
Mar 2 12:51:57.934669 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 2 12:51:57.934681 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 2 12:51:57.934692 kernel: Booting paravirtualized kernel on KVM
Mar 2 12:51:57.934704 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 2 12:51:57.934715 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 2 12:51:57.934727 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Mar 2 12:51:57.934739 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Mar 2 12:51:57.934755 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 2 12:51:57.934766 kernel: kvm-guest: PV spinlocks enabled
Mar 2 12:51:57.934778 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 2 12:51:57.934791 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=82731586f036a8515942386c762f58de23efa7b4e7ecf4198e267e112154cbc2
Mar 2 12:51:57.934803 kernel: random: crng init done
Mar 2 12:51:57.934815 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 2 12:51:57.934826 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 2 12:51:57.934838 kernel: Fallback order for Node 0: 0
Mar 2 12:51:57.934854 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154
Mar 2 12:51:57.934866 kernel: Policy zone: DMA32
Mar 2 12:51:57.934877 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 2 12:51:57.934889 kernel: software IO TLB: area num 16.
Mar 2 12:51:57.934900 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 2 12:51:57.934912 kernel: Kernel/User page tables isolation: enabled
Mar 2 12:51:57.934923 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 2 12:51:57.934935 kernel: ftrace: allocated 157 pages with 5 groups
Mar 2 12:51:57.934946 kernel: Dynamic Preempt: voluntary
Mar 2 12:51:57.934962 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 2 12:51:57.934975 kernel: rcu: RCU event tracing is enabled.
Mar 2 12:51:57.934987 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 2 12:51:57.934999 kernel: Trampoline variant of Tasks RCU enabled.
Mar 2 12:51:57.935018 kernel: Rude variant of Tasks RCU enabled.
Mar 2 12:51:57.935032 kernel: Tracing variant of Tasks RCU enabled.
Mar 2 12:51:57.935043 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 2 12:51:57.935055 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 2 12:51:57.935067 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 2 12:51:57.935084 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 2 12:51:57.935096 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 2 12:51:57.935108 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Mar 2 12:51:57.935120 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 2 12:51:57.935143 kernel: Console: colour VGA+ 80x25
Mar 2 12:51:57.935159 kernel: printk: legacy console [tty0] enabled
Mar 2 12:51:57.935171 kernel: printk: legacy console [ttyS0] enabled
Mar 2 12:51:57.935183 kernel: ACPI: Core revision 20240827
Mar 2 12:51:57.935200 kernel: APIC: Switch to symmetric I/O mode setup
Mar 2 12:51:57.935213 kernel: x2apic enabled
Mar 2 12:51:57.935226 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 2 12:51:57.935243 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Mar 2 12:51:57.935256 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Mar 2 12:51:57.935268 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 2 12:51:57.935281 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 2 12:51:57.935293 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 2 12:51:57.935305 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 2 12:51:57.935321 kernel: Spectre V2 : Mitigation: Retpolines
Mar 2 12:51:57.935333 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 2 12:51:57.935346 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Mar 2 12:51:57.935358 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 2 12:51:57.935370 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 2 12:51:57.935382 kernel: MDS: Mitigation: Clear CPU buffers
Mar 2 12:51:57.935718 kernel: MMIO Stale Data: Unknown: No mitigations
Mar 2 12:51:57.935732 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 2 12:51:57.935744 kernel: active return thunk: its_return_thunk
Mar 2 12:51:57.935756 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 2 12:51:57.935769 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 2 12:51:57.935788 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 2 12:51:57.935800 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 2 12:51:57.935812 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 2 12:51:57.935825 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Mar 2 12:51:57.935837 kernel: Freeing SMP alternatives memory: 32K
Mar 2 12:51:57.935849 kernel: pid_max: default: 32768 minimum: 301
Mar 2 12:51:57.935861 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 2 12:51:57.935873 kernel: landlock: Up and running.
Mar 2 12:51:57.935885 kernel: SELinux: Initializing.
Mar 2 12:51:57.935898 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 2 12:51:57.935910 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 2 12:51:57.935927 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Mar 2 12:51:57.935940 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Mar 2 12:51:57.935952 kernel: signal: max sigframe size: 1776
Mar 2 12:51:57.935974 kernel: rcu: Hierarchical SRCU implementation.
Mar 2 12:51:57.935989 kernel: rcu: Max phase no-delay instances is 400.
Mar 2 12:51:57.936001 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Mar 2 12:51:57.936014 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 2 12:51:57.936026 kernel: smp: Bringing up secondary CPUs ...
Mar 2 12:51:57.936038 kernel: smpboot: x86: Booting SMP configuration:
Mar 2 12:51:57.936056 kernel: .... node #0, CPUs: #1
Mar 2 12:51:57.936068 kernel: smp: Brought up 1 node, 2 CPUs
Mar 2 12:51:57.936081 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
Mar 2 12:51:57.936094 kernel: Memory: 1887476K/2096616K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46192K init, 2568K bss, 203124K reserved, 0K cma-reserved)
Mar 2 12:51:57.936107 kernel: devtmpfs: initialized
Mar 2 12:51:57.936119 kernel: x86/mm: Memory block size: 128MB
Mar 2 12:51:57.936131 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 2 12:51:57.936144 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 2 12:51:57.936156 kernel: pinctrl core: initialized pinctrl subsystem
Mar 2 12:51:57.936173 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 2 12:51:57.936186 kernel: audit: initializing netlink subsys (disabled)
Mar 2 12:51:57.936198 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 2 12:51:57.936210 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 2 12:51:57.936222 kernel: cpuidle: using governor menu
Mar 2 12:51:57.936235 kernel: audit: type=2000 audit(1772455914.330:1): state=initialized audit_enabled=0 res=1
Mar 2 12:51:57.936247 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 2 12:51:57.936259 kernel: dca service started, version 1.12.1
Mar 2 12:51:57.936271 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Mar 2 12:51:57.936294 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 2 12:51:57.936307 kernel: PCI: Using configuration type 1 for base access
Mar 2 12:51:57.936319 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 2 12:51:57.936332 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 2 12:51:57.936344 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 2 12:51:57.936356 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 2 12:51:57.936368 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 2 12:51:57.936380 kernel: ACPI: Added _OSI(Module Device)
Mar 2 12:51:57.936407 kernel: ACPI: Added _OSI(Processor Device)
Mar 2 12:51:57.936427 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 2 12:51:57.936439 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 2 12:51:57.936451 kernel: ACPI: Interpreter enabled
Mar 2 12:51:57.936463 kernel: ACPI: PM: (supports S0 S5)
Mar 2 12:51:57.936475 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 2 12:51:57.936488 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 2 12:51:57.936500 kernel: PCI: Using E820 reservations for host bridge windows
Mar 2 12:51:57.936512 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 2 12:51:57.936524 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 2 12:51:57.936814 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 2 12:51:57.936989 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 2 12:51:57.937156 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 2 12:51:57.937175 kernel: PCI host bridge to bus 0000:00
Mar 2 12:51:57.937355 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 2 12:51:57.938574 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 2 12:51:57.938744 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 2 12:51:57.938899 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Mar 2 12:51:57.939050 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 2 12:51:57.939202 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Mar 2 12:51:57.939352 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 2 12:51:57.940622 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 2 12:51:57.940826 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint
Mar 2 12:51:57.941004 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref]
Mar 2 12:51:57.941170 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff]
Mar 2 12:51:57.941333 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref]
Mar 2 12:51:57.942569 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 2 12:51:57.942781 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.942962 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff]
Mar 2 12:51:57.943162 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 2 12:51:57.943336 kernel: pci 0000:00:02.0: bridge window [io 0xc000-0xcfff]
Mar 2 12:51:57.943524 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 2 12:51:57.943706 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 2 12:51:57.943882 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.944048 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff]
Mar 2 12:51:57.944212 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 2 12:51:57.944375 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 2 12:51:57.948634 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 2 12:51:57.948855 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.949031 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff]
Mar 2 12:51:57.949200 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 2 12:51:57.949365 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 2 12:51:57.950614 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 2 12:51:57.950816 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.950996 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff]
Mar 2 12:51:57.951163 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 2 12:51:57.951327 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 2 12:51:57.953548 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 2 12:51:57.953772 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.953955 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff]
Mar 2 12:51:57.954124 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 2 12:51:57.954302 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 2 12:51:57.954508 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 2 12:51:57.954717 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.954886 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff]
Mar 2 12:51:57.955054 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 2 12:51:57.955220 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 2 12:51:57.956416 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 2 12:51:57.956658 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.956843 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff]
Mar 2 12:51:57.957010 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 2 12:51:57.957190 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 2 12:51:57.957355 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 2 12:51:57.957576 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 2 12:51:57.957755 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff]
Mar 2 12:51:57.957919 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 2 12:51:57.958083 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 2 12:51:57.958248 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 2 12:51:57.961978 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Mar 2 12:51:57.962163 kernel: pci 0000:00:03.0: BAR 0 [io 0xd0c0-0xd0df]
Mar 2 12:51:57.962337 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff]
Mar 2 12:51:57.962529 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 2 12:51:57.962720 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref]
Mar 2 12:51:57.962907 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Mar 2 12:51:57.963075 kernel: pci 0000:00:04.0: BAR 0 [io 0xd000-0xd07f]
Mar 2 12:51:57.963240 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff]
Mar 2 12:51:57.963847 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfd004000-0xfd007fff 64bit pref]
Mar 2 12:51:57.964057 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 2 12:51:57.964232 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 2 12:51:57.964467 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 2 12:51:57.964655 kernel: pci 0000:00:1f.2: BAR 4 [io 0xd0e0-0xd0ff]
Mar 2 12:51:57.964821 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff]
Mar 2 12:51:57.965004 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 2 12:51:57.965169 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Mar 2 12:51:57.965359 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Mar 2 12:51:57.967944 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit]
Mar 2 12:51:57.968126 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 2 12:51:57.968303 kernel: pci 0000:01:00.0: bridge window [io 0xc000-0xcfff]
Mar 2 12:51:57.969536 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 2 12:51:57.969731 kernel: pci 0000:01:00.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 2 12:51:57.969903 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 2 12:51:57.970107 kernel: pci_bus 0000:02: extended config space not accessible
Mar 2 12:51:57.970332 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint
Mar 2 12:51:57.971094 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f]
Mar 2 12:51:57.971277 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 2 12:51:57.972516 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 2 12:51:57.972715 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit]
Mar 2 12:51:57.972887 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 2 12:51:57.973100 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 2 12:51:57.973286 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 2 12:51:57.974075 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 2 12:51:57.974254 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 2 12:51:57.976467 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 2 12:51:57.976672 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 2 12:51:57.976850 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 2 12:51:57.977033 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 2 12:51:57.977055 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 2 12:51:57.977082 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 2 12:51:57.977097 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 2 12:51:57.977110 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 2 12:51:57.977122 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 2 12:51:57.977135 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 2 12:51:57.977148 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 2 12:51:57.977160 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 2 12:51:57.977180 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 2 12:51:57.977193 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 2 12:51:57.977205 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 2 12:51:57.977217 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 2 12:51:57.977230 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 2 12:51:57.977243 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 2 12:51:57.977256 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 2 12:51:57.977268 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 2 12:51:57.977281 kernel: iommu: Default domain type: Translated
Mar 2 12:51:57.977299 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 2 12:51:57.977311 kernel: PCI: Using ACPI for IRQ routing
Mar 2 12:51:57.977324 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 2 12:51:57.977336 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 2 12:51:57.977349 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Mar 2 12:51:57.977559 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 2 12:51:57.977732 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 2 12:51:57.977911 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 2 12:51:57.977932 kernel: vgaarb: loaded
Mar 2 12:51:57.977953 kernel: clocksource: Switched to clocksource kvm-clock
Mar 2 12:51:57.977966 kernel: VFS: Disk quotas dquot_6.6.0
Mar 2 12:51:57.977978 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 2 12:51:57.977991 kernel: pnp: PnP ACPI init
Mar 2 12:51:57.978191 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 2 12:51:57.978213 kernel: pnp: PnP ACPI: found 5 devices
Mar 2 12:51:57.978226 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 2 12:51:57.978239 kernel: NET: Registered PF_INET protocol family
Mar 2 12:51:57.978251 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 2 12:51:57.978271 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 2 12:51:57.978284 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 2 12:51:57.978297 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 2 12:51:57.978309 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 2 12:51:57.978322 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 2 12:51:57.978335 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 2 12:51:57.978348 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 2 12:51:57.978360 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 2 12:51:57.978378 kernel: NET: Registered PF_XDP protocol family
Mar 2 12:51:57.980598 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 2 12:51:57.980778 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 2 12:51:57.980950 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 2 12:51:57.981119 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 2 12:51:57.981304 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 2 12:51:57.981495 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 2 12:51:57.981675 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 2 12:51:57.981851 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]: assigned
Mar 2 12:51:57.982016 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]: assigned
Mar 2 12:51:57.982181 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]: assigned
Mar 2 12:51:57.982345 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]: assigned
Mar 2 12:51:57.984568 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]: assigned
Mar 2 12:51:57.984747 kernel: pci 0000:00:02.6: bridge window [io 0x6000-0x6fff]: assigned
Mar 2 12:51:57.984919 kernel: pci 0000:00:02.7: bridge window [io 0x7000-0x7fff]: assigned
Mar 2 12:51:57.985124 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 2 12:51:57.985321 kernel: pci 0000:01:00.0: bridge window [io 0xc000-0xcfff]
Mar 2 12:51:57.985585 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 2 12:51:57.985760 kernel: pci 0000:01:00.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 2 12:51:57.985928 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 2 12:51:57.986093 kernel: pci 0000:00:02.0: bridge window [io 0xc000-0xcfff]
Mar 2 12:51:57.986258 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 2 12:51:57.986447 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 2 12:51:57.986629 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 2 12:51:57.986794 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff]
Mar 2 12:51:57.986958 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 2 12:51:57.987132 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 2 12:51:57.987310 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 2 12:51:57.989517 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff]
Mar 2 12:51:57.989707 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 2 12:51:57.989877 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 2 12:51:57.990047 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 2 12:51:57.990224 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff]
Mar 2 12:51:57.990407 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 2 12:51:57.990592 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 2 12:51:57.990759 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 2 12:51:57.990924 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff]
Mar 2 12:51:57.991098 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 2 12:51:57.991263 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 2 12:51:57.993463 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 2 12:51:57.993658 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff]
Mar 2 12:51:57.993831 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 2 12:51:57.994001 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 2 12:51:57.994171 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 2 12:51:57.994337 kernel: pci 0000:00:02.6: bridge window [io 0x6000-0x6fff]
Mar 2 12:51:57.994526 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 2 12:51:57.994715 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 2 12:51:57.994882 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 2 12:51:57.995047 kernel: pci 0000:00:02.7: bridge window [io 0x7000-0x7fff]
Mar 2 12:51:57.995210 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 2 12:51:57.995373 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 2 12:51:58.001529 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 2 12:51:58.001705 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 2 12:51:58.001857 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 2 12:51:58.002008 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Mar 2 12:51:58.002168 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 2 12:51:58.002318 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Mar 2 12:51:58.002514 kernel: pci_bus 0000:01: resource 0 [io 0xc000-0xcfff]
Mar 2 12:51:58.002720 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Mar 2 12:51:58.002881 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 2 12:51:58.003050 kernel: pci_bus 0000:02: resource 0 [io 0xc000-0xcfff]
Mar 2 12:51:58.003213 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 2 12:51:58.003398 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 2 12:51:58.003664 kernel: pci_bus 0000:03: resource 0 [io 0x1000-0x1fff]
Mar 2 12:51:58.003824 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 2 12:51:58.003986 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 2 12:51:58.004163 kernel: pci_bus 0000:04: resource 0 [io 0x2000-0x2fff]
Mar 2 12:51:58.004322 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 2 12:51:58.004498 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 2 12:51:58.004715 kernel: pci_bus 0000:05: resource 0 [io 0x3000-0x3fff]
Mar 2 12:51:58.004874 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 2 12:51:58.005031 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 2 12:51:58.005197 kernel: pci_bus 0000:06: resource 0 [io 0x4000-0x4fff]
Mar 2 12:51:58.005353 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 2 12:51:58.006577 kernel: pci_bus 0000:06: resource 2 [mem
0xfc600000-0xfc7fffff 64bit pref] Mar 2 12:51:58.006767 kernel: pci_bus 0000:07: resource 0 [io 0x5000-0x5fff] Mar 2 12:51:58.006935 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Mar 2 12:51:58.007092 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Mar 2 12:51:58.007267 kernel: pci_bus 0000:08: resource 0 [io 0x6000-0x6fff] Mar 2 12:51:58.007449 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Mar 2 12:51:58.007623 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Mar 2 12:51:58.007821 kernel: pci_bus 0000:09: resource 0 [io 0x7000-0x7fff] Mar 2 12:51:58.007989 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Mar 2 12:51:58.008146 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Mar 2 12:51:58.008168 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 2 12:51:58.008181 kernel: PCI: CLS 0 bytes, default 64 Mar 2 12:51:58.008195 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 2 12:51:58.008208 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Mar 2 12:51:58.008222 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 2 12:51:58.008242 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Mar 2 12:51:58.008255 kernel: Initialise system trusted keyrings Mar 2 12:51:58.008268 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 2 12:51:58.008281 kernel: Key type asymmetric registered Mar 2 12:51:58.008295 kernel: Asymmetric key parser 'x509' registered Mar 2 12:51:58.008307 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 2 12:51:58.008320 kernel: io scheduler mq-deadline registered Mar 2 12:51:58.008333 kernel: io scheduler kyber registered Mar 2 12:51:58.008347 kernel: io scheduler bfq registered Mar 2 12:51:58.011894 kernel: pcieport 0000:00:02.0: PME: 
Signaling with IRQ 24 Mar 2 12:51:58.012074 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 2 12:51:58.012243 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.012434 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 2 12:51:58.012616 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 2 12:51:58.012782 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.012957 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 2 12:51:58.013122 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 2 12:51:58.013285 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.013469 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 2 12:51:58.013648 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 2 12:51:58.013813 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.013986 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 2 12:51:58.014150 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 2 12:51:58.014313 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.015525 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 2 12:51:58.015715 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 2 12:51:58.015886 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.016062 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 2 
12:51:58.016230 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 2 12:51:58.018437 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.018630 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 2 12:51:58.018801 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 2 12:51:58.018970 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 2 12:51:58.018999 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 2 12:51:58.019018 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 2 12:51:58.019032 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 2 12:51:58.019045 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 2 12:51:58.019058 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 2 12:51:58.019076 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 2 12:51:58.019090 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 2 12:51:58.019103 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 2 12:51:58.019283 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 2 12:51:58.019312 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 2 12:51:58.019490 kernel: rtc_cmos 00:03: registered as rtc0 Mar 2 12:51:58.019663 kernel: rtc_cmos 00:03: setting system clock to 2026-03-02T12:51:57 UTC (1772455917) Mar 2 12:51:58.019820 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 2 12:51:58.019839 kernel: intel_pstate: CPU model not supported Mar 2 12:51:58.019853 kernel: NET: Registered PF_INET6 protocol family Mar 2 12:51:58.019866 kernel: Segment Routing with IPv6 Mar 2 12:51:58.019879 kernel: In-situ OAM (IOAM) with IPv6 Mar 2 12:51:58.019900 kernel: NET: Registered PF_PACKET protocol family Mar 2 
12:51:58.019913 kernel: Key type dns_resolver registered Mar 2 12:51:58.019926 kernel: IPI shorthand broadcast: enabled Mar 2 12:51:58.019939 kernel: sched_clock: Marking stable (3731004166, 213481225)->(4285963622, -341478231) Mar 2 12:51:58.019952 kernel: registered taskstats version 1 Mar 2 12:51:58.019965 kernel: Loading compiled-in X.509 certificates Mar 2 12:51:58.019978 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: ca052fea375a75b056ebd4154b64794dffb70b96' Mar 2 12:51:58.019991 kernel: Demotion targets for Node 0: null Mar 2 12:51:58.020004 kernel: Key type .fscrypt registered Mar 2 12:51:58.020021 kernel: Key type fscrypt-provisioning registered Mar 2 12:51:58.020034 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 2 12:51:58.020047 kernel: ima: Allocated hash algorithm: sha1 Mar 2 12:51:58.020060 kernel: ima: No architecture policies found Mar 2 12:51:58.020073 kernel: clk: Disabling unused clocks Mar 2 12:51:58.020086 kernel: Warning: unable to open an initial console. Mar 2 12:51:58.020100 kernel: Freeing unused kernel image (initmem) memory: 46192K Mar 2 12:51:58.020113 kernel: Write protecting the kernel read-only data: 40960k Mar 2 12:51:58.020126 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 2 12:51:58.020143 kernel: Run /init as init process Mar 2 12:51:58.020156 kernel: with arguments: Mar 2 12:51:58.020169 kernel: /init Mar 2 12:51:58.020182 kernel: with environment: Mar 2 12:51:58.020195 kernel: HOME=/ Mar 2 12:51:58.020207 kernel: TERM=linux Mar 2 12:51:58.020222 systemd[1]: Successfully made /usr/ read-only. 
Mar 2 12:51:58.020240 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 2 12:51:58.020260 systemd[1]: Detected virtualization kvm. Mar 2 12:51:58.020273 systemd[1]: Detected architecture x86-64. Mar 2 12:51:58.020287 systemd[1]: Running in initrd. Mar 2 12:51:58.020300 systemd[1]: No hostname configured, using default hostname. Mar 2 12:51:58.020314 systemd[1]: Hostname set to . Mar 2 12:51:58.020328 systemd[1]: Initializing machine ID from VM UUID. Mar 2 12:51:58.020341 systemd[1]: Queued start job for default target initrd.target. Mar 2 12:51:58.020355 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 2 12:51:58.020374 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 2 12:51:58.022410 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 2 12:51:58.022433 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 2 12:51:58.022448 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 2 12:51:58.022463 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 2 12:51:58.022479 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 2 12:51:58.022493 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 2 12:51:58.022515 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Mar 2 12:51:58.022529 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 2 12:51:58.022543 systemd[1]: Reached target paths.target - Path Units. Mar 2 12:51:58.022570 systemd[1]: Reached target slices.target - Slice Units. Mar 2 12:51:58.022584 systemd[1]: Reached target swap.target - Swaps. Mar 2 12:51:58.022598 systemd[1]: Reached target timers.target - Timer Units. Mar 2 12:51:58.022612 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 2 12:51:58.022626 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 2 12:51:58.022646 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 2 12:51:58.022660 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 2 12:51:58.022674 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 2 12:51:58.022688 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 2 12:51:58.022703 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 2 12:51:58.022717 systemd[1]: Reached target sockets.target - Socket Units. Mar 2 12:51:58.022731 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 2 12:51:58.022745 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 2 12:51:58.022759 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 2 12:51:58.022779 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 2 12:51:58.022793 systemd[1]: Starting systemd-fsck-usr.service... Mar 2 12:51:58.022807 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 2 12:51:58.022821 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 2 12:51:58.022835 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 2 12:51:58.022849 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 2 12:51:58.022915 systemd-journald[211]: Collecting audit messages is disabled. Mar 2 12:51:58.022950 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 2 12:51:58.022971 systemd[1]: Finished systemd-fsck-usr.service. Mar 2 12:51:58.022986 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 2 12:51:58.023000 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 2 12:51:58.023015 systemd-journald[211]: Journal started Mar 2 12:51:58.023040 systemd-journald[211]: Runtime Journal (/run/log/journal/83e28ea300e04bebb456ed59bf48c1f3) is 4.7M, max 37.8M, 33.1M free. Mar 2 12:51:57.971446 systemd-modules-load[212]: Inserted module 'overlay' Mar 2 12:51:58.051514 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 2 12:51:58.051565 kernel: Bridge firewalling registered Mar 2 12:51:58.051586 systemd[1]: Started systemd-journald.service - Journal Service. Mar 2 12:51:58.026215 systemd-modules-load[212]: Inserted module 'br_netfilter' Mar 2 12:51:58.052844 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 2 12:51:58.054112 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:51:58.057535 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 2 12:51:58.059546 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 2 12:51:58.064530 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Mar 2 12:51:58.080914 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 2 12:51:58.096922 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 2 12:51:58.099721 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 2 12:51:58.106038 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 2 12:51:58.106194 systemd-tmpfiles[229]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 2 12:51:58.108031 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 2 12:51:58.120591 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 2 12:51:58.124543 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 2 12:51:58.148787 dracut-cmdline[247]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=82731586f036a8515942386c762f58de23efa7b4e7ecf4198e267e112154cbc2 Mar 2 12:51:58.178121 systemd-resolved[249]: Positive Trust Anchors: Mar 2 12:51:58.178144 systemd-resolved[249]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 2 12:51:58.178186 systemd-resolved[249]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 2 12:51:58.186409 systemd-resolved[249]: Defaulting to hostname 'linux'. Mar 2 12:51:58.188205 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 2 12:51:58.192217 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 2 12:51:58.291419 kernel: SCSI subsystem initialized Mar 2 12:51:58.301425 kernel: Loading iSCSI transport class v2.0-870. Mar 2 12:51:58.315444 kernel: iscsi: registered transport (tcp) Mar 2 12:51:58.340801 kernel: iscsi: registered transport (qla4xxx) Mar 2 12:51:58.340926 kernel: QLogic iSCSI HBA Driver Mar 2 12:51:58.367652 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 2 12:51:58.388413 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 2 12:51:58.389987 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 2 12:51:58.455173 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 2 12:51:58.466034 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Mar 2 12:51:58.519441 kernel: raid6: sse2x4 gen() 14682 MB/s Mar 2 12:51:58.537433 kernel: raid6: sse2x2 gen() 10081 MB/s Mar 2 12:51:58.555894 kernel: raid6: sse2x1 gen() 10750 MB/s Mar 2 12:51:58.556037 kernel: raid6: using algorithm sse2x4 gen() 14682 MB/s Mar 2 12:51:58.574882 kernel: raid6: .... xor() 8242 MB/s, rmw enabled Mar 2 12:51:58.575006 kernel: raid6: using ssse3x2 recovery algorithm Mar 2 12:51:58.603435 kernel: xor: automatically using best checksumming function avx Mar 2 12:51:58.836650 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 2 12:51:58.844755 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 2 12:51:58.851872 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 2 12:51:58.886975 systemd-udevd[458]: Using default interface naming scheme 'v255'. Mar 2 12:51:58.895757 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 2 12:51:58.900194 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 2 12:51:58.931348 dracut-pre-trigger[469]: rd.md=0: removing MD RAID activation Mar 2 12:51:58.965977 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 2 12:51:58.973933 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 2 12:51:59.096270 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 2 12:51:59.100271 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Mar 2 12:51:59.221417 kernel: ACPI: bus type USB registered Mar 2 12:51:59.221493 kernel: usbcore: registered new interface driver usbfs Mar 2 12:51:59.231444 kernel: usbcore: registered new interface driver hub Mar 2 12:51:59.237408 kernel: usbcore: registered new device driver usb Mar 2 12:51:59.254424 kernel: cryptd: max_cpu_qlen set to 1000 Mar 2 12:51:59.257414 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Mar 2 12:51:59.278615 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Mar 2 12:51:59.283084 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 2 12:51:59.283890 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:51:59.286046 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 2 12:51:59.290098 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 2 12:51:59.294422 kernel: AES CTR mode by8 optimization enabled Mar 2 12:51:59.295246 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 2 12:51:59.346027 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Mar 2 12:51:59.346066 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 2 12:51:59.346094 kernel: GPT:17805311 != 125829119 Mar 2 12:51:59.346112 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 2 12:51:59.346218 kernel: GPT:17805311 != 125829119 Mar 2 12:51:59.346242 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 2 12:51:59.346347 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 2 12:51:59.404424 kernel: libata version 3.00 loaded. 
Mar 2 12:51:59.413450 kernel: ahci 0000:00:1f.2: version 3.0 Mar 2 12:51:59.413829 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 2 12:51:59.417738 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Mar 2 12:51:59.418064 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Mar 2 12:51:59.418274 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 2 12:51:59.425285 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 2 12:51:59.519315 kernel: scsi host0: ahci Mar 2 12:51:59.521282 kernel: scsi host1: ahci Mar 2 12:51:59.521623 kernel: scsi host2: ahci Mar 2 12:51:59.521937 kernel: scsi host3: ahci Mar 2 12:51:59.522176 kernel: scsi host4: ahci Mar 2 12:51:59.522455 kernel: scsi host5: ahci Mar 2 12:51:59.522719 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 lpm-pol 1 Mar 2 12:51:59.522748 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 lpm-pol 1 Mar 2 12:51:59.522772 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 lpm-pol 1 Mar 2 12:51:59.522805 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 lpm-pol 1 Mar 2 12:51:59.522830 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 lpm-pol 1 Mar 2 12:51:59.522848 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 lpm-pol 1 Mar 2 12:51:59.522178 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:51:59.536653 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 2 12:51:59.557897 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 2 12:51:59.561254 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. 
Mar 2 12:51:59.576645 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 2 12:51:59.580883 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 2 12:51:59.611764 disk-uuid[616]: Primary Header is updated. Mar 2 12:51:59.611764 disk-uuid[616]: Secondary Entries is updated. Mar 2 12:51:59.611764 disk-uuid[616]: Secondary Header is updated. Mar 2 12:51:59.626741 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 2 12:51:59.747055 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 2 12:51:59.747449 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 2 12:51:59.747476 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 2 12:51:59.751415 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 2 12:51:59.757233 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 2 12:51:59.757319 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 2 12:51:59.790484 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 2 12:51:59.800822 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Mar 2 12:51:59.811575 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 2 12:51:59.816418 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 2 12:51:59.822456 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Mar 2 12:51:59.822778 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Mar 2 12:51:59.825859 kernel: hub 1-0:1.0: USB hub found Mar 2 12:51:59.826200 kernel: hub 1-0:1.0: 4 ports detected Mar 2 12:51:59.829420 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 2 12:51:59.834536 kernel: hub 2-0:1.0: USB hub found Mar 2 12:51:59.835769 kernel: hub 2-0:1.0: 4 ports detected Mar 2 12:51:59.950172 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Mar 2 12:51:59.956606 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 2 12:51:59.957527 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 2 12:51:59.959215 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 2 12:51:59.964397 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 2 12:52:00.015108 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 2 12:52:00.070818 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 2 12:52:00.218492 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 2 12:52:00.226457 kernel: usbcore: registered new interface driver usbhid Mar 2 12:52:00.226547 kernel: usbhid: USB HID core driver Mar 2 12:52:00.236787 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Mar 2 12:52:00.237086 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Mar 2 12:52:00.657511 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 2 12:52:00.658452 disk-uuid[617]: The operation has completed successfully. Mar 2 12:52:00.775602 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 2 12:52:00.776059 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 2 12:52:00.807219 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 2 12:52:00.833658 sh[642]: Success Mar 2 12:52:00.868732 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 2 12:52:00.869327 kernel: device-mapper: uevent: version 1.0.3 Mar 2 12:52:00.869373 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 2 12:52:00.892925 kernel: device-mapper: verity: sha256 using shash "sha256-avx" Mar 2 12:52:01.016545 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 2 12:52:01.025529 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 2 12:52:01.049541 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 2 12:52:01.073492 kernel: BTRFS: device fsid 760529e6-8e55-47fc-ad5a-c1c1d184e50a devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (654) Mar 2 12:52:01.081680 kernel: BTRFS info (device dm-0): first mount of filesystem 760529e6-8e55-47fc-ad5a-c1c1d184e50a Mar 2 12:52:01.082129 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 2 12:52:01.099239 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 2 12:52:01.099357 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 2 12:52:01.105306 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 2 12:52:01.107160 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 2 12:52:01.111341 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 2 12:52:01.117911 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 2 12:52:01.120563 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Mar 2 12:52:01.178528 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (687) Mar 2 12:52:01.183050 kernel: BTRFS info (device vda6): first mount of filesystem 81b29f52-362f-4f57-bc73-813781f2dfeb Mar 2 12:52:01.185445 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 2 12:52:01.198480 kernel: BTRFS info (device vda6): turning on async discard Mar 2 12:52:01.198912 kernel: BTRFS info (device vda6): enabling free space tree Mar 2 12:52:01.210502 kernel: BTRFS info (device vda6): last unmount of filesystem 81b29f52-362f-4f57-bc73-813781f2dfeb Mar 2 12:52:01.219188 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 2 12:52:01.223665 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 2 12:52:01.351662 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 2 12:52:01.358083 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 2 12:52:01.449165 systemd-networkd[825]: lo: Link UP Mar 2 12:52:01.449208 systemd-networkd[825]: lo: Gained carrier Mar 2 12:52:01.452826 systemd-networkd[825]: Enumeration completed Mar 2 12:52:01.454545 systemd-networkd[825]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:52:01.454556 systemd-networkd[825]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 2 12:52:01.458418 systemd-networkd[825]: eth0: Link UP Mar 2 12:52:01.459214 systemd-networkd[825]: eth0: Gained carrier Mar 2 12:52:01.459238 systemd-networkd[825]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:52:01.481323 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 2 12:52:01.487036 systemd[1]: Reached target network.target - Network. 
Mar 2 12:52:01.562137 systemd-networkd[825]: eth0: DHCPv4 address 10.243.74.166/30, gateway 10.243.74.165 acquired from 10.243.74.165 Mar 2 12:52:01.716890 ignition[746]: Ignition 2.22.0 Mar 2 12:52:01.718443 ignition[746]: Stage: fetch-offline Mar 2 12:52:01.718644 ignition[746]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:01.718667 ignition[746]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:01.723760 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 2 12:52:01.718915 ignition[746]: parsed url from cmdline: "" Mar 2 12:52:01.718922 ignition[746]: no config URL provided Mar 2 12:52:01.718937 ignition[746]: reading system config file "/usr/lib/ignition/user.ign" Mar 2 12:52:01.718958 ignition[746]: no config at "/usr/lib/ignition/user.ign" Mar 2 12:52:01.718976 ignition[746]: failed to fetch config: resource requires networking Mar 2 12:52:01.720713 ignition[746]: Ignition finished successfully Mar 2 12:52:01.739883 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 2 12:52:01.815756 ignition[835]: Ignition 2.22.0 Mar 2 12:52:01.819607 ignition[835]: Stage: fetch Mar 2 12:52:01.820296 ignition[835]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:01.820323 ignition[835]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:01.823934 ignition[835]: parsed url from cmdline: "" Mar 2 12:52:01.823944 ignition[835]: no config URL provided Mar 2 12:52:01.823964 ignition[835]: reading system config file "/usr/lib/ignition/user.ign" Mar 2 12:52:01.823992 ignition[835]: no config at "/usr/lib/ignition/user.ign" Mar 2 12:52:01.824486 ignition[835]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 2 12:52:01.824597 ignition[835]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Mar 2 12:52:01.827625 ignition[835]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 2 12:52:01.861249 ignition[835]: GET result: OK Mar 2 12:52:01.861667 ignition[835]: parsing config with SHA512: 9b919809aa19a12ed4876de780cd459815cb9d138761cefd40af40936ce97b3d1b1e7f2a8c174f25bef427b7bc71efeaa9fd569528827fdabf45b9d902f3301f Mar 2 12:52:01.876024 unknown[835]: fetched base config from "system" Mar 2 12:52:01.879007 unknown[835]: fetched base config from "system" Mar 2 12:52:01.879637 ignition[835]: fetch: fetch complete Mar 2 12:52:01.879028 unknown[835]: fetched user config from "openstack" Mar 2 12:52:01.879648 ignition[835]: fetch: fetch passed Mar 2 12:52:01.879771 ignition[835]: Ignition finished successfully Mar 2 12:52:01.884008 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 2 12:52:01.888504 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 2 12:52:02.007547 ignition[841]: Ignition 2.22.0 Mar 2 12:52:02.007592 ignition[841]: Stage: kargs Mar 2 12:52:02.008129 ignition[841]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:02.016138 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 2 12:52:02.008156 ignition[841]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:02.009567 ignition[841]: kargs: kargs passed Mar 2 12:52:02.009705 ignition[841]: Ignition finished successfully Mar 2 12:52:02.020930 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 2 12:52:02.115417 ignition[847]: Ignition 2.22.0 Mar 2 12:52:02.115464 ignition[847]: Stage: disks Mar 2 12:52:02.115946 ignition[847]: no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:02.115975 ignition[847]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:02.117689 ignition[847]: disks: disks passed Mar 2 12:52:02.117818 ignition[847]: Ignition finished successfully Mar 2 12:52:02.121604 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Mar 2 12:52:02.126776 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 2 12:52:02.128959 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 2 12:52:02.129837 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 2 12:52:02.130499 systemd[1]: Reached target sysinit.target - System Initialization. Mar 2 12:52:02.131123 systemd[1]: Reached target basic.target - Basic System. Mar 2 12:52:02.139672 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 2 12:52:02.203522 systemd-fsck[855]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Mar 2 12:52:02.217355 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 2 12:52:02.222555 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 2 12:52:02.361418 kernel: EXT4-fs (vda9): mounted filesystem 9d55f1a4-66ad-43d6-b325-f6b8d2d08c3e r/w with ordered data mode. Quota mode: none. Mar 2 12:52:02.363126 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 2 12:52:02.364541 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 2 12:52:02.367610 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 2 12:52:02.370496 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 2 12:52:02.372665 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 2 12:52:02.375003 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 2 12:52:02.375791 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 2 12:52:02.375832 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 2 12:52:02.389661 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Mar 2 12:52:02.393632 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 2 12:52:02.406999 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (863) Mar 2 12:52:02.407095 kernel: BTRFS info (device vda6): first mount of filesystem 81b29f52-362f-4f57-bc73-813781f2dfeb Mar 2 12:52:02.407122 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 2 12:52:02.413742 kernel: BTRFS info (device vda6): turning on async discard Mar 2 12:52:02.413827 kernel: BTRFS info (device vda6): enabling free space tree Mar 2 12:52:02.419474 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 2 12:52:02.476464 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:02.483060 initrd-setup-root[891]: cut: /sysroot/etc/passwd: No such file or directory Mar 2 12:52:02.495416 initrd-setup-root[898]: cut: /sysroot/etc/group: No such file or directory Mar 2 12:52:02.505417 initrd-setup-root[905]: cut: /sysroot/etc/shadow: No such file or directory Mar 2 12:52:02.512825 initrd-setup-root[912]: cut: /sysroot/etc/gshadow: No such file or directory Mar 2 12:52:02.628608 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 2 12:52:02.632049 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 2 12:52:02.635412 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 2 12:52:02.651817 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 2 12:52:02.654620 kernel: BTRFS info (device vda6): last unmount of filesystem 81b29f52-362f-4f57-bc73-813781f2dfeb Mar 2 12:52:02.679459 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Mar 2 12:52:02.712559 ignition[981]: INFO : Ignition 2.22.0 Mar 2 12:52:02.712559 ignition[981]: INFO : Stage: mount Mar 2 12:52:02.714580 ignition[981]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:02.714580 ignition[981]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:02.714580 ignition[981]: INFO : mount: mount passed Mar 2 12:52:02.714580 ignition[981]: INFO : Ignition finished successfully Mar 2 12:52:02.716497 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 2 12:52:02.841819 systemd-networkd[825]: eth0: Gained IPv6LL Mar 2 12:52:03.515435 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:04.351255 systemd-networkd[825]: eth0: Ignoring DHCPv6 address 2a02:1348:17c:d2a9:24:19ff:fef3:4aa6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17c:d2a9:24:19ff:fef3:4aa6/64 assigned by NDisc. Mar 2 12:52:04.351267 systemd-networkd[825]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 2 12:52:05.525413 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:09.534415 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:09.541107 coreos-metadata[865]: Mar 02 12:52:09.541 WARN failed to locate config-drive, using the metadata service API instead Mar 2 12:52:09.564948 coreos-metadata[865]: Mar 02 12:52:09.564 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 2 12:52:09.586321 coreos-metadata[865]: Mar 02 12:52:09.586 INFO Fetch successful Mar 2 12:52:09.587353 coreos-metadata[865]: Mar 02 12:52:09.587 INFO wrote hostname srv-zvfam.gb1.brightbox.com to /sysroot/etc/hostname Mar 2 12:52:09.589569 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 2 12:52:09.589739 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. 
Mar 2 12:52:09.597981 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 2 12:52:09.623317 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 2 12:52:09.670724 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (996) Mar 2 12:52:09.678580 kernel: BTRFS info (device vda6): first mount of filesystem 81b29f52-362f-4f57-bc73-813781f2dfeb Mar 2 12:52:09.678681 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 2 12:52:09.689657 kernel: BTRFS info (device vda6): turning on async discard Mar 2 12:52:09.689769 kernel: BTRFS info (device vda6): enabling free space tree Mar 2 12:52:09.692642 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 2 12:52:09.741950 ignition[1014]: INFO : Ignition 2.22.0 Mar 2 12:52:09.741950 ignition[1014]: INFO : Stage: files Mar 2 12:52:09.743774 ignition[1014]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:09.743774 ignition[1014]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:09.751707 ignition[1014]: DEBUG : files: compiled without relabeling support, skipping Mar 2 12:52:09.751707 ignition[1014]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 2 12:52:09.751707 ignition[1014]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 2 12:52:09.754668 ignition[1014]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 2 12:52:09.754668 ignition[1014]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 2 12:52:09.756611 ignition[1014]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 2 12:52:09.755555 unknown[1014]: wrote ssh authorized keys file for user: core Mar 2 12:52:09.758662 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 2 
12:52:09.758662 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Mar 2 12:52:10.019437 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 2 12:52:10.318434 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 2 12:52:10.323249 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 2 12:52:10.333043 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 2 12:52:10.333043 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 2 12:52:10.333043 
ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 2 12:52:10.333043 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 2 12:52:10.333043 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 2 12:52:10.333043 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1 Mar 2 12:52:10.799813 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 2 12:52:14.557063 ignition[1014]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Mar 2 12:52:14.560918 ignition[1014]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 2 12:52:14.564996 ignition[1014]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 2 12:52:14.569800 ignition[1014]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 2 12:52:14.569800 ignition[1014]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 2 12:52:14.573451 ignition[1014]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 2 12:52:14.573451 ignition[1014]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 2 12:52:14.573451 ignition[1014]: INFO : files: createResultFile: createFiles: op(e): [started] writing file 
"/sysroot/etc/.ignition-result.json" Mar 2 12:52:14.573451 ignition[1014]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 2 12:52:14.573451 ignition[1014]: INFO : files: files passed Mar 2 12:52:14.573451 ignition[1014]: INFO : Ignition finished successfully Mar 2 12:52:14.577371 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 2 12:52:14.583371 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 2 12:52:14.585551 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 2 12:52:14.614950 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 2 12:52:14.615139 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 2 12:52:14.627585 initrd-setup-root-after-ignition[1043]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 2 12:52:14.629212 initrd-setup-root-after-ignition[1043]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 2 12:52:14.630229 initrd-setup-root-after-ignition[1047]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 2 12:52:14.632434 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 2 12:52:14.634780 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 2 12:52:14.637485 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 2 12:52:14.711073 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 2 12:52:14.711264 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 2 12:52:14.713056 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 2 12:52:14.714432 systemd[1]: Reached target initrd.target - Initrd Default Target. 
Mar 2 12:52:14.716042 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 2 12:52:14.717574 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 2 12:52:14.750278 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 2 12:52:14.753117 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 2 12:52:14.778611 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 2 12:52:14.779581 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 2 12:52:14.780372 systemd[1]: Stopped target timers.target - Timer Units. Mar 2 12:52:14.781064 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 2 12:52:14.781267 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 2 12:52:14.783626 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 2 12:52:14.784610 systemd[1]: Stopped target basic.target - Basic System. Mar 2 12:52:14.785873 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 2 12:52:14.787243 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 2 12:52:14.788814 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 2 12:52:14.790107 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Mar 2 12:52:14.791350 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 2 12:52:14.792907 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 2 12:52:14.794451 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 2 12:52:14.795919 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 2 12:52:14.797346 systemd[1]: Stopped target swap.target - Swaps. 
Mar 2 12:52:14.798621 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 2 12:52:14.798918 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 2 12:52:14.800683 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 2 12:52:14.801540 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 2 12:52:14.803077 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 2 12:52:14.803333 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 2 12:52:14.804463 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 2 12:52:14.804730 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 2 12:52:14.806346 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 2 12:52:14.806636 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 2 12:52:14.813773 systemd[1]: ignition-files.service: Deactivated successfully. Mar 2 12:52:14.814029 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 2 12:52:14.818111 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 2 12:52:14.819751 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 2 12:52:14.820020 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 2 12:52:14.825595 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 2 12:52:14.827737 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 2 12:52:14.828575 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 2 12:52:14.830824 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 2 12:52:14.831060 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 2 12:52:14.846984 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Mar 2 12:52:14.847201 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 2 12:52:14.863270 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 2 12:52:14.868270 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 2 12:52:14.868433 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 2 12:52:14.883853 ignition[1067]: INFO : Ignition 2.22.0 Mar 2 12:52:14.883853 ignition[1067]: INFO : Stage: umount Mar 2 12:52:14.885762 ignition[1067]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 2 12:52:14.885762 ignition[1067]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 2 12:52:14.887461 ignition[1067]: INFO : umount: umount passed Mar 2 12:52:14.887461 ignition[1067]: INFO : Ignition finished successfully Mar 2 12:52:14.889223 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 2 12:52:14.889424 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 2 12:52:14.890814 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 2 12:52:14.891028 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 2 12:52:14.892522 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 2 12:52:14.892596 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 2 12:52:14.893819 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 2 12:52:14.893888 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 2 12:52:14.895130 systemd[1]: Stopped target network.target - Network. Mar 2 12:52:14.897056 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 2 12:52:14.897158 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 2 12:52:14.898549 systemd[1]: Stopped target paths.target - Path Units. Mar 2 12:52:14.899765 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
Mar 2 12:52:14.903562 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 2 12:52:14.904487 systemd[1]: Stopped target slices.target - Slice Units. Mar 2 12:52:14.906728 systemd[1]: Stopped target sockets.target - Socket Units. Mar 2 12:52:14.907483 systemd[1]: iscsid.socket: Deactivated successfully. Mar 2 12:52:14.907565 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 2 12:52:14.908767 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 2 12:52:14.908827 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 2 12:52:14.910073 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 2 12:52:14.910177 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 2 12:52:14.911482 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 2 12:52:14.911548 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 2 12:52:14.912754 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 2 12:52:14.912867 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 2 12:52:14.914405 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 2 12:52:14.916285 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 2 12:52:14.920616 systemd-networkd[825]: eth0: DHCPv6 lease lost Mar 2 12:52:14.925817 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 2 12:52:14.926049 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 2 12:52:14.932935 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 2 12:52:14.933436 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 2 12:52:14.933847 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 2 12:52:14.936461 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
Mar 2 12:52:14.937515 systemd[1]: Stopped target network-pre.target - Preparation for Network. Mar 2 12:52:14.938248 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 2 12:52:14.938321 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 2 12:52:14.940073 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 2 12:52:14.941709 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 2 12:52:14.941781 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 2 12:52:14.943915 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 2 12:52:14.943981 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 2 12:52:14.947543 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 2 12:52:14.947632 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 2 12:52:14.950490 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 2 12:52:14.950559 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 2 12:52:14.952066 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 2 12:52:14.954486 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 2 12:52:14.954594 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 2 12:52:14.963213 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 2 12:52:14.964526 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 2 12:52:14.966961 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 2 12:52:14.967086 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 2 12:52:14.969486 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. 
Mar 2 12:52:14.969545 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 2 12:52:14.971526 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 2 12:52:14.971604 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 2 12:52:14.973441 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 2 12:52:14.973511 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 2 12:52:14.975635 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 2 12:52:14.975711 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 2 12:52:14.978803 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 2 12:52:14.980714 systemd[1]: systemd-network-generator.service: Deactivated successfully. Mar 2 12:52:14.980793 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Mar 2 12:52:14.983523 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 2 12:52:14.983591 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 2 12:52:14.990324 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 2 12:52:14.990457 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:52:14.993211 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Mar 2 12:52:14.993302 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 2 12:52:14.993372 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 2 12:52:14.993979 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 2 12:52:14.999538 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Mar 2 12:52:15.009655 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 2 12:52:15.009836 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 2 12:52:15.011886 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 2 12:52:15.014110 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 2 12:52:15.040011 systemd[1]: Switching root. Mar 2 12:52:15.078001 systemd-journald[211]: Journal stopped Mar 2 12:52:16.713979 systemd-journald[211]: Received SIGTERM from PID 1 (systemd). Mar 2 12:52:16.714201 kernel: SELinux: policy capability network_peer_controls=1 Mar 2 12:52:16.714261 kernel: SELinux: policy capability open_perms=1 Mar 2 12:52:16.714292 kernel: SELinux: policy capability extended_socket_class=1 Mar 2 12:52:16.714318 kernel: SELinux: policy capability always_check_network=0 Mar 2 12:52:16.714361 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 2 12:52:16.714407 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 2 12:52:16.714452 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 2 12:52:16.714486 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 2 12:52:16.714506 kernel: SELinux: policy capability userspace_initial_context=0 Mar 2 12:52:16.714538 kernel: audit: type=1403 audit(1772455935.356:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 2 12:52:16.714571 systemd[1]: Successfully loaded SELinux policy in 76.163ms. Mar 2 12:52:16.714621 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.601ms. 
Mar 2 12:52:16.714655 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 2 12:52:16.714685 systemd[1]: Detected virtualization kvm. Mar 2 12:52:16.714715 systemd[1]: Detected architecture x86-64. Mar 2 12:52:16.714753 systemd[1]: Detected first boot. Mar 2 12:52:16.714774 systemd[1]: Hostname set to . Mar 2 12:52:16.714793 systemd[1]: Initializing machine ID from VM UUID. Mar 2 12:52:16.714812 zram_generator::config[1111]: No configuration found. Mar 2 12:52:16.714843 kernel: Guest personality initialized and is inactive Mar 2 12:52:16.714863 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Mar 2 12:52:16.714881 kernel: Initialized host personality Mar 2 12:52:16.714900 kernel: NET: Registered PF_VSOCK protocol family Mar 2 12:52:16.714918 systemd[1]: Populated /etc with preset unit settings. Mar 2 12:52:16.714952 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 2 12:52:16.714973 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 2 12:52:16.715000 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 2 12:52:16.715021 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 2 12:52:16.715040 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 2 12:52:16.715060 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 2 12:52:16.715079 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 2 12:52:16.715098 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. 
Mar 2 12:52:16.715200 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 2 12:52:16.715237 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 2 12:52:16.715258 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 2 12:52:16.715278 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 2 12:52:16.715297 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 2 12:52:16.715316 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 2 12:52:16.715350 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 2 12:52:16.715372 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 2 12:52:16.717444 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 2 12:52:16.717473 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 2 12:52:16.717505 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 2 12:52:16.717527 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 2 12:52:16.717562 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 2 12:52:16.717584 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 2 12:52:16.717603 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 2 12:52:16.717632 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 2 12:52:16.717661 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 2 12:52:16.717682 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 2 12:52:16.717709 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 2 12:52:16.717731 systemd[1]: Reached target slices.target - Slice Units.
Mar 2 12:52:16.717750 systemd[1]: Reached target swap.target - Swaps.
Mar 2 12:52:16.717789 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 2 12:52:16.717811 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 2 12:52:16.717840 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 2 12:52:16.717860 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 2 12:52:16.717880 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 2 12:52:16.717899 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 2 12:52:16.717918 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 2 12:52:16.717937 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 2 12:52:16.717971 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 2 12:52:16.718011 systemd[1]: Mounting media.mount - External Media Directory...
Mar 2 12:52:16.718034 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:16.718054 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 2 12:52:16.718072 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 2 12:52:16.718091 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 2 12:52:16.718124 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 2 12:52:16.718146 systemd[1]: Reached target machines.target - Containers.
Mar 2 12:52:16.718167 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 2 12:52:16.718202 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 2 12:52:16.718244 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 2 12:52:16.718264 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 2 12:52:16.718292 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 2 12:52:16.718314 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 2 12:52:16.718333 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 2 12:52:16.718352 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 2 12:52:16.718371 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 2 12:52:16.718406 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 2 12:52:16.718443 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 2 12:52:16.718464 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 2 12:52:16.718484 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 2 12:52:16.718511 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 2 12:52:16.718533 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 2 12:52:16.718552 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 2 12:52:16.718572 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 2 12:52:16.718591 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 2 12:52:16.718611 kernel: fuse: init (API version 7.41)
Mar 2 12:52:16.718644 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 2 12:52:16.718685 kernel: loop: module loaded
Mar 2 12:52:16.718719 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 2 12:52:16.718740 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 2 12:52:16.718779 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 2 12:52:16.718799 systemd[1]: Stopped verity-setup.service.
Mar 2 12:52:16.718819 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:16.718838 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 2 12:52:16.718856 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 2 12:52:16.718891 systemd[1]: Mounted media.mount - External Media Directory.
Mar 2 12:52:16.718912 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 2 12:52:16.718931 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 2 12:52:16.718950 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 2 12:52:16.718969 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 2 12:52:16.718988 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 2 12:52:16.719007 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 2 12:52:16.719026 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 2 12:52:16.719059 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 2 12:52:16.719080 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 2 12:52:16.719107 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 2 12:52:16.719129 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 2 12:52:16.719149 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 2 12:52:16.719167 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 2 12:52:16.719186 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 2 12:52:16.719205 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 2 12:52:16.719238 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 2 12:52:16.719273 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 2 12:52:16.719295 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 2 12:52:16.719323 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 2 12:52:16.719352 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 2 12:52:16.722422 kernel: ACPI: bus type drm_connector registered
Mar 2 12:52:16.722460 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 2 12:52:16.722482 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 2 12:52:16.722515 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 2 12:52:16.722537 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 2 12:52:16.722618 systemd-journald[1198]: Collecting audit messages is disabled.
Mar 2 12:52:16.722686 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 2 12:52:16.722710 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 2 12:52:16.722730 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 2 12:52:16.722749 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 2 12:52:16.722769 systemd-journald[1198]: Journal started
Mar 2 12:52:16.722816 systemd-journald[1198]: Runtime Journal (/run/log/journal/83e28ea300e04bebb456ed59bf48c1f3) is 4.7M, max 37.8M, 33.1M free.
Mar 2 12:52:16.202540 systemd[1]: Queued start job for default target multi-user.target.
Mar 2 12:52:16.228203 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 2 12:52:16.229021 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 2 12:52:16.732458 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 2 12:52:16.746512 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 2 12:52:16.746605 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 2 12:52:16.755415 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 2 12:52:16.759960 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 2 12:52:16.765999 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 2 12:52:16.767122 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 2 12:52:16.767607 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 2 12:52:16.768830 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 2 12:52:16.770727 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 2 12:52:16.810640 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 2 12:52:16.814894 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 2 12:52:16.828453 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 2 12:52:16.830110 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 2 12:52:16.849612 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 2 12:52:16.854554 systemd-journald[1198]: Time spent on flushing to /var/log/journal/83e28ea300e04bebb456ed59bf48c1f3 is 29.048ms for 1170 entries.
Mar 2 12:52:16.854554 systemd-journald[1198]: System Journal (/var/log/journal/83e28ea300e04bebb456ed59bf48c1f3) is 8M, max 584.8M, 576.8M free.
Mar 2 12:52:16.895351 systemd-journald[1198]: Received client request to flush runtime journal.
Mar 2 12:52:16.895430 kernel: loop0: detected capacity change from 0 to 228704
Mar 2 12:52:16.863540 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 2 12:52:16.900078 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 2 12:52:16.930569 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 2 12:52:16.933962 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 2 12:52:16.944431 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 2 12:52:16.943496 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 2 12:52:16.976638 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 2 12:52:16.987416 kernel: loop1: detected capacity change from 0 to 110984
Mar 2 12:52:17.025296 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
Mar 2 12:52:17.025324 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
Mar 2 12:52:17.036521 kernel: loop2: detected capacity change from 0 to 128560
Mar 2 12:52:17.037181 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 2 12:52:17.078488 kernel: loop3: detected capacity change from 0 to 8
Mar 2 12:52:17.127423 kernel: loop4: detected capacity change from 0 to 228704
Mar 2 12:52:17.190479 kernel: loop5: detected capacity change from 0 to 110984
Mar 2 12:52:17.230456 kernel: loop6: detected capacity change from 0 to 128560
Mar 2 12:52:17.231487 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 2 12:52:17.281278 kernel: loop7: detected capacity change from 0 to 8
Mar 2 12:52:17.289742 (sd-merge)[1275]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 2 12:52:17.295868 (sd-merge)[1275]: Merged extensions into '/usr'.
Mar 2 12:52:17.302574 systemd[1]: Reload requested from client PID 1230 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 2 12:52:17.302612 systemd[1]: Reloading...
Mar 2 12:52:17.463429 zram_generator::config[1298]: No configuration found.
Mar 2 12:52:17.860430 ldconfig[1223]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 2 12:52:17.930878 systemd[1]: Reloading finished in 627 ms.
Mar 2 12:52:17.957519 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 2 12:52:17.965241 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 2 12:52:17.980654 systemd[1]: Starting ensure-sysext.service...
Mar 2 12:52:17.984525 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 2 12:52:18.041852 systemd[1]: Reload requested from client PID 1357 ('systemctl') (unit ensure-sysext.service)...
Mar 2 12:52:18.042083 systemd[1]: Reloading...
Mar 2 12:52:18.042590 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 2 12:52:18.042669 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 2 12:52:18.043138 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 2 12:52:18.045632 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 2 12:52:18.047110 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 2 12:52:18.049528 systemd-tmpfiles[1358]: ACLs are not supported, ignoring.
Mar 2 12:52:18.049644 systemd-tmpfiles[1358]: ACLs are not supported, ignoring.
Mar 2 12:52:18.059857 systemd-tmpfiles[1358]: Detected autofs mount point /boot during canonicalization of boot.
Mar 2 12:52:18.059876 systemd-tmpfiles[1358]: Skipping /boot
Mar 2 12:52:18.086178 systemd-tmpfiles[1358]: Detected autofs mount point /boot during canonicalization of boot.
Mar 2 12:52:18.086196 systemd-tmpfiles[1358]: Skipping /boot
Mar 2 12:52:18.151429 zram_generator::config[1384]: No configuration found.
Mar 2 12:52:18.421832 systemd[1]: Reloading finished in 379 ms.
Mar 2 12:52:18.437739 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 2 12:52:18.453495 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 2 12:52:18.465728 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 2 12:52:18.470737 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 2 12:52:18.483414 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 2 12:52:18.488559 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 2 12:52:18.493948 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 2 12:52:18.500734 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 2 12:52:18.507238 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:18.507532 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 2 12:52:18.512101 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 2 12:52:18.518608 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 2 12:52:18.529857 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 2 12:52:18.530962 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 2 12:52:18.531153 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 2 12:52:18.531300 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:18.538002 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:18.538287 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 2 12:52:18.538549 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 2 12:52:18.538696 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 2 12:52:18.543810 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 2 12:52:18.544977 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:18.554884 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 2 12:52:18.562547 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:18.562894 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 2 12:52:18.572334 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 2 12:52:18.573611 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 2 12:52:18.573675 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 2 12:52:18.573793 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 2 12:52:18.576616 systemd[1]: Finished ensure-sysext.service.
Mar 2 12:52:18.585866 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 2 12:52:18.607177 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 2 12:52:18.608113 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 2 12:52:18.613156 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 2 12:52:18.625308 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 2 12:52:18.635013 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 2 12:52:18.637144 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 2 12:52:18.638587 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 2 12:52:18.638899 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 2 12:52:18.640347 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 2 12:52:18.646347 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 2 12:52:18.647124 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 2 12:52:18.648865 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 2 12:52:18.653334 systemd-udevd[1447]: Using default interface naming scheme 'v255'.
Mar 2 12:52:18.671383 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 2 12:52:18.672335 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 2 12:52:18.680081 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 2 12:52:18.681026 augenrules[1485]: No rules
Mar 2 12:52:18.683256 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 2 12:52:18.683972 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 2 12:52:18.701281 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 2 12:52:18.704047 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 2 12:52:18.711374 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 2 12:52:18.968962 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 2 12:52:18.969932 systemd[1]: Reached target time-set.target - System Time Set.
Mar 2 12:52:18.991540 systemd-networkd[1501]: lo: Link UP
Mar 2 12:52:18.991554 systemd-networkd[1501]: lo: Gained carrier
Mar 2 12:52:18.992902 systemd-networkd[1501]: Enumeration completed
Mar 2 12:52:18.993487 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 2 12:52:18.997214 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 2 12:52:19.001656 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 2 12:52:19.017899 systemd-resolved[1446]: Positive Trust Anchors:
Mar 2 12:52:19.017914 systemd-resolved[1446]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 2 12:52:19.017957 systemd-resolved[1446]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 2 12:52:19.042019 systemd-resolved[1446]: Using system hostname 'srv-zvfam.gb1.brightbox.com'.
Mar 2 12:52:19.046217 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 2 12:52:19.047336 systemd[1]: Reached target network.target - Network.
Mar 2 12:52:19.047954 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 2 12:52:19.048687 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 2 12:52:19.049459 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 2 12:52:19.050215 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 2 12:52:19.050955 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Mar 2 12:52:19.051887 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 2 12:52:19.052975 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 2 12:52:19.053922 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 2 12:52:19.054771 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 2 12:52:19.054821 systemd[1]: Reached target paths.target - Path Units.
Mar 2 12:52:19.055507 systemd[1]: Reached target timers.target - Timer Units.
Mar 2 12:52:19.057532 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 2 12:52:19.060561 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 2 12:52:19.070660 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 2 12:52:19.071725 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 2 12:52:19.073578 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 2 12:52:19.084246 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 2 12:52:19.085977 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 2 12:52:19.089260 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 2 12:52:19.090271 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 2 12:52:19.098941 systemd[1]: Reached target sockets.target - Socket Units.
Mar 2 12:52:19.100334 systemd[1]: Reached target basic.target - Basic System.
Mar 2 12:52:19.101035 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 2 12:52:19.101100 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 2 12:52:19.103777 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 2 12:52:19.108677 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 2 12:52:19.115905 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 2 12:52:19.119447 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 2 12:52:19.127827 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 2 12:52:19.134234 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 2 12:52:19.135501 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 2 12:52:19.140556 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Mar 2 12:52:19.157426 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Mar 2 12:52:19.160378 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 2 12:52:19.164975 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 2 12:52:19.169639 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 2 12:52:19.175204 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 2 12:52:19.184654 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 2 12:52:19.188277 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 2 12:52:19.189207 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 2 12:52:19.198823 systemd[1]: Starting update-engine.service - Update Engine...
Mar 2 12:52:19.211945 jq[1540]: false
Mar 2 12:52:19.213576 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 2 12:52:19.227512 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 2 12:52:19.230122 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 2 12:52:19.230719 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 2 12:52:19.231135 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 2 12:52:19.254702 jq[1552]: true
Mar 2 12:52:19.291281 google_oslogin_nss_cache[1543]: oslogin_cache_refresh[1543]: Refreshing passwd entry cache
Mar 2 12:52:19.287910 oslogin_cache_refresh[1543]: Refreshing passwd entry cache
Mar 2 12:52:19.299166 systemd[1]: motdgen.service: Deactivated successfully.
Mar 2 12:52:19.300481 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 2 12:52:19.320082 google_oslogin_nss_cache[1543]: oslogin_cache_refresh[1543]: Failure getting users, quitting
Mar 2 12:52:19.320082 google_oslogin_nss_cache[1543]: oslogin_cache_refresh[1543]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 2 12:52:19.320082 google_oslogin_nss_cache[1543]: oslogin_cache_refresh[1543]: Refreshing group entry cache Mar 2 12:52:19.319541 oslogin_cache_refresh[1543]: Failure getting users, quitting Mar 2 12:52:19.319582 oslogin_cache_refresh[1543]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 2 12:52:19.319654 oslogin_cache_refresh[1543]: Refreshing group entry cache Mar 2 12:52:19.323678 google_oslogin_nss_cache[1543]: oslogin_cache_refresh[1543]: Failure getting groups, quitting Mar 2 12:52:19.323678 google_oslogin_nss_cache[1543]: oslogin_cache_refresh[1543]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 2 12:52:19.323671 oslogin_cache_refresh[1543]: Failure getting groups, quitting Mar 2 12:52:19.323689 oslogin_cache_refresh[1543]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 2 12:52:19.324944 (ntainerd)[1566]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 2 12:52:19.352447 update_engine[1551]: I20260302 12:52:19.343806 1551 main.cc:92] Flatcar Update Engine starting Mar 2 12:52:19.389695 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Mar 2 12:52:19.392174 dbus-daemon[1538]: [system] SELinux support is enabled Mar 2 12:52:19.392545 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 2 12:52:19.396416 kernel: ACPI: button: Power Button [PWRF] Mar 2 12:52:19.401439 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 2 12:52:19.408634 update_engine[1551]: I20260302 12:52:19.403733 1551 update_check_scheduler.cc:74] Next update check in 6m17s Mar 2 12:52:19.401486 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 2 12:52:19.402307 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 2 12:52:19.402346 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 2 12:52:19.403177 systemd[1]: Started update-engine.service - Update Engine. Mar 2 12:52:19.413299 jq[1565]: true Mar 2 12:52:19.456738 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 2 12:52:19.492924 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Mar 2 12:52:19.493281 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Mar 2 12:52:19.507140 extend-filesystems[1541]: Found /dev/vda6 Mar 2 12:52:19.501942 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 2 12:52:19.503481 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 2 12:52:19.522708 extend-filesystems[1541]: Found /dev/vda9 Mar 2 12:52:19.533203 tar[1564]: linux-amd64/LICENSE Mar 2 12:52:19.533203 tar[1564]: linux-amd64/helm Mar 2 12:52:19.534788 extend-filesystems[1541]: Checking size of /dev/vda9 Mar 2 12:52:19.576843 systemd-networkd[1501]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:52:19.579191 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 2 12:52:19.587163 systemd-networkd[1501]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 2 12:52:19.587484 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Mar 2 12:52:19.591781 systemd-networkd[1501]: eth0: Link UP Mar 2 12:52:19.592114 systemd-networkd[1501]: eth0: Gained carrier Mar 2 12:52:19.592150 systemd-networkd[1501]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 2 12:52:19.625844 systemd-logind[1549]: New seat seat0. Mar 2 12:52:19.628292 systemd[1]: Started systemd-logind.service - User Login Management. Mar 2 12:52:19.637417 kernel: mousedev: PS/2 mouse device common for all mice Mar 2 12:52:19.666421 extend-filesystems[1541]: Resized partition /dev/vda9 Mar 2 12:52:19.689403 extend-filesystems[1603]: resize2fs 1.47.3 (8-Jul-2025) Mar 2 12:52:19.687140 dbus-daemon[1538]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1501 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 2 12:52:19.697151 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 2 12:52:19.706465 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Mar 2 12:52:19.712608 systemd-networkd[1501]: eth0: DHCPv4 address 10.243.74.166/30, gateway 10.243.74.165 acquired from 10.243.74.165 Mar 2 12:52:19.714089 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection. Mar 2 12:52:19.739106 bash[1600]: Updated "/home/core/.ssh/authorized_keys" Mar 2 12:52:19.743120 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 2 12:52:19.751843 systemd[1]: Starting sshkeys.service... Mar 2 12:52:19.763617 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 2 12:52:19.851219 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Mar 2 12:52:19.855869 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 2 12:52:20.036415 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:20.104709 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 2 12:52:20.107416 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 2 12:52:20.116270 locksmithd[1573]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 2 12:52:20.153992 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 2 12:52:20.220309 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 2 12:52:20.225640 dbus-daemon[1538]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 2 12:52:20.231540 dbus-daemon[1538]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1605 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 2 12:52:20.243921 systemd[1]: Starting polkit.service - Authorization Manager... Mar 2 12:52:20.390779 containerd[1566]: time="2026-03-02T12:52:20Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 2 12:52:20.407341 containerd[1566]: time="2026-03-02T12:52:20.405568448Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 2 12:52:20.433418 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Mar 2 12:52:20.451467 extend-filesystems[1603]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 2 12:52:20.451467 extend-filesystems[1603]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 2 12:52:20.451467 extend-filesystems[1603]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. 
Mar 2 12:52:20.470480 extend-filesystems[1541]: Resized filesystem in /dev/vda9 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.455004443Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="21.1µs" Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.455067673Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.455114513Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.463873650Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.464040834Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.464204768Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.464378566Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 2 12:52:20.473460 containerd[1566]: time="2026-03-02T12:52:20.464432513Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 2 12:52:20.454068 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 2 12:52:20.454507 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Mar 2 12:52:20.480554 containerd[1566]: time="2026-03-02T12:52:20.479552394Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 2 12:52:20.480554 containerd[1566]: time="2026-03-02T12:52:20.479630877Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 2 12:52:20.480554 containerd[1566]: time="2026-03-02T12:52:20.479680306Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 2 12:52:20.480554 containerd[1566]: time="2026-03-02T12:52:20.479703891Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 2 12:52:20.480554 containerd[1566]: time="2026-03-02T12:52:20.479953524Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 2 12:52:20.482427 containerd[1566]: time="2026-03-02T12:52:20.481614813Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 2 12:52:20.482427 containerd[1566]: time="2026-03-02T12:52:20.481697780Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 2 12:52:20.482427 containerd[1566]: time="2026-03-02T12:52:20.481728936Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 2 12:52:20.482427 containerd[1566]: time="2026-03-02T12:52:20.481823876Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 2 12:52:20.483882 
containerd[1566]: time="2026-03-02T12:52:20.483851242Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 2 12:52:20.484771 containerd[1566]: time="2026-03-02T12:52:20.484740798Z" level=info msg="metadata content store policy set" policy=shared Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.495738187Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.495921139Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.495965166Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.495999581Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496076319Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496122457Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496190869Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496246998Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496279549Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496309847Z" 
level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496342470Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496412030Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496748245Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 2 12:52:20.497415 containerd[1566]: time="2026-03-02T12:52:20.496817398Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.496885859Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.496930202Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.496960706Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.496988323Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.497020745Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.497055630Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.497091170Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 
Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.497119904Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.497158459Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 2 12:52:20.497988 containerd[1566]: time="2026-03-02T12:52:20.497343913Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 2 12:52:20.504092 containerd[1566]: time="2026-03-02T12:52:20.500443956Z" level=info msg="Start snapshots syncer" Mar 2 12:52:20.504092 containerd[1566]: time="2026-03-02T12:52:20.500570345Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 2 12:52:20.504092 containerd[1566]: time="2026-03-02T12:52:20.501301460Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\
":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 2 12:52:20.504584 containerd[1566]: time="2026-03-02T12:52:20.501455244Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 2 12:52:20.512244 containerd[1566]: time="2026-03-02T12:52:20.512178118Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 2 12:52:20.512649 containerd[1566]: time="2026-03-02T12:52:20.512619555Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 2 12:52:20.512810 containerd[1566]: time="2026-03-02T12:52:20.512780922Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 2 12:52:20.512911 containerd[1566]: time="2026-03-02T12:52:20.512886345Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 2 12:52:20.513025 containerd[1566]: time="2026-03-02T12:52:20.512989610Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 2 12:52:20.513948 containerd[1566]: time="2026-03-02T12:52:20.513916962Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 2 12:52:20.514072 containerd[1566]: 
time="2026-03-02T12:52:20.514047201Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 2 12:52:20.514209 containerd[1566]: time="2026-03-02T12:52:20.514183211Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 2 12:52:20.514999 containerd[1566]: time="2026-03-02T12:52:20.514969010Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 2 12:52:20.515189 containerd[1566]: time="2026-03-02T12:52:20.515161715Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 2 12:52:20.515305 containerd[1566]: time="2026-03-02T12:52:20.515279557Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 2 12:52:20.515509 containerd[1566]: time="2026-03-02T12:52:20.515482875Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 2 12:52:20.515634 containerd[1566]: time="2026-03-02T12:52:20.515605803Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 2 12:52:20.515747 containerd[1566]: time="2026-03-02T12:52:20.515724163Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 2 12:52:20.515877 containerd[1566]: time="2026-03-02T12:52:20.515837169Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 2 12:52:20.515994 containerd[1566]: time="2026-03-02T12:52:20.515968529Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 2 12:52:20.516121 containerd[1566]: time="2026-03-02T12:52:20.516096588Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 2 12:52:20.516240 containerd[1566]: time="2026-03-02T12:52:20.516214921Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 2 12:52:20.518413 containerd[1566]: time="2026-03-02T12:52:20.517483758Z" level=info msg="runtime interface created" Mar 2 12:52:20.518413 containerd[1566]: time="2026-03-02T12:52:20.517509037Z" level=info msg="created NRI interface" Mar 2 12:52:20.518413 containerd[1566]: time="2026-03-02T12:52:20.517526637Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 2 12:52:20.518413 containerd[1566]: time="2026-03-02T12:52:20.517570135Z" level=info msg="Connect containerd service" Mar 2 12:52:20.518413 containerd[1566]: time="2026-03-02T12:52:20.517621962Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 2 12:52:20.520902 containerd[1566]: time="2026-03-02T12:52:20.520868712Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 2 12:52:20.835477 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 2 12:52:20.944224 systemd-logind[1549]: Watching system buttons on /dev/input/event3 (Power Button) Mar 2 12:52:20.996926 systemd-logind[1549]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 2 12:52:21.054768 containerd[1566]: time="2026-03-02T12:52:21.054698939Z" level=info msg="Start subscribing containerd event" Mar 2 12:52:21.054939 containerd[1566]: time="2026-03-02T12:52:21.054832422Z" level=info msg="Start recovering state" Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055103818Z" level=info msg="Start event monitor" Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055136229Z" level=info msg="Start cni network conf syncer for default" Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055159770Z" level=info msg="Start streaming server" Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055201816Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055227212Z" level=info msg="runtime interface starting up..." Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055244066Z" level=info msg="starting plugins..." Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055283384Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055338478Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 2 12:52:21.056410 containerd[1566]: time="2026-03-02T12:52:21.055451157Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 2 12:52:21.057645 systemd[1]: Started containerd.service - containerd container runtime. 
Mar 2 12:52:21.059358 containerd[1566]: time="2026-03-02T12:52:21.058555651Z" level=info msg="containerd successfully booted in 0.672839s" Mar 2 12:52:21.490332 polkitd[1624]: Started polkitd version 126 Mar 2 12:52:21.534802 polkitd[1624]: Loading rules from directory /etc/polkit-1/rules.d Mar 2 12:52:21.535315 polkitd[1624]: Loading rules from directory /run/polkit-1/rules.d Mar 2 12:52:21.538822 polkitd[1624]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Mar 2 12:52:21.539204 polkitd[1624]: Loading rules from directory /usr/local/share/polkit-1/rules.d Mar 2 12:52:21.539246 polkitd[1624]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Mar 2 12:52:21.539330 polkitd[1624]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 2 12:52:21.555444 polkitd[1624]: Finished loading, compiling and executing 2 rules Mar 2 12:52:21.560916 dbus-daemon[1538]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 2 12:52:21.560956 systemd[1]: Started polkit.service - Authorization Manager. Mar 2 12:52:21.565576 polkitd[1624]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 2 12:52:21.577174 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 2 12:52:21.597819 systemd-networkd[1501]: eth0: Gained IPv6LL Mar 2 12:52:21.607869 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection. Mar 2 12:52:21.611694 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 2 12:52:21.614166 systemd[1]: Reached target network-online.target - Network is Online. Mar 2 12:52:21.621133 systemd-hostnamed[1605]: Hostname set to (static) Mar 2 12:52:21.622344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 2 12:52:21.626918 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 2 12:52:21.648858 tar[1564]: linux-amd64/README.md Mar 2 12:52:21.700902 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 2 12:52:21.737332 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 2 12:52:22.081337 sshd_keygen[1562]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 2 12:52:22.146941 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 2 12:52:22.160980 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 2 12:52:22.168180 systemd[1]: Started sshd@0-10.243.74.166:22-68.220.241.50:54984.service - OpenSSH per-connection server daemon (68.220.241.50:54984). Mar 2 12:52:22.222512 systemd[1]: issuegen.service: Deactivated successfully. Mar 2 12:52:22.224522 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 2 12:52:22.235936 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 2 12:52:22.304812 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 2 12:52:22.315560 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 2 12:52:22.326044 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 2 12:52:22.329468 systemd[1]: Reached target getty.target - Login Prompts. Mar 2 12:52:22.478336 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:22.478941 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:22.837585 sshd[1695]: Accepted publickey for core from 68.220.241.50 port 54984 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:22.841139 sshd-session[1695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:22.881226 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 2 12:52:22.893922 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Mar 2 12:52:22.910530 systemd-logind[1549]: New session 1 of user core. Mar 2 12:52:22.991501 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 2 12:52:23.007479 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 2 12:52:23.037622 (systemd)[1709]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 2 12:52:23.046816 systemd-logind[1549]: New session c1 of user core. Mar 2 12:52:23.124555 systemd-networkd[1501]: eth0: Ignoring DHCPv6 address 2a02:1348:17c:d2a9:24:19ff:fef3:4aa6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17c:d2a9:24:19ff:fef3:4aa6/64 assigned by NDisc. Mar 2 12:52:23.124584 systemd-networkd[1501]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 2 12:52:23.129380 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection. Mar 2 12:52:23.378478 systemd[1709]: Queued start job for default target default.target. Mar 2 12:52:23.393013 systemd[1709]: Created slice app.slice - User Application Slice. Mar 2 12:52:23.393093 systemd[1709]: Reached target paths.target - Paths. Mar 2 12:52:23.393225 systemd[1709]: Reached target timers.target - Timers. Mar 2 12:52:23.399562 systemd[1709]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 2 12:52:23.457774 systemd[1709]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 2 12:52:23.458208 systemd[1709]: Reached target sockets.target - Sockets. Mar 2 12:52:23.458352 systemd[1709]: Reached target basic.target - Basic System. Mar 2 12:52:23.459346 systemd[1709]: Reached target default.target - Main User Target. Mar 2 12:52:23.459480 systemd[1709]: Startup finished in 385ms. Mar 2 12:52:23.460837 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 2 12:52:23.498355 systemd[1]: Started session-1.scope - Session 1 of User core. 
Mar 2 12:52:23.846910 systemd[1]: Started sshd@1-10.243.74.166:22-68.220.241.50:52188.service - OpenSSH per-connection server daemon (68.220.241.50:52188). Mar 2 12:52:24.101762 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:52:24.110709 (kubelet)[1728]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:52:24.489668 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection. Mar 2 12:52:24.530116 sshd[1721]: Accepted publickey for core from 68.220.241.50 port 52188 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:24.532233 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:24.544634 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:24.569105 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:24.702085 systemd-logind[1549]: New session 2 of user core. Mar 2 12:52:24.755863 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 2 12:52:25.069969 sshd[1736]: Connection closed by 68.220.241.50 port 52188 Mar 2 12:52:25.049642 sshd-session[1721]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:25.085772 systemd[1]: sshd@1-10.243.74.166:22-68.220.241.50:52188.service: Deactivated successfully. Mar 2 12:52:25.096508 systemd[1]: session-2.scope: Deactivated successfully. Mar 2 12:52:25.103858 systemd-logind[1549]: Session 2 logged out. Waiting for processes to exit. Mar 2 12:52:25.108662 systemd-logind[1549]: Removed session 2. Mar 2 12:52:25.175247 systemd[1]: Started sshd@2-10.243.74.166:22-68.220.241.50:52198.service - OpenSSH per-connection server daemon (68.220.241.50:52198). 
Mar 2 12:52:25.789956 sshd[1742]: Accepted publickey for core from 68.220.241.50 port 52198 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:25.795352 sshd-session[1742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:25.810296 systemd-logind[1549]: New session 3 of user core. Mar 2 12:52:25.824592 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 2 12:52:25.942538 kubelet[1728]: E0302 12:52:25.942154 1728 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:52:25.952981 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:52:25.953379 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:52:25.955648 systemd[1]: kubelet.service: Consumed 2.047s CPU time, 268.4M memory peak. Mar 2 12:52:26.118034 sshd[1746]: Connection closed by 68.220.241.50 port 52198 Mar 2 12:52:26.129576 sshd-session[1742]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:26.161758 systemd[1]: sshd@2-10.243.74.166:22-68.220.241.50:52198.service: Deactivated successfully. Mar 2 12:52:26.172611 systemd[1]: session-3.scope: Deactivated successfully. Mar 2 12:52:26.179572 systemd-logind[1549]: Session 3 logged out. Waiting for processes to exit. Mar 2 12:52:26.187924 systemd-logind[1549]: Removed session 3. Mar 2 12:52:27.443647 login[1703]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 2 12:52:27.447436 login[1704]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 2 12:52:27.457499 systemd-logind[1549]: New session 4 of user core. 
Mar 2 12:52:27.469960 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 2 12:52:27.478462 systemd-logind[1549]: New session 5 of user core. Mar 2 12:52:27.483658 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 2 12:52:28.618444 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:28.629762 coreos-metadata[1537]: Mar 02 12:52:28.629 WARN failed to locate config-drive, using the metadata service API instead Mar 2 12:52:28.655232 coreos-metadata[1537]: Mar 02 12:52:28.655 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 2 12:52:28.662691 coreos-metadata[1537]: Mar 02 12:52:28.662 INFO Fetch failed with 404: resource not found Mar 2 12:52:28.662691 coreos-metadata[1537]: Mar 02 12:52:28.662 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 2 12:52:28.663131 coreos-metadata[1537]: Mar 02 12:52:28.663 INFO Fetch successful Mar 2 12:52:28.663335 coreos-metadata[1537]: Mar 02 12:52:28.663 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 2 12:52:28.694596 coreos-metadata[1537]: Mar 02 12:52:28.694 INFO Fetch successful Mar 2 12:52:28.694596 coreos-metadata[1537]: Mar 02 12:52:28.694 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 2 12:52:28.715912 coreos-metadata[1537]: Mar 02 12:52:28.715 INFO Fetch successful Mar 2 12:52:28.716268 coreos-metadata[1537]: Mar 02 12:52:28.716 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 2 12:52:28.735255 coreos-metadata[1537]: Mar 02 12:52:28.735 INFO Fetch successful Mar 2 12:52:28.735255 coreos-metadata[1537]: Mar 02 12:52:28.735 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 2 12:52:28.756705 coreos-metadata[1537]: Mar 02 12:52:28.756 INFO Fetch successful Mar 2 12:52:28.766477 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Mar 2 12:52:28.780825 coreos-metadata[1614]: Mar 02 12:52:28.780 WARN failed to locate config-drive, using the metadata service API instead Mar 2 12:52:28.794704 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 2 12:52:28.796157 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 2 12:52:28.806349 coreos-metadata[1614]: Mar 02 12:52:28.806 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 2 12:52:28.834235 coreos-metadata[1614]: Mar 02 12:52:28.834 INFO Fetch successful Mar 2 12:52:28.834545 coreos-metadata[1614]: Mar 02 12:52:28.834 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 2 12:52:28.867131 coreos-metadata[1614]: Mar 02 12:52:28.867 INFO Fetch successful Mar 2 12:52:28.869613 unknown[1614]: wrote ssh authorized keys file for user: core Mar 2 12:52:28.893970 update-ssh-keys[1787]: Updated "/home/core/.ssh/authorized_keys" Mar 2 12:52:28.895020 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 2 12:52:28.898800 systemd[1]: Finished sshkeys.service. Mar 2 12:52:28.900316 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 2 12:52:28.902524 systemd[1]: Startup finished in 3.825s (kernel) + 17.671s (initrd) + 13.620s (userspace) = 35.117s. Mar 2 12:52:36.196383 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 2 12:52:36.198740 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:52:36.201686 systemd[1]: Started sshd@3-10.243.74.166:22-68.220.241.50:36060.service - OpenSSH per-connection server daemon (68.220.241.50:36060). Mar 2 12:52:36.561440 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 2 12:52:36.572181 (kubelet)[1802]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:52:36.670265 kubelet[1802]: E0302 12:52:36.670158 1802 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:52:36.675464 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:52:36.675738 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:52:36.676522 systemd[1]: kubelet.service: Consumed 425ms CPU time, 109.3M memory peak. Mar 2 12:52:36.714287 sshd[1792]: Accepted publickey for core from 68.220.241.50 port 36060 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:36.716697 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:36.724538 systemd-logind[1549]: New session 6 of user core. Mar 2 12:52:36.741206 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 2 12:52:36.989564 sshd[1810]: Connection closed by 68.220.241.50 port 36060 Mar 2 12:52:36.990785 sshd-session[1792]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:36.996110 systemd[1]: sshd@3-10.243.74.166:22-68.220.241.50:36060.service: Deactivated successfully. Mar 2 12:52:36.998558 systemd[1]: session-6.scope: Deactivated successfully. Mar 2 12:52:36.999782 systemd-logind[1549]: Session 6 logged out. Waiting for processes to exit. Mar 2 12:52:37.001799 systemd-logind[1549]: Removed session 6. Mar 2 12:52:37.101786 systemd[1]: Started sshd@4-10.243.74.166:22-68.220.241.50:36064.service - OpenSSH per-connection server daemon (68.220.241.50:36064). 
Mar 2 12:52:37.628890 sshd[1816]: Accepted publickey for core from 68.220.241.50 port 36064 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:37.630634 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:37.638792 systemd-logind[1549]: New session 7 of user core. Mar 2 12:52:37.645658 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 2 12:52:37.911469 sshd[1819]: Connection closed by 68.220.241.50 port 36064 Mar 2 12:52:37.910467 sshd-session[1816]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:37.917666 systemd[1]: sshd@4-10.243.74.166:22-68.220.241.50:36064.service: Deactivated successfully. Mar 2 12:52:37.920767 systemd[1]: session-7.scope: Deactivated successfully. Mar 2 12:52:37.922794 systemd-logind[1549]: Session 7 logged out. Waiting for processes to exit. Mar 2 12:52:37.924927 systemd-logind[1549]: Removed session 7. Mar 2 12:52:38.008129 systemd[1]: Started sshd@5-10.243.74.166:22-68.220.241.50:36068.service - OpenSSH per-connection server daemon (68.220.241.50:36068). Mar 2 12:52:38.515431 sshd[1825]: Accepted publickey for core from 68.220.241.50 port 36068 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:38.517747 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:38.526380 systemd-logind[1549]: New session 8 of user core. Mar 2 12:52:38.533655 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 2 12:52:38.790230 sshd[1828]: Connection closed by 68.220.241.50 port 36068 Mar 2 12:52:38.791205 sshd-session[1825]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:38.795917 systemd[1]: sshd@5-10.243.74.166:22-68.220.241.50:36068.service: Deactivated successfully. Mar 2 12:52:38.798771 systemd[1]: session-8.scope: Deactivated successfully. Mar 2 12:52:38.801565 systemd-logind[1549]: Session 8 logged out. Waiting for processes to exit. 
Mar 2 12:52:38.802946 systemd-logind[1549]: Removed session 8. Mar 2 12:52:38.896052 systemd[1]: Started sshd@6-10.243.74.166:22-68.220.241.50:36084.service - OpenSSH per-connection server daemon (68.220.241.50:36084). Mar 2 12:52:39.407139 sshd[1834]: Accepted publickey for core from 68.220.241.50 port 36084 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:39.409279 sshd-session[1834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:39.416456 systemd-logind[1549]: New session 9 of user core. Mar 2 12:52:39.427703 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 2 12:52:39.612599 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 2 12:52:39.613105 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 2 12:52:39.631023 sudo[1838]: pam_unix(sudo:session): session closed for user root Mar 2 12:52:39.722239 sshd[1837]: Connection closed by 68.220.241.50 port 36084 Mar 2 12:52:39.721842 sshd-session[1834]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:39.731434 systemd[1]: sshd@6-10.243.74.166:22-68.220.241.50:36084.service: Deactivated successfully. Mar 2 12:52:39.734811 systemd[1]: session-9.scope: Deactivated successfully. Mar 2 12:52:39.737931 systemd-logind[1549]: Session 9 logged out. Waiting for processes to exit. Mar 2 12:52:39.739808 systemd-logind[1549]: Removed session 9. Mar 2 12:52:39.828244 systemd[1]: Started sshd@7-10.243.74.166:22-68.220.241.50:36092.service - OpenSSH per-connection server daemon (68.220.241.50:36092). Mar 2 12:52:40.348098 sshd[1844]: Accepted publickey for core from 68.220.241.50 port 36092 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:40.349991 sshd-session[1844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:40.358338 systemd-logind[1549]: New session 10 of user core. 
Mar 2 12:52:40.363682 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 2 12:52:40.535921 sudo[1849]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 2 12:52:40.536373 sudo[1849]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 2 12:52:40.546267 sudo[1849]: pam_unix(sudo:session): session closed for user root Mar 2 12:52:40.555015 sudo[1848]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 2 12:52:40.555483 sudo[1848]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 2 12:52:40.571790 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 2 12:52:40.640628 augenrules[1871]: No rules Mar 2 12:52:40.642755 systemd[1]: audit-rules.service: Deactivated successfully. Mar 2 12:52:40.643177 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 2 12:52:40.645210 sudo[1848]: pam_unix(sudo:session): session closed for user root Mar 2 12:52:40.734810 sshd[1847]: Connection closed by 68.220.241.50 port 36092 Mar 2 12:52:40.736888 sshd-session[1844]: pam_unix(sshd:session): session closed for user core Mar 2 12:52:40.742290 systemd[1]: sshd@7-10.243.74.166:22-68.220.241.50:36092.service: Deactivated successfully. Mar 2 12:52:40.744971 systemd[1]: session-10.scope: Deactivated successfully. Mar 2 12:52:40.747978 systemd-logind[1549]: Session 10 logged out. Waiting for processes to exit. Mar 2 12:52:40.750134 systemd-logind[1549]: Removed session 10. Mar 2 12:52:40.847000 systemd[1]: Started sshd@8-10.243.74.166:22-68.220.241.50:36108.service - OpenSSH per-connection server daemon (68.220.241.50:36108). 
Mar 2 12:52:41.366570 sshd[1880]: Accepted publickey for core from 68.220.241.50 port 36108 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE Mar 2 12:52:41.368354 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 2 12:52:41.376217 systemd-logind[1549]: New session 11 of user core. Mar 2 12:52:41.387856 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 2 12:52:41.562783 sudo[1884]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 2 12:52:41.563211 sudo[1884]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 2 12:52:42.447286 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 2 12:52:42.469675 (dockerd)[1901]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 2 12:52:43.058581 dockerd[1901]: time="2026-03-02T12:52:43.058408516Z" level=info msg="Starting up" Mar 2 12:52:43.060445 dockerd[1901]: time="2026-03-02T12:52:43.060384591Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 2 12:52:43.087643 dockerd[1901]: time="2026-03-02T12:52:43.087541734Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 2 12:52:43.129375 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2018224677-merged.mount: Deactivated successfully. Mar 2 12:52:43.171082 dockerd[1901]: time="2026-03-02T12:52:43.170445678Z" level=info msg="Loading containers: start." Mar 2 12:52:43.187096 kernel: Initializing XFRM netlink socket Mar 2 12:52:43.526066 systemd-timesyncd[1466]: Network configuration changed, trying to establish connection. Mar 2 12:52:43.588153 systemd-networkd[1501]: docker0: Link UP Mar 2 12:52:43.601669 dockerd[1901]: time="2026-03-02T12:52:43.601501464Z" level=info msg="Loading containers: done." 
Mar 2 12:52:43.629411 dockerd[1901]: time="2026-03-02T12:52:43.628877352Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 2 12:52:43.629411 dockerd[1901]: time="2026-03-02T12:52:43.629050984Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 2 12:52:43.629411 dockerd[1901]: time="2026-03-02T12:52:43.629196827Z" level=info msg="Initializing buildkit" Mar 2 12:52:43.664377 dockerd[1901]: time="2026-03-02T12:52:43.664309210Z" level=info msg="Completed buildkit initialization" Mar 2 12:52:43.676426 dockerd[1901]: time="2026-03-02T12:52:43.676309107Z" level=info msg="Daemon has completed initialization" Mar 2 12:52:43.677416 dockerd[1901]: time="2026-03-02T12:52:43.676685619Z" level=info msg="API listen on /run/docker.sock" Mar 2 12:52:43.676893 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 2 12:52:44.123672 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1839766753-merged.mount: Deactivated successfully. Mar 2 12:52:45.313974 systemd-resolved[1446]: Clock change detected. Flushing caches. Mar 2 12:52:45.314894 systemd-timesyncd[1466]: Contacted time server [2a01:7e00::f03c:94ff:fee2:ae4f]:123 (2.flatcar.pool.ntp.org). Mar 2 12:52:45.315021 systemd-timesyncd[1466]: Initial clock synchronization to Mon 2026-03-02 12:52:45.313662 UTC. Mar 2 12:52:45.916956 containerd[1566]: time="2026-03-02T12:52:45.915779020Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\"" Mar 2 12:52:46.804852 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount229224629.mount: Deactivated successfully. Mar 2 12:52:47.653698 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 2 12:52:47.659547 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:52:47.955381 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:52:47.970228 (kubelet)[2179]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:52:48.082903 kubelet[2179]: E0302 12:52:48.082819 2179 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:52:48.085696 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:52:48.085968 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:52:48.086593 systemd[1]: kubelet.service: Consumed 322ms CPU time, 110.5M memory peak. 
Mar 2 12:52:51.277480 containerd[1566]: time="2026-03-02T12:52:51.277095546Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:51.280948 containerd[1566]: time="2026-03-02T12:52:51.280889127Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=30116194" Mar 2 12:52:51.282033 containerd[1566]: time="2026-03-02T12:52:51.281957999Z" level=info msg="ImageCreate event name:\"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:51.287079 containerd[1566]: time="2026-03-02T12:52:51.287011399Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:51.288752 containerd[1566]: time="2026-03-02T12:52:51.288352088Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"30112785\" in 5.372429403s" Mar 2 12:52:51.288752 containerd[1566]: time="2026-03-02T12:52:51.288441188Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\"" Mar 2 12:52:51.290321 containerd[1566]: time="2026-03-02T12:52:51.290281592Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\"" Mar 2 12:52:53.645787 containerd[1566]: time="2026-03-02T12:52:53.645647289Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:53.647312 containerd[1566]: time="2026-03-02T12:52:53.647261098Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=26021818" Mar 2 12:52:53.648499 containerd[1566]: time="2026-03-02T12:52:53.648001064Z" level=info msg="ImageCreate event name:\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:53.652576 containerd[1566]: time="2026-03-02T12:52:53.651735605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:53.653625 containerd[1566]: time="2026-03-02T12:52:53.653582158Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"27678758\" in 2.363231308s" Mar 2 12:52:53.653709 containerd[1566]: time="2026-03-02T12:52:53.653656724Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\"" Mar 2 12:52:53.655062 containerd[1566]: time="2026-03-02T12:52:53.654769087Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\"" Mar 2 12:52:54.004476 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Mar 2 12:52:55.615628 containerd[1566]: time="2026-03-02T12:52:55.615497073Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:55.617190 containerd[1566]: time="2026-03-02T12:52:55.617132762Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=20162754" Mar 2 12:52:55.618490 containerd[1566]: time="2026-03-02T12:52:55.617939731Z" level=info msg="ImageCreate event name:\"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:55.621482 containerd[1566]: time="2026-03-02T12:52:55.621231513Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:55.623908 containerd[1566]: time="2026-03-02T12:52:55.622906703Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"21819712\" in 1.967304761s" Mar 2 12:52:55.623908 containerd[1566]: time="2026-03-02T12:52:55.622991703Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\"" Mar 2 12:52:55.624469 containerd[1566]: time="2026-03-02T12:52:55.624410704Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\"" Mar 2 12:52:57.415003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount764130519.mount: Deactivated successfully. 
Mar 2 12:52:58.154398 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 2 12:52:58.162757 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:52:58.617088 containerd[1566]: time="2026-03-02T12:52:58.616936364Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:58.618900 containerd[1566]: time="2026-03-02T12:52:58.618686279Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=31828655" Mar 2 12:52:58.621113 containerd[1566]: time="2026-03-02T12:52:58.619690760Z" level=info msg="ImageCreate event name:\"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:58.624264 containerd[1566]: time="2026-03-02T12:52:58.624210218Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:52:58.625130 containerd[1566]: time="2026-03-02T12:52:58.625086174Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"31827666\" in 3.000620584s" Mar 2 12:52:58.625242 containerd[1566]: time="2026-03-02T12:52:58.625170157Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\"" Mar 2 12:52:58.627306 containerd[1566]: time="2026-03-02T12:52:58.627249471Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Mar 2 12:52:58.649832 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:52:58.662384 (kubelet)[2216]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:52:58.747714 kubelet[2216]: E0302 12:52:58.747602 2216 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:52:58.750289 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:52:58.750573 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:52:58.751521 systemd[1]: kubelet.service: Consumed 396ms CPU time, 108.5M memory peak. Mar 2 12:52:59.239357 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2899579292.mount: Deactivated successfully.
Mar 2 12:53:00.928794 containerd[1566]: time="2026-03-02T12:53:00.928698200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:00.930184 containerd[1566]: time="2026-03-02T12:53:00.930098520Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942246" Mar 2 12:53:00.932476 containerd[1566]: time="2026-03-02T12:53:00.931118236Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:00.935261 containerd[1566]: time="2026-03-02T12:53:00.935211428Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:00.936972 containerd[1566]: time="2026-03-02T12:53:00.936932486Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.309629394s" Mar 2 12:53:00.937103 containerd[1566]: time="2026-03-02T12:53:00.937076290Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Mar 2 12:53:00.937949 containerd[1566]: time="2026-03-02T12:53:00.937915341Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 2 12:53:01.588354 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3455524475.mount: Deactivated successfully. 
Mar 2 12:53:01.601148 containerd[1566]: time="2026-03-02T12:53:01.595626973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 2 12:53:01.601148 containerd[1566]: time="2026-03-02T12:53:01.597653519Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Mar 2 12:53:01.601148 containerd[1566]: time="2026-03-02T12:53:01.600007660Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 2 12:53:01.602566 containerd[1566]: time="2026-03-02T12:53:01.602512960Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 2 12:53:01.605555 containerd[1566]: time="2026-03-02T12:53:01.605488271Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 667.52804ms" Mar 2 12:53:01.605703 containerd[1566]: time="2026-03-02T12:53:01.605574169Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Mar 2 12:53:01.606848 containerd[1566]: time="2026-03-02T12:53:01.606785307Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\"" Mar 2 12:53:02.217403 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2777168224.mount: Deactivated successfully. Mar 2 12:53:04.692959 containerd[1566]: time="2026-03-02T12:53:04.692858328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:04.694310 containerd[1566]: time="2026-03-02T12:53:04.693922107Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23718848" Mar 2 12:53:04.694974 containerd[1566]: time="2026-03-02T12:53:04.694939089Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:04.699665 containerd[1566]: time="2026-03-02T12:53:04.699603537Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:04.702012 containerd[1566]: time="2026-03-02T12:53:04.701854354Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 3.095015316s" Mar 2 12:53:04.702012 containerd[1566]: time="2026-03-02T12:53:04.701907735Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\"" Mar 2 12:53:05.600479 update_engine[1551]: I20260302 12:53:05.599016 1551 update_attempter.cc:509] Updating boot flags... Mar 2 12:53:08.904176 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 2 12:53:08.908674 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 2 12:53:09.295659 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:53:09.306255 (kubelet)[2386]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 2 12:53:09.382414 kubelet[2386]: E0302 12:53:09.381892 2386 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 2 12:53:09.387803 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 2 12:53:09.388071 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 2 12:53:09.390069 systemd[1]: kubelet.service: Consumed 375ms CPU time, 107.1M memory peak. Mar 2 12:53:10.742400 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:53:10.742672 systemd[1]: kubelet.service: Consumed 375ms CPU time, 107.1M memory peak. Mar 2 12:53:10.747413 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:53:10.789285 systemd[1]: Reload requested from client PID 2400 ('systemctl') (unit session-11.scope)... Mar 2 12:53:10.789354 systemd[1]: Reloading... Mar 2 12:53:10.962546 zram_generator::config[2444]: No configuration found. Mar 2 12:53:11.319268 systemd[1]: Reloading finished in 529 ms. Mar 2 12:53:11.396146 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 2 12:53:11.396302 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 2 12:53:11.396693 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:53:11.396764 systemd[1]: kubelet.service: Consumed 143ms CPU time, 98.2M memory peak. Mar 2 12:53:11.399272 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 2 12:53:11.597705 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:53:11.614717 (kubelet)[2511]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 2 12:53:11.700422 kubelet[2511]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 2 12:53:11.700422 kubelet[2511]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 2 12:53:11.700422 kubelet[2511]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 2 12:53:11.701618 kubelet[2511]: I0302 12:53:11.701295 2511 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 2 12:53:12.601295 kubelet[2511]: I0302 12:53:12.601079 2511 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 2 12:53:12.601602 kubelet[2511]: I0302 12:53:12.601579 2511 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 2 12:53:12.602105 kubelet[2511]: I0302 12:53:12.602082 2511 server.go:956] "Client rotation is on, will bootstrap in background" Mar 2 12:53:12.635215 kubelet[2511]: I0302 12:53:12.635164 2511 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 2 12:53:12.635979 kubelet[2511]: E0302 12:53:12.635933 2511 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post 
\"https://10.243.74.166:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 2 12:53:12.647117 kubelet[2511]: I0302 12:53:12.647056 2511 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 2 12:53:12.654677 kubelet[2511]: I0302 12:53:12.654375 2511 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 2 12:53:12.658392 kubelet[2511]: I0302 12:53:12.658290 2511 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 2 12:53:12.660266 kubelet[2511]: I0302 12:53:12.658374 2511 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-zvfam.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":nu
ll}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 2 12:53:12.660673 kubelet[2511]: I0302 12:53:12.660284 2511 topology_manager.go:138] "Creating topology manager with none policy" Mar 2 12:53:12.660673 kubelet[2511]: I0302 12:53:12.660308 2511 container_manager_linux.go:303] "Creating device plugin manager" Mar 2 12:53:12.660673 kubelet[2511]: I0302 12:53:12.660645 2511 state_mem.go:36] "Initialized new in-memory state store" Mar 2 12:53:12.667142 kubelet[2511]: I0302 12:53:12.667073 2511 kubelet.go:480] "Attempting to sync node with API server" Mar 2 12:53:12.667338 kubelet[2511]: I0302 12:53:12.667160 2511 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 2 12:53:12.669535 kubelet[2511]: I0302 12:53:12.669060 2511 kubelet.go:386] "Adding apiserver pod source" Mar 2 12:53:12.672407 kubelet[2511]: I0302 12:53:12.672021 2511 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 2 12:53:12.680365 kubelet[2511]: E0302 12:53:12.680302 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.243.74.166:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-zvfam.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 2 12:53:12.681962 kubelet[2511]: I0302 12:53:12.681928 2511 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 2 12:53:12.682995 kubelet[2511]: I0302 12:53:12.682967 2511 kubelet.go:935] "Not starting ClusterTrustBundle informer because 
we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 2 12:53:12.684254 kubelet[2511]: W0302 12:53:12.684228 2511 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 2 12:53:12.689499 kubelet[2511]: E0302 12:53:12.689024 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.243.74.166:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 2 12:53:12.694384 kubelet[2511]: I0302 12:53:12.694353 2511 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 2 12:53:12.694668 kubelet[2511]: I0302 12:53:12.694646 2511 server.go:1289] "Started kubelet" Mar 2 12:53:12.698219 kubelet[2511]: I0302 12:53:12.698034 2511 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 2 12:53:12.700675 kubelet[2511]: I0302 12:53:12.698827 2511 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 2 12:53:12.700675 kubelet[2511]: I0302 12:53:12.699493 2511 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 2 12:53:12.700675 kubelet[2511]: I0302 12:53:12.699595 2511 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 2 12:53:12.701940 kubelet[2511]: I0302 12:53:12.701580 2511 server.go:317] "Adding debug handlers to kubelet server" Mar 2 12:53:12.712139 kubelet[2511]: I0302 12:53:12.712099 2511 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 2 12:53:12.713209 kubelet[2511]: I0302 12:53:12.713153 2511 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 2 12:53:12.714852 kubelet[2511]: 
E0302 12:53:12.710129 2511 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.243.74.166:6443/api/v1/namespaces/default/events\": dial tcp 10.243.74.166:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-zvfam.gb1.brightbox.com.18990755a09266c2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-zvfam.gb1.brightbox.com,UID:srv-zvfam.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-zvfam.gb1.brightbox.com,},FirstTimestamp:2026-03-02 12:53:12.694564546 +0000 UTC m=+1.070479034,LastTimestamp:2026-03-02 12:53:12.694564546 +0000 UTC m=+1.070479034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-zvfam.gb1.brightbox.com,}" Mar 2 12:53:12.717345 kubelet[2511]: E0302 12:53:12.716774 2511 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-zvfam.gb1.brightbox.com\" not found" Mar 2 12:53:12.717926 kubelet[2511]: E0302 12:53:12.717848 2511 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.74.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zvfam.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.74.166:6443: connect: connection refused" interval="200ms" Mar 2 12:53:12.718194 kubelet[2511]: I0302 12:53:12.718165 2511 reconciler.go:26] "Reconciler: start to sync state" Mar 2 12:53:12.718366 kubelet[2511]: I0302 12:53:12.718339 2511 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 2 12:53:12.720258 kubelet[2511]: E0302 12:53:12.718566 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.243.74.166:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: 
connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 2 12:53:12.725857 kubelet[2511]: I0302 12:53:12.725807 2511 factory.go:223] Registration of the containerd container factory successfully Mar 2 12:53:12.726108 kubelet[2511]: I0302 12:53:12.726088 2511 factory.go:223] Registration of the systemd container factory successfully Mar 2 12:53:12.726408 kubelet[2511]: I0302 12:53:12.726377 2511 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 2 12:53:12.732195 kubelet[2511]: E0302 12:53:12.732043 2511 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 2 12:53:12.763474 kubelet[2511]: I0302 12:53:12.761618 2511 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 2 12:53:12.767904 kubelet[2511]: I0302 12:53:12.767872 2511 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 2 12:53:12.768088 kubelet[2511]: I0302 12:53:12.768067 2511 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 2 12:53:12.768345 kubelet[2511]: I0302 12:53:12.767911 2511 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 2 12:53:12.768479 kubelet[2511]: I0302 12:53:12.768446 2511 state_mem.go:36] "Initialized new in-memory state store" Mar 2 12:53:12.769940 kubelet[2511]: I0302 12:53:12.769912 2511 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 2 12:53:12.772016 kubelet[2511]: I0302 12:53:12.769995 2511 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 2 12:53:12.772016 kubelet[2511]: I0302 12:53:12.770026 2511 kubelet.go:2436] "Starting kubelet main sync loop" Mar 2 12:53:12.772016 kubelet[2511]: E0302 12:53:12.770158 2511 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 2 12:53:12.773787 kubelet[2511]: I0302 12:53:12.773758 2511 policy_none.go:49] "None policy: Start" Mar 2 12:53:12.773868 kubelet[2511]: I0302 12:53:12.773802 2511 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 2 12:53:12.773868 kubelet[2511]: I0302 12:53:12.773838 2511 state_mem.go:35] "Initializing new in-memory state store" Mar 2 12:53:12.774431 kubelet[2511]: E0302 12:53:12.774400 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.243.74.166:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 2 12:53:12.785554 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 2 12:53:12.800920 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 2 12:53:12.816417 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 2 12:53:12.817769 kubelet[2511]: E0302 12:53:12.817138 2511 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-zvfam.gb1.brightbox.com\" not found" Mar 2 12:53:12.818779 kubelet[2511]: E0302 12:53:12.818740 2511 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 2 12:53:12.819132 kubelet[2511]: I0302 12:53:12.819090 2511 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 2 12:53:12.819198 kubelet[2511]: I0302 12:53:12.819140 2511 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 2 12:53:12.821706 kubelet[2511]: I0302 12:53:12.820590 2511 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 2 12:53:12.822089 kubelet[2511]: E0302 12:53:12.822063 2511 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 2 12:53:12.822278 kubelet[2511]: E0302 12:53:12.822254 2511 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-zvfam.gb1.brightbox.com\" not found" Mar 2 12:53:12.894731 systemd[1]: Created slice kubepods-burstable-podf76c54020f466e13365037ce88cf8b2f.slice - libcontainer container kubepods-burstable-podf76c54020f466e13365037ce88cf8b2f.slice. Mar 2 12:53:12.908127 kubelet[2511]: E0302 12:53:12.908044 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:12.913295 systemd[1]: Created slice kubepods-burstable-podba57e322649050bf56d89b6c7932a8e1.slice - libcontainer container kubepods-burstable-podba57e322649050bf56d89b6c7932a8e1.slice. 
Mar 2 12:53:12.917847 kubelet[2511]: E0302 12:53:12.917445 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:12.918720 kubelet[2511]: E0302 12:53:12.918685 2511 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.74.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zvfam.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.74.166:6443: connect: connection refused" interval="400ms" Mar 2 12:53:12.922402 kubelet[2511]: I0302 12:53:12.922347 2511 kubelet_node_status.go:75] "Attempting to register node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:12.923122 kubelet[2511]: E0302 12:53:12.923083 2511 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.74.166:6443/api/v1/nodes\": dial tcp 10.243.74.166:6443: connect: connection refused" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:12.929492 systemd[1]: Created slice kubepods-burstable-pode7c4012fbdda6341bd8d59d396acc654.slice - libcontainer container kubepods-burstable-pode7c4012fbdda6341bd8d59d396acc654.slice. 
Mar 2 12:53:12.932728 kubelet[2511]: E0302 12:53:12.932680 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.019287 kubelet[2511]: I0302 12:53:13.019221 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f76c54020f466e13365037ce88cf8b2f-k8s-certs\") pod \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" (UID: \"f76c54020f466e13365037ce88cf8b2f\") " pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.019754 kubelet[2511]: I0302 12:53:13.019721 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-k8s-certs\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020098 kubelet[2511]: I0302 12:53:13.019930 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e7c4012fbdda6341bd8d59d396acc654-kubeconfig\") pod \"kube-scheduler-srv-zvfam.gb1.brightbox.com\" (UID: \"e7c4012fbdda6341bd8d59d396acc654\") " pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020098 kubelet[2511]: I0302 12:53:13.019997 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f76c54020f466e13365037ce88cf8b2f-ca-certs\") pod \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" (UID: \"f76c54020f466e13365037ce88cf8b2f\") " pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020098 kubelet[2511]: I0302 12:53:13.020061 2511 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f76c54020f466e13365037ce88cf8b2f-usr-share-ca-certificates\") pod \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" (UID: \"f76c54020f466e13365037ce88cf8b2f\") " pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020268 kubelet[2511]: I0302 12:53:13.020117 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-ca-certs\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020268 kubelet[2511]: I0302 12:53:13.020155 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-flexvolume-dir\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020268 kubelet[2511]: I0302 12:53:13.020184 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-kubeconfig\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.020268 kubelet[2511]: I0302 12:53:13.020214 2511 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-usr-share-ca-certificates\") pod 
\"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.126299 kubelet[2511]: I0302 12:53:13.126176 2511 kubelet_node_status.go:75] "Attempting to register node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.127089 kubelet[2511]: E0302 12:53:13.127041 2511 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.74.166:6443/api/v1/nodes\": dial tcp 10.243.74.166:6443: connect: connection refused" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.211421 containerd[1566]: time="2026-03-02T12:53:13.211318697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-zvfam.gb1.brightbox.com,Uid:f76c54020f466e13365037ce88cf8b2f,Namespace:kube-system,Attempt:0,}" Mar 2 12:53:13.219393 containerd[1566]: time="2026-03-02T12:53:13.219006641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-zvfam.gb1.brightbox.com,Uid:ba57e322649050bf56d89b6c7932a8e1,Namespace:kube-system,Attempt:0,}" Mar 2 12:53:13.236294 containerd[1566]: time="2026-03-02T12:53:13.235891984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-zvfam.gb1.brightbox.com,Uid:e7c4012fbdda6341bd8d59d396acc654,Namespace:kube-system,Attempt:0,}" Mar 2 12:53:13.322912 kubelet[2511]: E0302 12:53:13.322846 2511 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.74.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zvfam.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.74.166:6443: connect: connection refused" interval="800ms" Mar 2 12:53:13.376246 containerd[1566]: time="2026-03-02T12:53:13.376126509Z" level=info msg="connecting to shim 1078c6fd9a95df274afb4ecdc78014c1633275956a405b1e14934dd4fa2120bd" address="unix:///run/containerd/s/67fc542270104767ceec9102f8cb400d1c68bc3fd94aa5c5d6190bf3bb6d5bbd" 
namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:53:13.381020 containerd[1566]: time="2026-03-02T12:53:13.380817210Z" level=info msg="connecting to shim b03f672bac27da5e9bdd94f56e13111362524186b470ebdeceacde36ae8541a3" address="unix:///run/containerd/s/73212f9cc1189ba18c965bf3f5823b23f60c6c04c4d29fc1cbf489cfac258b15" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:53:13.391718 containerd[1566]: time="2026-03-02T12:53:13.390861181Z" level=info msg="connecting to shim 79b4772b7a3c991385a438a70bf871b0c515f18dbd2483c5b55e2b95447c1b71" address="unix:///run/containerd/s/afad487d455704ddbf46d4fc9a9b891ea37349c7b551ef390591306b2a52363e" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:53:13.529737 systemd[1]: Started cri-containerd-1078c6fd9a95df274afb4ecdc78014c1633275956a405b1e14934dd4fa2120bd.scope - libcontainer container 1078c6fd9a95df274afb4ecdc78014c1633275956a405b1e14934dd4fa2120bd. Mar 2 12:53:13.531897 kubelet[2511]: I0302 12:53:13.530685 2511 kubelet_node_status.go:75] "Attempting to register node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.533282 kubelet[2511]: E0302 12:53:13.532504 2511 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.243.74.166:6443/api/v1/nodes\": dial tcp 10.243.74.166:6443: connect: connection refused" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:13.533784 systemd[1]: Started cri-containerd-b03f672bac27da5e9bdd94f56e13111362524186b470ebdeceacde36ae8541a3.scope - libcontainer container b03f672bac27da5e9bdd94f56e13111362524186b470ebdeceacde36ae8541a3. Mar 2 12:53:13.542130 systemd[1]: Started cri-containerd-79b4772b7a3c991385a438a70bf871b0c515f18dbd2483c5b55e2b95447c1b71.scope - libcontainer container 79b4772b7a3c991385a438a70bf871b0c515f18dbd2483c5b55e2b95447c1b71. 
Mar 2 12:53:13.686050 containerd[1566]: time="2026-03-02T12:53:13.685812576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-zvfam.gb1.brightbox.com,Uid:ba57e322649050bf56d89b6c7932a8e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"79b4772b7a3c991385a438a70bf871b0c515f18dbd2483c5b55e2b95447c1b71\"" Mar 2 12:53:13.705144 containerd[1566]: time="2026-03-02T12:53:13.705070238Z" level=info msg="CreateContainer within sandbox \"79b4772b7a3c991385a438a70bf871b0c515f18dbd2483c5b55e2b95447c1b71\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 2 12:53:13.707516 containerd[1566]: time="2026-03-02T12:53:13.707437471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-zvfam.gb1.brightbox.com,Uid:e7c4012fbdda6341bd8d59d396acc654,Namespace:kube-system,Attempt:0,} returns sandbox id \"b03f672bac27da5e9bdd94f56e13111362524186b470ebdeceacde36ae8541a3\"" Mar 2 12:53:13.709148 containerd[1566]: time="2026-03-02T12:53:13.709030533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-zvfam.gb1.brightbox.com,Uid:f76c54020f466e13365037ce88cf8b2f,Namespace:kube-system,Attempt:0,} returns sandbox id \"1078c6fd9a95df274afb4ecdc78014c1633275956a405b1e14934dd4fa2120bd\"" Mar 2 12:53:13.715085 containerd[1566]: time="2026-03-02T12:53:13.714968635Z" level=info msg="CreateContainer within sandbox \"b03f672bac27da5e9bdd94f56e13111362524186b470ebdeceacde36ae8541a3\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 2 12:53:13.717328 containerd[1566]: time="2026-03-02T12:53:13.717293334Z" level=info msg="CreateContainer within sandbox \"1078c6fd9a95df274afb4ecdc78014c1633275956a405b1e14934dd4fa2120bd\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 2 12:53:13.740637 kubelet[2511]: E0302 12:53:13.740571 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get 
\"https://10.243.74.166:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-zvfam.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 2 12:53:13.741363 kubelet[2511]: E0302 12:53:13.741005 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.243.74.166:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 2 12:53:13.745696 containerd[1566]: time="2026-03-02T12:53:13.745616676Z" level=info msg="Container 1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:53:13.755545 containerd[1566]: time="2026-03-02T12:53:13.755351220Z" level=info msg="Container 66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:53:13.759382 containerd[1566]: time="2026-03-02T12:53:13.759337010Z" level=info msg="Container 54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:53:13.764712 containerd[1566]: time="2026-03-02T12:53:13.764621984Z" level=info msg="CreateContainer within sandbox \"79b4772b7a3c991385a438a70bf871b0c515f18dbd2483c5b55e2b95447c1b71\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f\"" Mar 2 12:53:13.772514 containerd[1566]: time="2026-03-02T12:53:13.772424338Z" level=info msg="CreateContainer within sandbox \"1078c6fd9a95df274afb4ecdc78014c1633275956a405b1e14934dd4fa2120bd\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id 
\"66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15\"" Mar 2 12:53:13.774950 containerd[1566]: time="2026-03-02T12:53:13.774890303Z" level=info msg="StartContainer for \"1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f\"" Mar 2 12:53:13.777004 containerd[1566]: time="2026-03-02T12:53:13.776901870Z" level=info msg="connecting to shim 1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f" address="unix:///run/containerd/s/afad487d455704ddbf46d4fc9a9b891ea37349c7b551ef390591306b2a52363e" protocol=ttrpc version=3 Mar 2 12:53:13.779352 containerd[1566]: time="2026-03-02T12:53:13.778538954Z" level=info msg="CreateContainer within sandbox \"b03f672bac27da5e9bdd94f56e13111362524186b470ebdeceacde36ae8541a3\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb\"" Mar 2 12:53:13.779480 containerd[1566]: time="2026-03-02T12:53:13.779433780Z" level=info msg="StartContainer for \"66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15\"" Mar 2 12:53:13.781734 containerd[1566]: time="2026-03-02T12:53:13.781643315Z" level=info msg="StartContainer for \"54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb\"" Mar 2 12:53:13.783713 containerd[1566]: time="2026-03-02T12:53:13.783668366Z" level=info msg="connecting to shim 54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb" address="unix:///run/containerd/s/73212f9cc1189ba18c965bf3f5823b23f60c6c04c4d29fc1cbf489cfac258b15" protocol=ttrpc version=3 Mar 2 12:53:13.785661 containerd[1566]: time="2026-03-02T12:53:13.785622908Z" level=info msg="connecting to shim 66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15" address="unix:///run/containerd/s/67fc542270104767ceec9102f8cb400d1c68bc3fd94aa5c5d6190bf3bb6d5bbd" protocol=ttrpc version=3 Mar 2 12:53:13.829675 systemd[1]: Started cri-containerd-66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15.scope 
- libcontainer container 66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15. Mar 2 12:53:13.859680 systemd[1]: Started cri-containerd-1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f.scope - libcontainer container 1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f. Mar 2 12:53:13.861210 systemd[1]: Started cri-containerd-54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb.scope - libcontainer container 54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb. Mar 2 12:53:13.974401 containerd[1566]: time="2026-03-02T12:53:13.974224772Z" level=info msg="StartContainer for \"66ca89f7c16fde0d84d82a1715ef7094b2afc77e0d6abe261f06ed3994bffd15\" returns successfully" Mar 2 12:53:14.002976 containerd[1566]: time="2026-03-02T12:53:14.002888284Z" level=info msg="StartContainer for \"54a7d735f51ccfbabd4e66586446eb3ff3003b0e98202514c4e99a9edc5343bb\" returns successfully" Mar 2 12:53:14.036589 containerd[1566]: time="2026-03-02T12:53:14.035331826Z" level=info msg="StartContainer for \"1b9065881776da36e1c91f775adaed8f37fb639fbbb0d8ee668327c184bab28f\" returns successfully" Mar 2 12:53:14.095062 kubelet[2511]: E0302 12:53:14.094993 2511 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.243.74.166:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 2 12:53:14.125347 kubelet[2511]: E0302 12:53:14.125159 2511 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.74.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-zvfam.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.74.166:6443: connect: connection refused" interval="1.6s" Mar 2 12:53:14.136476 kubelet[2511]: E0302 12:53:14.135471 2511 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: Get \"https://10.243.74.166:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.243.74.166:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 2 12:53:14.338918 kubelet[2511]: I0302 12:53:14.338597 2511 kubelet_node_status.go:75] "Attempting to register node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:14.823958 kubelet[2511]: E0302 12:53:14.823317 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:14.831190 kubelet[2511]: E0302 12:53:14.830610 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:14.833885 kubelet[2511]: E0302 12:53:14.833860 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:15.840500 kubelet[2511]: E0302 12:53:15.838977 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:15.841841 kubelet[2511]: E0302 12:53:15.841560 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:15.842330 kubelet[2511]: E0302 12:53:15.842307 2511 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.569891 kubelet[2511]: E0302 
12:53:16.569664 2511 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-zvfam.gb1.brightbox.com\" not found" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.631687 kubelet[2511]: I0302 12:53:16.631603 2511 kubelet_node_status.go:78] "Successfully registered node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.676001 kubelet[2511]: I0302 12:53:16.675944 2511 apiserver.go:52] "Watching apiserver" Mar 2 12:53:16.719003 kubelet[2511]: I0302 12:53:16.718549 2511 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.720636 kubelet[2511]: I0302 12:53:16.720433 2511 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 2 12:53:16.761359 kubelet[2511]: E0302 12:53:16.761045 2511 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-zvfam.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.761359 kubelet[2511]: I0302 12:53:16.761105 2511 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.770071 kubelet[2511]: E0302 12:53:16.769691 2511 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.770071 kubelet[2511]: I0302 12:53:16.769757 2511 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.779741 kubelet[2511]: E0302 12:53:16.779685 2511 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" is forbidden: no PriorityClass with name 
system-node-critical was found" pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.839359 kubelet[2511]: I0302 12:53:16.838873 2511 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.839829 kubelet[2511]: I0302 12:53:16.839805 2511 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.846768 kubelet[2511]: E0302 12:53:16.846693 2511 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-zvfam.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:16.849715 kubelet[2511]: E0302 12:53:16.849659 2511 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:17.504240 kubelet[2511]: I0302 12:53:17.503829 2511 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:17.512495 kubelet[2511]: I0302 12:53:17.512437 2511 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:53:18.668388 systemd[1]: Reload requested from client PID 2797 ('systemctl') (unit session-11.scope)... Mar 2 12:53:18.668418 systemd[1]: Reloading... Mar 2 12:53:18.832496 zram_generator::config[2839]: No configuration found. Mar 2 12:53:19.194812 systemd[1]: Reloading finished in 525 ms. Mar 2 12:53:19.230266 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:53:19.253413 systemd[1]: kubelet.service: Deactivated successfully. 
Mar 2 12:53:19.253907 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:53:19.254012 systemd[1]: kubelet.service: Consumed 1.608s CPU time, 126.2M memory peak. Mar 2 12:53:19.258232 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 2 12:53:19.549109 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 2 12:53:19.566231 (kubelet)[2906]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 2 12:53:19.651486 kubelet[2906]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 2 12:53:19.651486 kubelet[2906]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 2 12:53:19.651486 kubelet[2906]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 2 12:53:19.653093 kubelet[2906]: I0302 12:53:19.652953 2906 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 2 12:53:19.669611 kubelet[2906]: I0302 12:53:19.669562 2906 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 2 12:53:19.670489 kubelet[2906]: I0302 12:53:19.669841 2906 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 2 12:53:19.670489 kubelet[2906]: I0302 12:53:19.670294 2906 server.go:956] "Client rotation is on, will bootstrap in background" Mar 2 12:53:19.672960 kubelet[2906]: I0302 12:53:19.672849 2906 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 2 12:53:19.679483 kubelet[2906]: I0302 12:53:19.679357 2906 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 2 12:53:19.695707 kubelet[2906]: I0302 12:53:19.695668 2906 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 2 12:53:19.708920 kubelet[2906]: I0302 12:53:19.708868 2906 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 2 12:53:19.711025 kubelet[2906]: I0302 12:53:19.709434 2906 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 2 12:53:19.711025 kubelet[2906]: I0302 12:53:19.709503 2906 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-zvfam.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 2 12:53:19.711025 kubelet[2906]: I0302 12:53:19.709785 2906 topology_manager.go:138] "Creating topology manager with none policy" Mar 2 
12:53:19.711025 kubelet[2906]: I0302 12:53:19.709813 2906 container_manager_linux.go:303] "Creating device plugin manager" Mar 2 12:53:19.711025 kubelet[2906]: I0302 12:53:19.709886 2906 state_mem.go:36] "Initialized new in-memory state store" Mar 2 12:53:19.711357 kubelet[2906]: I0302 12:53:19.710100 2906 kubelet.go:480] "Attempting to sync node with API server" Mar 2 12:53:19.711357 kubelet[2906]: I0302 12:53:19.710122 2906 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 2 12:53:19.711357 kubelet[2906]: I0302 12:53:19.710157 2906 kubelet.go:386] "Adding apiserver pod source" Mar 2 12:53:19.711357 kubelet[2906]: I0302 12:53:19.710180 2906 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 2 12:53:19.715037 kubelet[2906]: I0302 12:53:19.715001 2906 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 2 12:53:19.715697 kubelet[2906]: I0302 12:53:19.715666 2906 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 2 12:53:19.726124 kubelet[2906]: I0302 12:53:19.726085 2906 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 2 12:53:19.726291 kubelet[2906]: I0302 12:53:19.726158 2906 server.go:1289] "Started kubelet" Mar 2 12:53:19.732397 kubelet[2906]: I0302 12:53:19.731261 2906 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 2 12:53:19.741818 kubelet[2906]: I0302 12:53:19.740780 2906 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 2 12:53:19.746630 kubelet[2906]: I0302 12:53:19.745531 2906 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 2 12:53:19.747439 kubelet[2906]: I0302 12:53:19.747408 2906 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 2 12:53:19.753197 kubelet[2906]: I0302 12:53:19.753161 2906 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 2 12:53:19.755296 kubelet[2906]: E0302 12:53:19.754607 2906 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-zvfam.gb1.brightbox.com\" not found" Mar 2 12:53:19.778634 kubelet[2906]: I0302 12:53:19.757644 2906 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 2 12:53:19.779467 kubelet[2906]: I0302 12:53:19.779239 2906 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 2 12:53:19.785598 kubelet[2906]: I0302 12:53:19.784685 2906 reconciler.go:26] "Reconciler: start to sync state" Mar 2 12:53:19.789421 kubelet[2906]: I0302 12:53:19.789279 2906 factory.go:223] Registration of the systemd container factory successfully Mar 2 12:53:19.789550 kubelet[2906]: I0302 12:53:19.789482 2906 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 2 12:53:19.795506 kubelet[2906]: I0302 12:53:19.794770 2906 server.go:317] "Adding debug handlers to kubelet server" Mar 2 12:53:19.812705 kubelet[2906]: E0302 12:53:19.812093 2906 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 2 12:53:19.813726 kubelet[2906]: I0302 12:53:19.813694 2906 factory.go:223] Registration of the containerd container factory successfully Mar 2 12:53:19.884952 kubelet[2906]: I0302 12:53:19.883734 2906 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 2 12:53:19.894734 kubelet[2906]: I0302 12:53:19.894105 2906 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Mar 2 12:53:19.894734 kubelet[2906]: I0302 12:53:19.894145 2906 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 2 12:53:19.894734 kubelet[2906]: I0302 12:53:19.894194 2906 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 2 12:53:19.894734 kubelet[2906]: I0302 12:53:19.894211 2906 kubelet.go:2436] "Starting kubelet main sync loop" Mar 2 12:53:19.896031 kubelet[2906]: E0302 12:53:19.895906 2906 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 2 12:53:20.000024 kubelet[2906]: E0302 12:53:19.999784 2906 kubelet.go:2460] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 2 12:53:20.003543 kubelet[2906]: I0302 12:53:20.003504 2906 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 2 12:53:20.003543 kubelet[2906]: I0302 12:53:20.003538 2906 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 2 12:53:20.003739 kubelet[2906]: I0302 12:53:20.003570 2906 state_mem.go:36] "Initialized new in-memory state store" Mar 2 12:53:20.003906 kubelet[2906]: I0302 12:53:20.003862 2906 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 2 12:53:20.003967 kubelet[2906]: I0302 12:53:20.003889 2906 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 2 12:53:20.003967 kubelet[2906]: I0302 12:53:20.003927 2906 policy_none.go:49] "None policy: Start" Mar 2 12:53:20.003967 kubelet[2906]: I0302 12:53:20.003944 2906 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 2 12:53:20.003967 kubelet[2906]: I0302 12:53:20.003966 2906 state_mem.go:35] "Initializing new in-memory state store" Mar 2 12:53:20.004147 kubelet[2906]: I0302 12:53:20.004135 2906 state_mem.go:75] "Updated machine memory state" Mar 2 12:53:20.015575 kubelet[2906]: E0302 12:53:20.015521 2906 
manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 2 12:53:20.015885 kubelet[2906]: I0302 12:53:20.015815 2906 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 2 12:53:20.015885 kubelet[2906]: I0302 12:53:20.015835 2906 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 2 12:53:20.016437 kubelet[2906]: I0302 12:53:20.016402 2906 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 2 12:53:20.030275 kubelet[2906]: E0302 12:53:20.029851 2906 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 2 12:53:20.152783 kubelet[2906]: I0302 12:53:20.151757 2906 kubelet_node_status.go:75] "Attempting to register node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.165722 kubelet[2906]: I0302 12:53:20.164974 2906 kubelet_node_status.go:124] "Node was previously registered" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.165722 kubelet[2906]: I0302 12:53:20.165095 2906 kubelet_node_status.go:78] "Successfully registered node" node="srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.203519 kubelet[2906]: I0302 12:53:20.203160 2906 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.205167 kubelet[2906]: I0302 12:53:20.205001 2906 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.209466 kubelet[2906]: I0302 12:53:20.209400 2906 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.220375 kubelet[2906]: I0302 12:53:20.220311 2906 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is 
recommended: [must not contain dots]" Mar 2 12:53:20.225336 kubelet[2906]: I0302 12:53:20.225283 2906 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:53:20.227074 kubelet[2906]: I0302 12:53:20.226938 2906 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:53:20.227074 kubelet[2906]: E0302 12:53:20.227011 2906 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" already exists" pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.288990 kubelet[2906]: I0302 12:53:20.288921 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-ca-certs\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.288990 kubelet[2906]: I0302 12:53:20.288974 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-k8s-certs\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.289250 kubelet[2906]: I0302 12:53:20.289013 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: 
\"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.289250 kubelet[2906]: I0302 12:53:20.289045 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f76c54020f466e13365037ce88cf8b2f-usr-share-ca-certificates\") pod \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" (UID: \"f76c54020f466e13365037ce88cf8b2f\") " pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.289250 kubelet[2906]: I0302 12:53:20.289076 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-flexvolume-dir\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.289250 kubelet[2906]: I0302 12:53:20.289103 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ba57e322649050bf56d89b6c7932a8e1-kubeconfig\") pod \"kube-controller-manager-srv-zvfam.gb1.brightbox.com\" (UID: \"ba57e322649050bf56d89b6c7932a8e1\") " pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.289250 kubelet[2906]: I0302 12:53:20.289131 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e7c4012fbdda6341bd8d59d396acc654-kubeconfig\") pod \"kube-scheduler-srv-zvfam.gb1.brightbox.com\" (UID: \"e7c4012fbdda6341bd8d59d396acc654\") " pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.290061 kubelet[2906]: I0302 12:53:20.289156 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f76c54020f466e13365037ce88cf8b2f-ca-certs\") pod \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" (UID: \"f76c54020f466e13365037ce88cf8b2f\") " pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.290061 kubelet[2906]: I0302 12:53:20.289183 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f76c54020f466e13365037ce88cf8b2f-k8s-certs\") pod \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" (UID: \"f76c54020f466e13365037ce88cf8b2f\") " pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.713132 kubelet[2906]: I0302 12:53:20.712717 2906 apiserver.go:52] "Watching apiserver" Mar 2 12:53:20.779190 kubelet[2906]: I0302 12:53:20.779100 2906 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 2 12:53:20.942782 kubelet[2906]: I0302 12:53:20.940432 2906 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:20.987849 kubelet[2906]: I0302 12:53:20.987680 2906 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 2 12:53:20.987849 kubelet[2906]: E0302 12:53:20.987785 2906 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-zvfam.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" Mar 2 12:53:21.007480 kubelet[2906]: I0302 12:53:21.007161 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-zvfam.gb1.brightbox.com" podStartSLOduration=1.007126814 podStartE2EDuration="1.007126814s" podCreationTimestamp="2026-03-02 12:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:53:21.007102744 +0000 UTC m=+1.430909806" watchObservedRunningTime="2026-03-02 12:53:21.007126814 +0000 UTC m=+1.430933855" Mar 2 12:53:21.046302 kubelet[2906]: I0302 12:53:21.046107 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-zvfam.gb1.brightbox.com" podStartSLOduration=1.04607586 podStartE2EDuration="1.04607586s" podCreationTimestamp="2026-03-02 12:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:53:21.022978805 +0000 UTC m=+1.446785862" watchObservedRunningTime="2026-03-02 12:53:21.04607586 +0000 UTC m=+1.469882896" Mar 2 12:53:21.062475 kubelet[2906]: I0302 12:53:21.062250 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-zvfam.gb1.brightbox.com" podStartSLOduration=4.062221312 podStartE2EDuration="4.062221312s" podCreationTimestamp="2026-03-02 12:53:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:53:21.047328297 +0000 UTC m=+1.471135354" watchObservedRunningTime="2026-03-02 12:53:21.062221312 +0000 UTC m=+1.486028353" Mar 2 12:53:24.492347 kubelet[2906]: I0302 12:53:24.492221 2906 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 2 12:53:24.493807 containerd[1566]: time="2026-03-02T12:53:24.493547027Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 2 12:53:24.495680 kubelet[2906]: I0302 12:53:24.493784 2906 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 2 12:53:25.538638 systemd[1]: Created slice kubepods-besteffort-podcd806d95_6889_4935_9746_e8a8aa592263.slice - libcontainer container kubepods-besteffort-podcd806d95_6889_4935_9746_e8a8aa592263.slice. Mar 2 12:53:25.627270 kubelet[2906]: I0302 12:53:25.627205 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/cd806d95-6889-4935-9746-e8a8aa592263-kube-proxy\") pod \"kube-proxy-48sz9\" (UID: \"cd806d95-6889-4935-9746-e8a8aa592263\") " pod="kube-system/kube-proxy-48sz9" Mar 2 12:53:25.628184 kubelet[2906]: I0302 12:53:25.628094 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd806d95-6889-4935-9746-e8a8aa592263-lib-modules\") pod \"kube-proxy-48sz9\" (UID: \"cd806d95-6889-4935-9746-e8a8aa592263\") " pod="kube-system/kube-proxy-48sz9" Mar 2 12:53:25.628374 kubelet[2906]: I0302 12:53:25.628320 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjr78\" (UniqueName: \"kubernetes.io/projected/cd806d95-6889-4935-9746-e8a8aa592263-kube-api-access-bjr78\") pod \"kube-proxy-48sz9\" (UID: \"cd806d95-6889-4935-9746-e8a8aa592263\") " pod="kube-system/kube-proxy-48sz9" Mar 2 12:53:25.628652 kubelet[2906]: I0302 12:53:25.628536 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/cd806d95-6889-4935-9746-e8a8aa592263-xtables-lock\") pod \"kube-proxy-48sz9\" (UID: \"cd806d95-6889-4935-9746-e8a8aa592263\") " pod="kube-system/kube-proxy-48sz9" Mar 2 12:53:25.734514 systemd[1]: Created slice kubepods-besteffort-pod27925959_cc08_459e_aa8e_adb91639dccf.slice - 
libcontainer container kubepods-besteffort-pod27925959_cc08_459e_aa8e_adb91639dccf.slice. Mar 2 12:53:25.831008 kubelet[2906]: I0302 12:53:25.830387 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72kxf\" (UniqueName: \"kubernetes.io/projected/27925959-cc08-459e-aa8e-adb91639dccf-kube-api-access-72kxf\") pod \"tigera-operator-7d4578d8d-q88pq\" (UID: \"27925959-cc08-459e-aa8e-adb91639dccf\") " pod="tigera-operator/tigera-operator-7d4578d8d-q88pq" Mar 2 12:53:25.831008 kubelet[2906]: I0302 12:53:25.830519 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/27925959-cc08-459e-aa8e-adb91639dccf-var-lib-calico\") pod \"tigera-operator-7d4578d8d-q88pq\" (UID: \"27925959-cc08-459e-aa8e-adb91639dccf\") " pod="tigera-operator/tigera-operator-7d4578d8d-q88pq" Mar 2 12:53:25.859157 containerd[1566]: time="2026-03-02T12:53:25.859060034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-48sz9,Uid:cd806d95-6889-4935-9746-e8a8aa592263,Namespace:kube-system,Attempt:0,}" Mar 2 12:53:25.887487 containerd[1566]: time="2026-03-02T12:53:25.887145943Z" level=info msg="connecting to shim 024c211a737207d22aa9c21fc1d87170e2639f550455c810db80e3568184b62d" address="unix:///run/containerd/s/57030ead9c815bc812880a3bd4f30a2ba1cad57a4e8c13c98f2595ea982f013d" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:53:25.933738 systemd[1]: Started cri-containerd-024c211a737207d22aa9c21fc1d87170e2639f550455c810db80e3568184b62d.scope - libcontainer container 024c211a737207d22aa9c21fc1d87170e2639f550455c810db80e3568184b62d. 
Mar 2 12:53:25.995510 containerd[1566]: time="2026-03-02T12:53:25.995438682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-48sz9,Uid:cd806d95-6889-4935-9746-e8a8aa592263,Namespace:kube-system,Attempt:0,} returns sandbox id \"024c211a737207d22aa9c21fc1d87170e2639f550455c810db80e3568184b62d\"" Mar 2 12:53:26.003409 containerd[1566]: time="2026-03-02T12:53:26.002432135Z" level=info msg="CreateContainer within sandbox \"024c211a737207d22aa9c21fc1d87170e2639f550455c810db80e3568184b62d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 2 12:53:26.016720 containerd[1566]: time="2026-03-02T12:53:26.016667927Z" level=info msg="Container 3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:53:26.024915 containerd[1566]: time="2026-03-02T12:53:26.024853819Z" level=info msg="CreateContainer within sandbox \"024c211a737207d22aa9c21fc1d87170e2639f550455c810db80e3568184b62d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b\"" Mar 2 12:53:26.026621 containerd[1566]: time="2026-03-02T12:53:26.026259341Z" level=info msg="StartContainer for \"3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b\"" Mar 2 12:53:26.029297 containerd[1566]: time="2026-03-02T12:53:26.029241331Z" level=info msg="connecting to shim 3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b" address="unix:///run/containerd/s/57030ead9c815bc812880a3bd4f30a2ba1cad57a4e8c13c98f2595ea982f013d" protocol=ttrpc version=3 Mar 2 12:53:26.039611 containerd[1566]: time="2026-03-02T12:53:26.039561464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d4578d8d-q88pq,Uid:27925959-cc08-459e-aa8e-adb91639dccf,Namespace:tigera-operator,Attempt:0,}" Mar 2 12:53:26.067781 systemd[1]: Started cri-containerd-3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b.scope - 
libcontainer container 3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b. Mar 2 12:53:26.081649 containerd[1566]: time="2026-03-02T12:53:26.081388902Z" level=info msg="connecting to shim 3040319a15c402de5f23d7528726bbaab1950e7d4b09c253a151d9809f743c4a" address="unix:///run/containerd/s/5eeb3fddc76f04c922a70d37e3df362d27b97fcae614d3d10677bec05f3c2d3d" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:53:26.137807 systemd[1]: Started cri-containerd-3040319a15c402de5f23d7528726bbaab1950e7d4b09c253a151d9809f743c4a.scope - libcontainer container 3040319a15c402de5f23d7528726bbaab1950e7d4b09c253a151d9809f743c4a. Mar 2 12:53:26.237308 containerd[1566]: time="2026-03-02T12:53:26.237111167Z" level=info msg="StartContainer for \"3c2571c3ed7abb0ad2b173cd35d803bd4659cb7ecd76acbb85311a43dd2a904b\" returns successfully" Mar 2 12:53:26.251089 containerd[1566]: time="2026-03-02T12:53:26.250876714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d4578d8d-q88pq,Uid:27925959-cc08-459e-aa8e-adb91639dccf,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3040319a15c402de5f23d7528726bbaab1950e7d4b09c253a151d9809f743c4a\"" Mar 2 12:53:26.254638 containerd[1566]: time="2026-03-02T12:53:26.254379907Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.3\"" Mar 2 12:53:28.428140 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount800160633.mount: Deactivated successfully. 
Mar 2 12:53:30.330207 containerd[1566]: time="2026-03-02T12:53:30.330119646Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:53:30.331854 containerd[1566]: time="2026-03-02T12:53:30.331822380Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.3: active requests=0, bytes read=40822719"
Mar 2 12:53:30.339963 containerd[1566]: time="2026-03-02T12:53:30.339864522Z" level=info msg="ImageCreate event name:\"sha256:de15454df5913bb69360783a4d76287caf2c87324eed18162e79d4c06a4c8896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:53:30.343364 containerd[1566]: time="2026-03-02T12:53:30.343303159Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3b1a6762e1f3fae8490773b8f06ddd1e6775850febbece4d6002416f39adc670\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:53:30.344368 containerd[1566]: time="2026-03-02T12:53:30.344315301Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.3\" with image id \"sha256:de15454df5913bb69360783a4d76287caf2c87324eed18162e79d4c06a4c8896\", repo tag \"quay.io/tigera/operator:v1.40.3\", repo digest \"quay.io/tigera/operator@sha256:3b1a6762e1f3fae8490773b8f06ddd1e6775850febbece4d6002416f39adc670\", size \"40818714\" in 4.08989205s"
Mar 2 12:53:30.344368 containerd[1566]: time="2026-03-02T12:53:30.344365363Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.3\" returns image reference \"sha256:de15454df5913bb69360783a4d76287caf2c87324eed18162e79d4c06a4c8896\""
Mar 2 12:53:30.356002 containerd[1566]: time="2026-03-02T12:53:30.355939005Z" level=info msg="CreateContainer within sandbox \"3040319a15c402de5f23d7528726bbaab1950e7d4b09c253a151d9809f743c4a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 2 12:53:30.367475 containerd[1566]: time="2026-03-02T12:53:30.365991284Z" level=info msg="Container 0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:53:30.376629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2182783461.mount: Deactivated successfully.
Mar 2 12:53:30.384532 containerd[1566]: time="2026-03-02T12:53:30.383579893Z" level=info msg="CreateContainer within sandbox \"3040319a15c402de5f23d7528726bbaab1950e7d4b09c253a151d9809f743c4a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687\""
Mar 2 12:53:30.389348 containerd[1566]: time="2026-03-02T12:53:30.385715340Z" level=info msg="StartContainer for \"0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687\""
Mar 2 12:53:30.389348 containerd[1566]: time="2026-03-02T12:53:30.388879224Z" level=info msg="connecting to shim 0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687" address="unix:///run/containerd/s/5eeb3fddc76f04c922a70d37e3df362d27b97fcae614d3d10677bec05f3c2d3d" protocol=ttrpc version=3
Mar 2 12:53:30.431713 systemd[1]: Started cri-containerd-0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687.scope - libcontainer container 0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687.
Mar 2 12:53:30.486974 containerd[1566]: time="2026-03-02T12:53:30.486883696Z" level=info msg="StartContainer for \"0609e38af8c4d351e7fb7f035eb788bca7c6c5b31d7ca4d39f848c73767f2687\" returns successfully"
Mar 2 12:53:31.018269 kubelet[2906]: I0302 12:53:31.018036 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7d4578d8d-q88pq" podStartSLOduration=1.900811299 podStartE2EDuration="5.993441918s" podCreationTimestamp="2026-03-02 12:53:25 +0000 UTC" firstStartedPulling="2026-03-02 12:53:26.252904724 +0000 UTC m=+6.676711757" lastFinishedPulling="2026-03-02 12:53:30.345535347 +0000 UTC m=+10.769342376" observedRunningTime="2026-03-02 12:53:30.992443481 +0000 UTC m=+11.416250527" watchObservedRunningTime="2026-03-02 12:53:30.993441918 +0000 UTC m=+11.417248960"
Mar 2 12:53:31.024329 kubelet[2906]: I0302 12:53:31.023276 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-48sz9" podStartSLOduration=6.023127143 podStartE2EDuration="6.023127143s" podCreationTimestamp="2026-03-02 12:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:53:26.980651581 +0000 UTC m=+7.404458654" watchObservedRunningTime="2026-03-02 12:53:31.023127143 +0000 UTC m=+11.446934189"
Mar 2 12:53:38.149744 sudo[1884]: pam_unix(sudo:session): session closed for user root
Mar 2 12:53:38.249481 sshd[1883]: Connection closed by 68.220.241.50 port 36108
Mar 2 12:53:38.249708 sshd-session[1880]: pam_unix(sshd:session): session closed for user core
Mar 2 12:53:38.269273 systemd[1]: sshd@8-10.243.74.166:22-68.220.241.50:36108.service: Deactivated successfully.
Mar 2 12:53:38.269347 systemd-logind[1549]: Session 11 logged out. Waiting for processes to exit.
Mar 2 12:53:38.283517 systemd[1]: session-11.scope: Deactivated successfully.
Mar 2 12:53:38.284568 systemd[1]: session-11.scope: Consumed 8.431s CPU time, 159.4M memory peak.
Mar 2 12:53:38.302621 systemd-logind[1549]: Removed session 11.
Mar 2 12:53:42.535221 systemd[1]: Created slice kubepods-besteffort-podc239fd01_cca9_42ec_8ff4_ec474c02634d.slice - libcontainer container kubepods-besteffort-podc239fd01_cca9_42ec_8ff4_ec474c02634d.slice.
Mar 2 12:53:42.576539 kubelet[2906]: I0302 12:53:42.576203 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ddm5\" (UniqueName: \"kubernetes.io/projected/c239fd01-cca9-42ec-8ff4-ec474c02634d-kube-api-access-7ddm5\") pod \"calico-typha-bb965cdbd-xhzsx\" (UID: \"c239fd01-cca9-42ec-8ff4-ec474c02634d\") " pod="calico-system/calico-typha-bb965cdbd-xhzsx"
Mar 2 12:53:42.578530 kubelet[2906]: I0302 12:53:42.577763 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c239fd01-cca9-42ec-8ff4-ec474c02634d-tigera-ca-bundle\") pod \"calico-typha-bb965cdbd-xhzsx\" (UID: \"c239fd01-cca9-42ec-8ff4-ec474c02634d\") " pod="calico-system/calico-typha-bb965cdbd-xhzsx"
Mar 2 12:53:42.578530 kubelet[2906]: I0302 12:53:42.577810 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c239fd01-cca9-42ec-8ff4-ec474c02634d-typha-certs\") pod \"calico-typha-bb965cdbd-xhzsx\" (UID: \"c239fd01-cca9-42ec-8ff4-ec474c02634d\") " pod="calico-system/calico-typha-bb965cdbd-xhzsx"
Mar 2 12:53:42.660324 systemd[1]: Created slice kubepods-besteffort-podfd6e2129_cad0_49c3_a6af_11c8cca20a49.slice - libcontainer container kubepods-besteffort-podfd6e2129_cad0_49c3_a6af_11c8cca20a49.slice.
Mar 2 12:53:42.764866 kubelet[2906]: E0302 12:53:42.764780 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc"
Mar 2 12:53:42.778658 kubelet[2906]: I0302 12:53:42.778534 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-var-lib-calico\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.779134 kubelet[2906]: I0302 12:53:42.778868 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47z8\" (UniqueName: \"kubernetes.io/projected/fd6e2129-cad0-49c3-a6af-11c8cca20a49-kube-api-access-g47z8\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.779134 kubelet[2906]: I0302 12:53:42.778950 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-nodeproc\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780472 kubelet[2906]: I0302 12:53:42.779315 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-cni-bin-dir\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780472 kubelet[2906]: I0302 12:53:42.779726 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-policysync\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780472 kubelet[2906]: I0302 12:53:42.779762 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd6e2129-cad0-49c3-a6af-11c8cca20a49-tigera-ca-bundle\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780472 kubelet[2906]: I0302 12:53:42.779821 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fd6e2129-cad0-49c3-a6af-11c8cca20a49-node-certs\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780472 kubelet[2906]: I0302 12:53:42.779911 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-cni-net-dir\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780722 kubelet[2906]: I0302 12:53:42.779965 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-lib-modules\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780722 kubelet[2906]: I0302 12:53:42.779997 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-sys-fs\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780722 kubelet[2906]: I0302 12:53:42.780036 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-var-run-calico\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780722 kubelet[2906]: I0302 12:53:42.780079 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-xtables-lock\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.780722 kubelet[2906]: I0302 12:53:42.780129 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-bpffs\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.781392 kubelet[2906]: I0302 12:53:42.780201 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-flexvol-driver-host\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.781392 kubelet[2906]: I0302 12:53:42.780235 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fd6e2129-cad0-49c3-a6af-11c8cca20a49-cni-log-dir\") pod \"calico-node-jzm77\" (UID: \"fd6e2129-cad0-49c3-a6af-11c8cca20a49\") " pod="calico-system/calico-node-jzm77"
Mar 2 12:53:42.871207 containerd[1566]: time="2026-03-02T12:53:42.869910001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bb965cdbd-xhzsx,Uid:c239fd01-cca9-42ec-8ff4-ec474c02634d,Namespace:calico-system,Attempt:0,}"
Mar 2 12:53:42.882242 kubelet[2906]: I0302 12:53:42.882183 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/abbdfe3c-56e5-4932-b650-1489a1c6d2bc-socket-dir\") pod \"csi-node-driver-56k4g\" (UID: \"abbdfe3c-56e5-4932-b650-1489a1c6d2bc\") " pod="calico-system/csi-node-driver-56k4g"
Mar 2 12:53:42.883474 kubelet[2906]: I0302 12:53:42.882952 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2xv2\" (UniqueName: \"kubernetes.io/projected/abbdfe3c-56e5-4932-b650-1489a1c6d2bc-kube-api-access-v2xv2\") pod \"csi-node-driver-56k4g\" (UID: \"abbdfe3c-56e5-4932-b650-1489a1c6d2bc\") " pod="calico-system/csi-node-driver-56k4g"
Mar 2 12:53:42.883474 kubelet[2906]: I0302 12:53:42.883079 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abbdfe3c-56e5-4932-b650-1489a1c6d2bc-kubelet-dir\") pod \"csi-node-driver-56k4g\" (UID: \"abbdfe3c-56e5-4932-b650-1489a1c6d2bc\") " pod="calico-system/csi-node-driver-56k4g"
Mar 2 12:53:42.883474 kubelet[2906]: I0302 12:53:42.883130 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/abbdfe3c-56e5-4932-b650-1489a1c6d2bc-varrun\") pod \"csi-node-driver-56k4g\" (UID: \"abbdfe3c-56e5-4932-b650-1489a1c6d2bc\") " pod="calico-system/csi-node-driver-56k4g"
Mar 2 12:53:42.883474 kubelet[2906]: I0302 12:53:42.883231 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/abbdfe3c-56e5-4932-b650-1489a1c6d2bc-registration-dir\") pod \"csi-node-driver-56k4g\" (UID: \"abbdfe3c-56e5-4932-b650-1489a1c6d2bc\") " pod="calico-system/csi-node-driver-56k4g"
Mar 2 12:53:42.888287 kubelet[2906]: E0302 12:53:42.888079 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.888785 kubelet[2906]: W0302 12:53:42.888654 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.889057 kubelet[2906]: E0302 12:53:42.889008 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.890925 kubelet[2906]: E0302 12:53:42.890903 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.891177 kubelet[2906]: W0302 12:53:42.891017 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.891177 kubelet[2906]: E0302 12:53:42.891058 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.892400 kubelet[2906]: E0302 12:53:42.892166 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.892400 kubelet[2906]: W0302 12:53:42.892186 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.892400 kubelet[2906]: E0302 12:53:42.892243 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.894583 kubelet[2906]: E0302 12:53:42.894198 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.894583 kubelet[2906]: W0302 12:53:42.894314 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.894583 kubelet[2906]: E0302 12:53:42.894337 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.908490 kubelet[2906]: E0302 12:53:42.908414 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.908876 kubelet[2906]: W0302 12:53:42.908545 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.908876 kubelet[2906]: E0302 12:53:42.908584 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.909210 kubelet[2906]: E0302 12:53:42.909190 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.909321 kubelet[2906]: W0302 12:53:42.909301 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.909501 kubelet[2906]: E0302 12:53:42.909425 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.921905 kubelet[2906]: E0302 12:53:42.921871 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.922892 kubelet[2906]: W0302 12:53:42.922741 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.922892 kubelet[2906]: E0302 12:53:42.922788 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.940757 containerd[1566]: time="2026-03-02T12:53:42.940584464Z" level=info msg="connecting to shim a569859138e149589219fd128f544d62e431327ac9ab66fe57f55fffb6576194" address="unix:///run/containerd/s/0a3223038ef86b3f4f223981e3000dc2ff338fff7095da674b5df8d8437664df" namespace=k8s.io protocol=ttrpc version=3
Mar 2 12:53:42.967210 containerd[1566]: time="2026-03-02T12:53:42.967124935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jzm77,Uid:fd6e2129-cad0-49c3-a6af-11c8cca20a49,Namespace:calico-system,Attempt:0,}"
Mar 2 12:53:42.985079 kubelet[2906]: E0302 12:53:42.984869 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.985079 kubelet[2906]: W0302 12:53:42.984907 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.985079 kubelet[2906]: E0302 12:53:42.984956 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.985800 kubelet[2906]: E0302 12:53:42.985772 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.985945 kubelet[2906]: W0302 12:53:42.985899 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.985945 kubelet[2906]: E0302 12:53:42.985924 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.986567 kubelet[2906]: E0302 12:53:42.986508 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.986567 kubelet[2906]: W0302 12:53:42.986528 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.986876 kubelet[2906]: E0302 12:53:42.986544 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.987309 kubelet[2906]: E0302 12:53:42.987279 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.987941 kubelet[2906]: W0302 12:53:42.987408 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.987786 systemd[1]: Started cri-containerd-a569859138e149589219fd128f544d62e431327ac9ab66fe57f55fffb6576194.scope - libcontainer container a569859138e149589219fd128f544d62e431327ac9ab66fe57f55fffb6576194.
Mar 2 12:53:42.988798 kubelet[2906]: E0302 12:53:42.987433 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.990086 kubelet[2906]: E0302 12:53:42.989898 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.990086 kubelet[2906]: W0302 12:53:42.990036 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.990086 kubelet[2906]: E0302 12:53:42.990057 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.990822 kubelet[2906]: E0302 12:53:42.990758 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.990822 kubelet[2906]: W0302 12:53:42.990777 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.990822 kubelet[2906]: E0302 12:53:42.990793 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.991503 kubelet[2906]: E0302 12:53:42.991361 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.991503 kubelet[2906]: W0302 12:53:42.991409 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.991503 kubelet[2906]: E0302 12:53:42.991425 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.992387 kubelet[2906]: E0302 12:53:42.992366 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.992679 kubelet[2906]: W0302 12:53:42.992489 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.992679 kubelet[2906]: E0302 12:53:42.992510 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.993128 kubelet[2906]: E0302 12:53:42.993059 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.993128 kubelet[2906]: W0302 12:53:42.993078 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.993128 kubelet[2906]: E0302 12:53:42.993095 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.993976 kubelet[2906]: E0302 12:53:42.993911 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.993976 kubelet[2906]: W0302 12:53:42.993931 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.993976 kubelet[2906]: E0302 12:53:42.993947 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.994837 kubelet[2906]: E0302 12:53:42.994712 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.994837 kubelet[2906]: W0302 12:53:42.994731 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.994837 kubelet[2906]: E0302 12:53:42.994746 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.995577 kubelet[2906]: E0302 12:53:42.995557 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.995834 kubelet[2906]: W0302 12:53:42.995693 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.995834 kubelet[2906]: E0302 12:53:42.995719 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.996295 kubelet[2906]: E0302 12:53:42.996244 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.996295 kubelet[2906]: W0302 12:53:42.996261 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.996547 kubelet[2906]: E0302 12:53:42.996278 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:42.998492 containerd[1566]: time="2026-03-02T12:53:42.998396159Z" level=info msg="connecting to shim 313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9" address="unix:///run/containerd/s/fd6c9dcd0200759592f322700131ef14c85d69cf6b3159ed12cdfd9d32fc5f5c" namespace=k8s.io protocol=ttrpc version=3
Mar 2 12:53:42.999622 kubelet[2906]: E0302 12:53:42.999181 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:42.999622 kubelet[2906]: W0302 12:53:42.999515 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:42.999622 kubelet[2906]: E0302 12:53:42.999533 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.000667 kubelet[2906]: E0302 12:53:43.000552 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.000667 kubelet[2906]: W0302 12:53:43.000624 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.000667 kubelet[2906]: E0302 12:53:43.000644 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.002823 kubelet[2906]: E0302 12:53:43.002803 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.003086 kubelet[2906]: W0302 12:53:43.003010 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.003086 kubelet[2906]: E0302 12:53:43.003049 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.004042 kubelet[2906]: E0302 12:53:43.004010 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.004300 kubelet[2906]: W0302 12:53:43.004200 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.004300 kubelet[2906]: E0302 12:53:43.004226 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.005044 kubelet[2906]: E0302 12:53:43.004987 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.005253 kubelet[2906]: W0302 12:53:43.005201 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.005253 kubelet[2906]: E0302 12:53:43.005231 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.006839 kubelet[2906]: E0302 12:53:43.006538 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.006839 kubelet[2906]: W0302 12:53:43.006732 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.006839 kubelet[2906]: E0302 12:53:43.006758 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.008558 kubelet[2906]: E0302 12:53:43.008498 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.008848 kubelet[2906]: W0302 12:53:43.008752 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.008848 kubelet[2906]: E0302 12:53:43.008778 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 2 12:53:43.009758 kubelet[2906]: E0302 12:53:43.009535 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 2 12:53:43.009758 kubelet[2906]: W0302 12:53:43.009557 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 2 12:53:43.009758 kubelet[2906]: E0302 12:53:43.009574 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 2 12:53:43.011142 kubelet[2906]: E0302 12:53:43.010885 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:43.011142 kubelet[2906]: W0302 12:53:43.010904 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:43.011142 kubelet[2906]: E0302 12:53:43.010920 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:43.011946 kubelet[2906]: E0302 12:53:43.011759 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:43.011946 kubelet[2906]: W0302 12:53:43.011779 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:43.011946 kubelet[2906]: E0302 12:53:43.011795 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:43.012992 kubelet[2906]: E0302 12:53:43.012971 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:43.013274 kubelet[2906]: W0302 12:53:43.013093 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:43.013274 kubelet[2906]: E0302 12:53:43.013118 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:43.015263 kubelet[2906]: E0302 12:53:43.015172 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:43.015263 kubelet[2906]: W0302 12:53:43.015193 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:43.015263 kubelet[2906]: E0302 12:53:43.015210 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:43.044791 kubelet[2906]: E0302 12:53:43.044738 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:43.045115 kubelet[2906]: W0302 12:53:43.045087 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:43.045256 kubelet[2906]: E0302 12:53:43.045231 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:43.075950 systemd[1]: Started cri-containerd-313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9.scope - libcontainer container 313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9. Mar 2 12:53:43.110474 containerd[1566]: time="2026-03-02T12:53:43.110312778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bb965cdbd-xhzsx,Uid:c239fd01-cca9-42ec-8ff4-ec474c02634d,Namespace:calico-system,Attempt:0,} returns sandbox id \"a569859138e149589219fd128f544d62e431327ac9ab66fe57f55fffb6576194\"" Mar 2 12:53:43.115072 containerd[1566]: time="2026-03-02T12:53:43.114559325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.3\"" Mar 2 12:53:43.156781 containerd[1566]: time="2026-03-02T12:53:43.154769195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jzm77,Uid:fd6e2129-cad0-49c3-a6af-11c8cca20a49,Namespace:calico-system,Attempt:0,} returns sandbox id \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\"" Mar 2 12:53:44.896555 kubelet[2906]: E0302 12:53:44.895797 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni 
plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:45.294117 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3478398527.mount: Deactivated successfully. Mar 2 12:53:46.859667 containerd[1566]: time="2026-03-02T12:53:46.859595989Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:46.863474 containerd[1566]: time="2026-03-02T12:53:46.862564114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.3: active requests=0, bytes read=36094696" Mar 2 12:53:46.866347 containerd[1566]: time="2026-03-02T12:53:46.864741924Z" level=info msg="ImageCreate event name:\"sha256:0aa5de4a226c8dff91be273305b5e55a8b7019ef516599fd15c7e4434085cd65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:46.892724 containerd[1566]: time="2026-03-02T12:53:46.892642957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:3e62cf98a20c42a1786397d0192cfb639634ef95c6f463ab92f0439a5c1a4ae5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:46.893730 containerd[1566]: time="2026-03-02T12:53:46.893689823Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.3\" with image id \"sha256:0aa5de4a226c8dff91be273305b5e55a8b7019ef516599fd15c7e4434085cd65\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:3e62cf98a20c42a1786397d0192cfb639634ef95c6f463ab92f0439a5c1a4ae5\", size \"36094550\" in 3.779087471s" Mar 2 12:53:46.893806 containerd[1566]: time="2026-03-02T12:53:46.893738813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.3\" returns image reference \"sha256:0aa5de4a226c8dff91be273305b5e55a8b7019ef516599fd15c7e4434085cd65\"" Mar 2 12:53:46.908636 kubelet[2906]: E0302 12:53:46.908486 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network 
is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:46.909855 containerd[1566]: time="2026-03-02T12:53:46.909628992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\"" Mar 2 12:53:46.951301 containerd[1566]: time="2026-03-02T12:53:46.951209507Z" level=info msg="CreateContainer within sandbox \"a569859138e149589219fd128f544d62e431327ac9ab66fe57f55fffb6576194\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 2 12:53:46.984809 containerd[1566]: time="2026-03-02T12:53:46.984755479Z" level=info msg="Container 7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:53:46.991195 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount812604731.mount: Deactivated successfully. 
Mar 2 12:53:47.003480 containerd[1566]: time="2026-03-02T12:53:47.003327369Z" level=info msg="CreateContainer within sandbox \"a569859138e149589219fd128f544d62e431327ac9ab66fe57f55fffb6576194\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0\"" Mar 2 12:53:47.005372 containerd[1566]: time="2026-03-02T12:53:47.005194209Z" level=info msg="StartContainer for \"7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0\"" Mar 2 12:53:47.007594 containerd[1566]: time="2026-03-02T12:53:47.007507959Z" level=info msg="connecting to shim 7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0" address="unix:///run/containerd/s/0a3223038ef86b3f4f223981e3000dc2ff338fff7095da674b5df8d8437664df" protocol=ttrpc version=3 Mar 2 12:53:47.052804 systemd[1]: Started cri-containerd-7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0.scope - libcontainer container 7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0. 
Mar 2 12:53:47.153475 containerd[1566]: time="2026-03-02T12:53:47.153068552Z" level=info msg="StartContainer for \"7f4953be4b8c68611d53815ba1bb49ed84117d0643c9a925406a9b4f45540ee0\" returns successfully" Mar 2 12:53:48.122701 kubelet[2906]: I0302 12:53:48.122598 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-bb965cdbd-xhzsx" podStartSLOduration=2.329603083 podStartE2EDuration="6.122570133s" podCreationTimestamp="2026-03-02 12:53:42 +0000 UTC" firstStartedPulling="2026-03-02 12:53:43.113886153 +0000 UTC m=+23.537693188" lastFinishedPulling="2026-03-02 12:53:46.906853203 +0000 UTC m=+27.330660238" observedRunningTime="2026-03-02 12:53:48.12110186 +0000 UTC m=+28.544908937" watchObservedRunningTime="2026-03-02 12:53:48.122570133 +0000 UTC m=+28.546377162" Mar 2 12:53:48.164901 kubelet[2906]: E0302 12:53:48.164837 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.164901 kubelet[2906]: W0302 12:53:48.164889 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.178226 kubelet[2906]: E0302 12:53:48.178154 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.178702 kubelet[2906]: E0302 12:53:48.178641 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.178702 kubelet[2906]: W0302 12:53:48.178691 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.178725 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.179206 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.181306 kubelet[2906]: W0302 12:53:48.179220 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.179236 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.179696 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.181306 kubelet[2906]: W0302 12:53:48.179711 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.179728 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.180099 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.181306 kubelet[2906]: W0302 12:53:48.180118 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.181306 kubelet[2906]: E0302 12:53:48.180133 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.182845 kubelet[2906]: E0302 12:53:48.180797 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.182845 kubelet[2906]: W0302 12:53:48.180813 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.182845 kubelet[2906]: E0302 12:53:48.180829 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.182845 kubelet[2906]: E0302 12:53:48.182622 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.182845 kubelet[2906]: W0302 12:53:48.182638 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.182845 kubelet[2906]: E0302 12:53:48.182654 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.183635 kubelet[2906]: E0302 12:53:48.183164 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.183635 kubelet[2906]: W0302 12:53:48.183179 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.183635 kubelet[2906]: E0302 12:53:48.183195 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.183950 kubelet[2906]: E0302 12:53:48.183926 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.183950 kubelet[2906]: W0302 12:53:48.183947 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.184061 kubelet[2906]: E0302 12:53:48.183974 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.184890 kubelet[2906]: E0302 12:53:48.184854 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.184890 kubelet[2906]: W0302 12:53:48.184881 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.185016 kubelet[2906]: E0302 12:53:48.184899 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.185260 kubelet[2906]: E0302 12:53:48.185227 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.185260 kubelet[2906]: W0302 12:53:48.185251 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.185373 kubelet[2906]: E0302 12:53:48.185272 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.185891 kubelet[2906]: E0302 12:53:48.185861 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.186017 kubelet[2906]: W0302 12:53:48.185927 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.186017 kubelet[2906]: E0302 12:53:48.185946 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.186689 kubelet[2906]: E0302 12:53:48.186597 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.186689 kubelet[2906]: W0302 12:53:48.186624 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.186689 kubelet[2906]: E0302 12:53:48.186640 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.187029 kubelet[2906]: E0302 12:53:48.186910 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.187029 kubelet[2906]: W0302 12:53:48.186923 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.187029 kubelet[2906]: E0302 12:53:48.186938 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.187737 kubelet[2906]: E0302 12:53:48.187677 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.187737 kubelet[2906]: W0302 12:53:48.187695 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.187737 kubelet[2906]: E0302 12:53:48.187715 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.254884 kubelet[2906]: E0302 12:53:48.254835 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.254884 kubelet[2906]: W0302 12:53:48.254873 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.255175 kubelet[2906]: E0302 12:53:48.254906 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.255278 kubelet[2906]: E0302 12:53:48.255254 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.255331 kubelet[2906]: W0302 12:53:48.255279 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.255331 kubelet[2906]: E0302 12:53:48.255296 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.255724 kubelet[2906]: E0302 12:53:48.255692 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.255724 kubelet[2906]: W0302 12:53:48.255714 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.255868 kubelet[2906]: E0302 12:53:48.255731 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.256076 kubelet[2906]: E0302 12:53:48.256055 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.256076 kubelet[2906]: W0302 12:53:48.256075 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.256192 kubelet[2906]: E0302 12:53:48.256091 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.256390 kubelet[2906]: E0302 12:53:48.256371 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.256390 kubelet[2906]: W0302 12:53:48.256389 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.256525 kubelet[2906]: E0302 12:53:48.256404 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.256824 kubelet[2906]: E0302 12:53:48.256804 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.256824 kubelet[2906]: W0302 12:53:48.256823 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.256925 kubelet[2906]: E0302 12:53:48.256840 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.257164 kubelet[2906]: E0302 12:53:48.257146 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.257164 kubelet[2906]: W0302 12:53:48.257164 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.257300 kubelet[2906]: E0302 12:53:48.257178 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.257492 kubelet[2906]: E0302 12:53:48.257472 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.257492 kubelet[2906]: W0302 12:53:48.257490 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.257594 kubelet[2906]: E0302 12:53:48.257506 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.257827 kubelet[2906]: E0302 12:53:48.257806 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.257827 kubelet[2906]: W0302 12:53:48.257824 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.257932 kubelet[2906]: E0302 12:53:48.257840 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.258338 kubelet[2906]: E0302 12:53:48.258305 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.258338 kubelet[2906]: W0302 12:53:48.258329 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.258445 kubelet[2906]: E0302 12:53:48.258344 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.258615 kubelet[2906]: E0302 12:53:48.258599 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.258615 kubelet[2906]: W0302 12:53:48.258612 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.258739 kubelet[2906]: E0302 12:53:48.258626 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.258916 kubelet[2906]: E0302 12:53:48.258895 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.258916 kubelet[2906]: W0302 12:53:48.258914 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.259060 kubelet[2906]: E0302 12:53:48.258929 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.259197 kubelet[2906]: E0302 12:53:48.259176 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.259197 kubelet[2906]: W0302 12:53:48.259195 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.259306 kubelet[2906]: E0302 12:53:48.259210 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.259522 kubelet[2906]: E0302 12:53:48.259503 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.259522 kubelet[2906]: W0302 12:53:48.259521 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.259639 kubelet[2906]: E0302 12:53:48.259536 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.260111 kubelet[2906]: E0302 12:53:48.260079 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.260111 kubelet[2906]: W0302 12:53:48.260102 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.260224 kubelet[2906]: E0302 12:53:48.260118 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.260370 kubelet[2906]: E0302 12:53:48.260350 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.260370 kubelet[2906]: W0302 12:53:48.260368 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.260501 kubelet[2906]: E0302 12:53:48.260383 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.261165 kubelet[2906]: E0302 12:53:48.261134 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.261165 kubelet[2906]: W0302 12:53:48.261157 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.261277 kubelet[2906]: E0302 12:53:48.261175 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 2 12:53:48.261433 kubelet[2906]: E0302 12:53:48.261402 2906 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 2 12:53:48.261433 kubelet[2906]: W0302 12:53:48.261425 2906 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 2 12:53:48.261568 kubelet[2906]: E0302 12:53:48.261440 2906 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 2 12:53:48.775177 containerd[1566]: time="2026-03-02T12:53:48.774926223Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:48.777219 containerd[1566]: time="2026-03-02T12:53:48.777163548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3: active requests=0, bytes read=4630152" Mar 2 12:53:48.779177 containerd[1566]: time="2026-03-02T12:53:48.777981794Z" level=info msg="ImageCreate event name:\"sha256:ecc2a8ca795d595c3a806abf201d701228ddc7a8373e906441c9470dfeadd022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:48.787106 containerd[1566]: time="2026-03-02T12:53:48.787059227Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:6cdc6cc2f7cdcbd4bf2d9b6a59c03ed98b5c47f22e467d78b5c06e5fd7bff132\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:53:48.787904 containerd[1566]: time="2026-03-02T12:53:48.787861044Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\" with image id \"sha256:ecc2a8ca795d595c3a806abf201d701228ddc7a8373e906441c9470dfeadd022\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:6cdc6cc2f7cdcbd4bf2d9b6a59c03ed98b5c47f22e467d78b5c06e5fd7bff132\", size \"6186157\" in 1.87817271s" Mar 2 12:53:48.787990 containerd[1566]: time="2026-03-02T12:53:48.787911375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.3\" returns image reference \"sha256:ecc2a8ca795d595c3a806abf201d701228ddc7a8373e906441c9470dfeadd022\"" Mar 2 12:53:48.796472 containerd[1566]: time="2026-03-02T12:53:48.796393078Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 2 12:53:48.810602 containerd[1566]: time="2026-03-02T12:53:48.810538609Z" level=info msg="Container f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:53:48.814570 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1354280150.mount: Deactivated successfully. Mar 2 12:53:48.826730 containerd[1566]: time="2026-03-02T12:53:48.826652676Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9\"" Mar 2 12:53:48.827638 containerd[1566]: time="2026-03-02T12:53:48.827578246Z" level=info msg="StartContainer for \"f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9\"" Mar 2 12:53:48.832469 containerd[1566]: time="2026-03-02T12:53:48.832254500Z" level=info msg="connecting to shim f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9" address="unix:///run/containerd/s/fd6c9dcd0200759592f322700131ef14c85d69cf6b3159ed12cdfd9d32fc5f5c" protocol=ttrpc version=3 Mar 2 12:53:48.867808 systemd[1]: Started cri-containerd-f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9.scope - libcontainer container f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9. 
Mar 2 12:53:48.895054 kubelet[2906]: E0302 12:53:48.894921 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:48.977138 containerd[1566]: time="2026-03-02T12:53:48.977079731Z" level=info msg="StartContainer for \"f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9\" returns successfully" Mar 2 12:53:48.999728 systemd[1]: cri-containerd-f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9.scope: Deactivated successfully. Mar 2 12:53:49.040341 containerd[1566]: time="2026-03-02T12:53:49.040154754Z" level=info msg="received container exit event container_id:\"f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9\" id:\"f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9\" pid:3545 exited_at:{seconds:1772456029 nanos:13064573}" Mar 2 12:53:49.083989 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f57273338ab56a751aebffc78e033bfc7ce1e58e4cbe4463f9f7adc4a9797fb9-rootfs.mount: Deactivated successfully. 
Mar 2 12:53:50.106288 containerd[1566]: time="2026-03-02T12:53:50.106195314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.3\"" Mar 2 12:53:50.895180 kubelet[2906]: E0302 12:53:50.894975 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:52.896429 kubelet[2906]: E0302 12:53:52.895355 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:54.895625 kubelet[2906]: E0302 12:53:54.895550 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:56.895354 kubelet[2906]: E0302 12:53:56.895276 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:53:58.895392 kubelet[2906]: E0302 12:53:58.895302 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:00.894994 kubelet[2906]: E0302 12:54:00.894916 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:02.895107 kubelet[2906]: E0302 12:54:02.895030 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:03.389644 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount185866543.mount: Deactivated successfully. Mar 2 12:54:03.440959 containerd[1566]: time="2026-03-02T12:54:03.440690694Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:03.442610 containerd[1566]: time="2026-03-02T12:54:03.442563673Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.3: active requests=0, bytes read=159483365" Mar 2 12:54:03.443716 containerd[1566]: time="2026-03-02T12:54:03.443340065Z" level=info msg="ImageCreate event name:\"sha256:f8495fa3f644ae70c7e5131c7baf23f80864678694dbf1a6a4d0557528433740\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:03.445876 containerd[1566]: time="2026-03-02T12:54:03.445816816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:c7aefc80042b94800407ab45640b59402d2897ae8755b9d8370516e7b0e404bc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:03.447076 containerd[1566]: time="2026-03-02T12:54:03.446827025Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.3\" with image id \"sha256:f8495fa3f644ae70c7e5131c7baf23f80864678694dbf1a6a4d0557528433740\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:c7aefc80042b94800407ab45640b59402d2897ae8755b9d8370516e7b0e404bc\", size \"159483227\" in 13.340563014s" Mar 2 12:54:03.447076 containerd[1566]: time="2026-03-02T12:54:03.446873405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.3\" returns image reference \"sha256:f8495fa3f644ae70c7e5131c7baf23f80864678694dbf1a6a4d0557528433740\"" Mar 2 12:54:03.454482 containerd[1566]: time="2026-03-02T12:54:03.454076542Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 2 12:54:03.467191 containerd[1566]: time="2026-03-02T12:54:03.466548797Z" level=info msg="Container d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:03.473743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2771126607.mount: Deactivated successfully. 
Mar 2 12:54:03.489581 containerd[1566]: time="2026-03-02T12:54:03.489532999Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d\"" Mar 2 12:54:03.490476 containerd[1566]: time="2026-03-02T12:54:03.490200432Z" level=info msg="StartContainer for \"d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d\"" Mar 2 12:54:03.493359 containerd[1566]: time="2026-03-02T12:54:03.493327657Z" level=info msg="connecting to shim d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d" address="unix:///run/containerd/s/fd6c9dcd0200759592f322700131ef14c85d69cf6b3159ed12cdfd9d32fc5f5c" protocol=ttrpc version=3 Mar 2 12:54:03.611768 systemd[1]: Started cri-containerd-d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d.scope - libcontainer container d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d. Mar 2 12:54:03.723127 containerd[1566]: time="2026-03-02T12:54:03.723065992Z" level=info msg="StartContainer for \"d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d\" returns successfully" Mar 2 12:54:03.797382 systemd[1]: cri-containerd-d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d.scope: Deactivated successfully. 
Mar 2 12:54:03.822445 containerd[1566]: time="2026-03-02T12:54:03.822391506Z" level=info msg="received container exit event container_id:\"d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d\" id:\"d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d\" pid:3601 exited_at:{seconds:1772456043 nanos:822006354}" Mar 2 12:54:04.163298 containerd[1566]: time="2026-03-02T12:54:04.163138126Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.3\"" Mar 2 12:54:04.382995 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d59398666c3319886c217a1bc5c55c27aa113ef1170e0393d6f77491f56bf71d-rootfs.mount: Deactivated successfully. Mar 2 12:54:04.895558 kubelet[2906]: E0302 12:54:04.895445 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:06.895285 kubelet[2906]: E0302 12:54:06.895190 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:08.519800 systemd[1]: Started sshd@9-10.243.74.166:22-27.155.172.76:39691.service - OpenSSH per-connection server daemon (27.155.172.76:39691). 
Mar 2 12:54:08.896985 kubelet[2906]: E0302 12:54:08.895948 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:08.978916 containerd[1566]: time="2026-03-02T12:54:08.977716959Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:08.981222 containerd[1566]: time="2026-03-02T12:54:08.981186215Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.3: active requests=0, bytes read=70584418" Mar 2 12:54:08.982266 containerd[1566]: time="2026-03-02T12:54:08.982187048Z" level=info msg="ImageCreate event name:\"sha256:f2520fbaa2761d3cc6c294dcad9c4dc33442ee0c856af33cefd0da5346519691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:08.997749 containerd[1566]: time="2026-03-02T12:54:08.995698578Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:c25deb6a4b79f5e595eb464adf9fb3735ea5623889e249d5b3efa0b42ffcbb47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:08.997749 containerd[1566]: time="2026-03-02T12:54:08.996992529Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.3\" with image id \"sha256:f2520fbaa2761d3cc6c294dcad9c4dc33442ee0c856af33cefd0da5346519691\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:c25deb6a4b79f5e595eb464adf9fb3735ea5623889e249d5b3efa0b42ffcbb47\", size \"72140463\" in 4.833773353s" Mar 2 12:54:08.997749 containerd[1566]: time="2026-03-02T12:54:08.997028217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.3\" returns image reference \"sha256:f2520fbaa2761d3cc6c294dcad9c4dc33442ee0c856af33cefd0da5346519691\"" 
Mar 2 12:54:09.016062 containerd[1566]: time="2026-03-02T12:54:09.012308948Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 2 12:54:09.040477 containerd[1566]: time="2026-03-02T12:54:09.039773143Z" level=info msg="Container 93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:09.046163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2573791086.mount: Deactivated successfully. Mar 2 12:54:09.057798 containerd[1566]: time="2026-03-02T12:54:09.057744469Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563\"" Mar 2 12:54:09.060178 containerd[1566]: time="2026-03-02T12:54:09.060105961Z" level=info msg="StartContainer for \"93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563\"" Mar 2 12:54:09.063640 containerd[1566]: time="2026-03-02T12:54:09.063560733Z" level=info msg="connecting to shim 93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563" address="unix:///run/containerd/s/fd6c9dcd0200759592f322700131ef14c85d69cf6b3159ed12cdfd9d32fc5f5c" protocol=ttrpc version=3 Mar 2 12:54:09.112105 systemd[1]: Started cri-containerd-93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563.scope - libcontainer container 93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563. Mar 2 12:54:09.227922 systemd[1]: Started sshd@10-10.243.74.166:22-101.204.251.230:12447.service - OpenSSH per-connection server daemon (101.204.251.230:12447). 
Mar 2 12:54:09.344976 containerd[1566]: time="2026-03-02T12:54:09.344910936Z" level=info msg="StartContainer for \"93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563\" returns successfully" Mar 2 12:54:09.932160 sshd[3644]: Connection closed by 27.155.172.76 port 39691 Mar 2 12:54:09.936718 systemd[1]: sshd@9-10.243.74.166:22-27.155.172.76:39691.service: Deactivated successfully. Mar 2 12:54:10.508891 systemd[1]: cri-containerd-93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563.scope: Deactivated successfully. Mar 2 12:54:10.509333 systemd[1]: cri-containerd-93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563.scope: Consumed 755ms CPU time, 172.3M memory peak, 5.1M read from disk, 176.9M written to disk. Mar 2 12:54:10.575635 containerd[1566]: time="2026-03-02T12:54:10.575567607Z" level=info msg="received container exit event container_id:\"93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563\" id:\"93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563\" pid:3662 exited_at:{seconds:1772456050 nanos:575230167}" Mar 2 12:54:10.607397 kubelet[2906]: I0302 12:54:10.606902 2906 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 2 12:54:10.642788 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-93519127058cb806cf7a9be2f2d34ae8b3aefacbae48a67eac3666d7bd509563-rootfs.mount: Deactivated successfully. Mar 2 12:54:10.687391 systemd[1]: Created slice kubepods-besteffort-poda2bcb1e0_0c20_4528_b0fb_910750f04440.slice - libcontainer container kubepods-besteffort-poda2bcb1e0_0c20_4528_b0fb_910750f04440.slice. Mar 2 12:54:10.710028 systemd[1]: Created slice kubepods-burstable-pod92e54f41_dc74_4b73_a361_cd79b67e3382.slice - libcontainer container kubepods-burstable-pod92e54f41_dc74_4b73_a361_cd79b67e3382.slice. 
Mar 2 12:54:10.722850 systemd[1]: Created slice kubepods-burstable-pod97350222_5e6a_4fc2_9dd6_1df0b4873374.slice - libcontainer container kubepods-burstable-pod97350222_5e6a_4fc2_9dd6_1df0b4873374.slice. Mar 2 12:54:10.752567 systemd[1]: Created slice kubepods-besteffort-podf421e0e8_37b5_4f11_a6ed_d6be8a3f5f24.slice - libcontainer container kubepods-besteffort-podf421e0e8_37b5_4f11_a6ed_d6be8a3f5f24.slice. Mar 2 12:54:10.756581 kubelet[2906]: I0302 12:54:10.756010 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2bcb1e0-0c20-4528-b0fb-910750f04440-tigera-ca-bundle\") pod \"calico-kube-controllers-6b77d7d8f9-wth9b\" (UID: \"a2bcb1e0-0c20-4528-b0fb-910750f04440\") " pod="calico-system/calico-kube-controllers-6b77d7d8f9-wth9b" Mar 2 12:54:10.756581 kubelet[2906]: I0302 12:54:10.756062 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5flm\" (UniqueName: \"kubernetes.io/projected/97350222-5e6a-4fc2-9dd6-1df0b4873374-kube-api-access-x5flm\") pod \"coredns-674b8bbfcf-vfkbm\" (UID: \"97350222-5e6a-4fc2-9dd6-1df0b4873374\") " pod="kube-system/coredns-674b8bbfcf-vfkbm" Mar 2 12:54:10.756581 kubelet[2906]: I0302 12:54:10.756096 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlx5s\" (UniqueName: \"kubernetes.io/projected/bbd5153d-dba2-4511-a4b0-736593de7fc7-kube-api-access-nlx5s\") pod \"calico-apiserver-7ccfb58f8-hkzqj\" (UID: \"bbd5153d-dba2-4511-a4b0-736593de7fc7\") " pod="calico-system/calico-apiserver-7ccfb58f8-hkzqj" Mar 2 12:54:10.756581 kubelet[2906]: I0302 12:54:10.756128 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97350222-5e6a-4fc2-9dd6-1df0b4873374-config-volume\") pod \"coredns-674b8bbfcf-vfkbm\" (UID: 
\"97350222-5e6a-4fc2-9dd6-1df0b4873374\") " pod="kube-system/coredns-674b8bbfcf-vfkbm" Mar 2 12:54:10.756581 kubelet[2906]: I0302 12:54:10.756155 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6kl\" (UniqueName: \"kubernetes.io/projected/f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24-kube-api-access-jx6kl\") pod \"calico-apiserver-7ccfb58f8-5k2tw\" (UID: \"f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24\") " pod="calico-system/calico-apiserver-7ccfb58f8-5k2tw" Mar 2 12:54:10.756871 kubelet[2906]: I0302 12:54:10.756186 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4s68\" (UniqueName: \"kubernetes.io/projected/a2bcb1e0-0c20-4528-b0fb-910750f04440-kube-api-access-h4s68\") pod \"calico-kube-controllers-6b77d7d8f9-wth9b\" (UID: \"a2bcb1e0-0c20-4528-b0fb-910750f04440\") " pod="calico-system/calico-kube-controllers-6b77d7d8f9-wth9b" Mar 2 12:54:10.756871 kubelet[2906]: I0302 12:54:10.756221 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bbd5153d-dba2-4511-a4b0-736593de7fc7-calico-apiserver-certs\") pod \"calico-apiserver-7ccfb58f8-hkzqj\" (UID: \"bbd5153d-dba2-4511-a4b0-736593de7fc7\") " pod="calico-system/calico-apiserver-7ccfb58f8-hkzqj" Mar 2 12:54:10.756871 kubelet[2906]: I0302 12:54:10.756258 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92e54f41-dc74-4b73-a361-cd79b67e3382-config-volume\") pod \"coredns-674b8bbfcf-nzwgb\" (UID: \"92e54f41-dc74-4b73-a361-cd79b67e3382\") " pod="kube-system/coredns-674b8bbfcf-nzwgb" Mar 2 12:54:10.756871 kubelet[2906]: I0302 12:54:10.756296 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4gj7\" (UniqueName: 
\"kubernetes.io/projected/92e54f41-dc74-4b73-a361-cd79b67e3382-kube-api-access-q4gj7\") pod \"coredns-674b8bbfcf-nzwgb\" (UID: \"92e54f41-dc74-4b73-a361-cd79b67e3382\") " pod="kube-system/coredns-674b8bbfcf-nzwgb" Mar 2 12:54:10.756871 kubelet[2906]: I0302 12:54:10.756375 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24-calico-apiserver-certs\") pod \"calico-apiserver-7ccfb58f8-5k2tw\" (UID: \"f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24\") " pod="calico-system/calico-apiserver-7ccfb58f8-5k2tw" Mar 2 12:54:10.771232 systemd[1]: Created slice kubepods-besteffort-podbbd5153d_dba2_4511_a4b0_736593de7fc7.slice - libcontainer container kubepods-besteffort-podbbd5153d_dba2_4511_a4b0_736593de7fc7.slice. Mar 2 12:54:10.796850 systemd[1]: Created slice kubepods-besteffort-podd0fb2ace_0bce_4453_8e74_079c4a488b9c.slice - libcontainer container kubepods-besteffort-podd0fb2ace_0bce_4453_8e74_079c4a488b9c.slice. Mar 2 12:54:10.814786 systemd[1]: Created slice kubepods-besteffort-podfb2f2c77_e1c5_400d_a6e1_f2a0ef3b0ba3.slice - libcontainer container kubepods-besteffort-podfb2f2c77_e1c5_400d_a6e1_f2a0ef3b0ba3.slice. 
Mar 2 12:54:10.858271 kubelet[2906]: I0302 12:54:10.857283 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/d0fb2ace-0bce-4453-8e74-079c4a488b9c-goldmane-key-pair\") pod \"goldmane-9566f57b5-h9nvt\" (UID: \"d0fb2ace-0bce-4453-8e74-079c4a488b9c\") " pod="calico-system/goldmane-9566f57b5-h9nvt" Mar 2 12:54:10.858271 kubelet[2906]: I0302 12:54:10.857372 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt7s\" (UniqueName: \"kubernetes.io/projected/d0fb2ace-0bce-4453-8e74-079c4a488b9c-kube-api-access-kvt7s\") pod \"goldmane-9566f57b5-h9nvt\" (UID: \"d0fb2ace-0bce-4453-8e74-079c4a488b9c\") " pod="calico-system/goldmane-9566f57b5-h9nvt" Mar 2 12:54:10.858271 kubelet[2906]: I0302 12:54:10.857404 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-backend-key-pair\") pod \"whisker-7654b5d44c-kls9b\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " pod="calico-system/whisker-7654b5d44c-kls9b" Mar 2 12:54:10.859245 kubelet[2906]: I0302 12:54:10.857442 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-ca-bundle\") pod \"whisker-7654b5d44c-kls9b\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " pod="calico-system/whisker-7654b5d44c-kls9b" Mar 2 12:54:10.859245 kubelet[2906]: I0302 12:54:10.859225 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7pg\" (UniqueName: \"kubernetes.io/projected/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-kube-api-access-cv7pg\") pod \"whisker-7654b5d44c-kls9b\" (UID: 
\"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " pod="calico-system/whisker-7654b5d44c-kls9b" Mar 2 12:54:10.859354 kubelet[2906]: I0302 12:54:10.859304 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0fb2ace-0bce-4453-8e74-079c4a488b9c-goldmane-ca-bundle\") pod \"goldmane-9566f57b5-h9nvt\" (UID: \"d0fb2ace-0bce-4453-8e74-079c4a488b9c\") " pod="calico-system/goldmane-9566f57b5-h9nvt" Mar 2 12:54:10.859354 kubelet[2906]: I0302 12:54:10.859335 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-nginx-config\") pod \"whisker-7654b5d44c-kls9b\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " pod="calico-system/whisker-7654b5d44c-kls9b" Mar 2 12:54:10.859444 kubelet[2906]: I0302 12:54:10.859420 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fb2ace-0bce-4453-8e74-079c4a488b9c-config\") pod \"goldmane-9566f57b5-h9nvt\" (UID: \"d0fb2ace-0bce-4453-8e74-079c4a488b9c\") " pod="calico-system/goldmane-9566f57b5-h9nvt" Mar 2 12:54:10.959536 systemd[1]: Created slice kubepods-besteffort-podabbdfe3c_56e5_4932_b650_1489a1c6d2bc.slice - libcontainer container kubepods-besteffort-podabbdfe3c_56e5_4932_b650_1489a1c6d2bc.slice. 
Mar 2 12:54:11.000333 containerd[1566]: time="2026-03-02T12:54:10.999823469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-56k4g,Uid:abbdfe3c-56e5-4932-b650-1489a1c6d2bc,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:11.004218 containerd[1566]: time="2026-03-02T12:54:11.004152175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b77d7d8f9-wth9b,Uid:a2bcb1e0-0c20-4528-b0fb-910750f04440,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:11.031284 containerd[1566]: time="2026-03-02T12:54:11.030407811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nzwgb,Uid:92e54f41-dc74-4b73-a361-cd79b67e3382,Namespace:kube-system,Attempt:0,}" Mar 2 12:54:11.055060 containerd[1566]: time="2026-03-02T12:54:11.054927111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfkbm,Uid:97350222-5e6a-4fc2-9dd6-1df0b4873374,Namespace:kube-system,Attempt:0,}" Mar 2 12:54:11.064633 containerd[1566]: time="2026-03-02T12:54:11.064501229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-5k2tw,Uid:f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:11.091423 containerd[1566]: time="2026-03-02T12:54:11.089928566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-hkzqj,Uid:bbd5153d-dba2-4511-a4b0-736593de7fc7,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:11.114823 containerd[1566]: time="2026-03-02T12:54:11.114759937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-h9nvt,Uid:d0fb2ace-0bce-4453-8e74-079c4a488b9c,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:11.124652 containerd[1566]: time="2026-03-02T12:54:11.124423667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7654b5d44c-kls9b,Uid:fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:11.288223 containerd[1566]: 
time="2026-03-02T12:54:11.287655597Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 2 12:54:11.347511 containerd[1566]: time="2026-03-02T12:54:11.347429540Z" level=info msg="Container 0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:11.413934 containerd[1566]: time="2026-03-02T12:54:11.413874025Z" level=info msg="CreateContainer within sandbox \"313b8666c3a695064382ab578af4ea70c839ba6c0aca4b6db799f673170764f9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae\"" Mar 2 12:54:11.419111 containerd[1566]: time="2026-03-02T12:54:11.419059279Z" level=info msg="StartContainer for \"0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae\"" Mar 2 12:54:11.433656 containerd[1566]: time="2026-03-02T12:54:11.433559142Z" level=info msg="connecting to shim 0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae" address="unix:///run/containerd/s/fd6c9dcd0200759592f322700131ef14c85d69cf6b3159ed12cdfd9d32fc5f5c" protocol=ttrpc version=3 Mar 2 12:54:11.530855 systemd[1]: Started cri-containerd-0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae.scope - libcontainer container 0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae. 
Mar 2 12:54:11.612604 containerd[1566]: time="2026-03-02T12:54:11.612375718Z" level=error msg="Failed to destroy network for sandbox \"ad87154f945fd0a6bce1971b7a20f6b33a9a92aeda9d62a0671eb5e1da065aa6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.619050 containerd[1566]: time="2026-03-02T12:54:11.617558053Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b77d7d8f9-wth9b,Uid:a2bcb1e0-0c20-4528-b0fb-910750f04440,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad87154f945fd0a6bce1971b7a20f6b33a9a92aeda9d62a0671eb5e1da065aa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.624912 kubelet[2906]: E0302 12:54:11.619720 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad87154f945fd0a6bce1971b7a20f6b33a9a92aeda9d62a0671eb5e1da065aa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.624912 kubelet[2906]: E0302 12:54:11.620679 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad87154f945fd0a6bce1971b7a20f6b33a9a92aeda9d62a0671eb5e1da065aa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b77d7d8f9-wth9b" Mar 2 12:54:11.624912 kubelet[2906]: E0302 12:54:11.620735 2906 
kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad87154f945fd0a6bce1971b7a20f6b33a9a92aeda9d62a0671eb5e1da065aa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b77d7d8f9-wth9b" Mar 2 12:54:11.627727 kubelet[2906]: E0302 12:54:11.620851 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6b77d7d8f9-wth9b_calico-system(a2bcb1e0-0c20-4528-b0fb-910750f04440)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6b77d7d8f9-wth9b_calico-system(a2bcb1e0-0c20-4528-b0fb-910750f04440)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad87154f945fd0a6bce1971b7a20f6b33a9a92aeda9d62a0671eb5e1da065aa6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6b77d7d8f9-wth9b" podUID="a2bcb1e0-0c20-4528-b0fb-910750f04440" Mar 2 12:54:11.677486 containerd[1566]: time="2026-03-02T12:54:11.675837166Z" level=error msg="Failed to destroy network for sandbox \"970419ef065e640f30c1bd1f629dffe5e2d2ab71562f116afe117ace71284093\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.685138 systemd[1]: run-netns-cni\x2daa629586\x2d264c\x2d8805\x2d4334\x2dd6fd8e6375be.mount: Deactivated successfully. 
Mar 2 12:54:11.701233 containerd[1566]: time="2026-03-02T12:54:11.700621286Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-5k2tw,Uid:f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"970419ef065e640f30c1bd1f629dffe5e2d2ab71562f116afe117ace71284093\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.703850 kubelet[2906]: E0302 12:54:11.702700 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"970419ef065e640f30c1bd1f629dffe5e2d2ab71562f116afe117ace71284093\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.703850 kubelet[2906]: E0302 12:54:11.702793 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"970419ef065e640f30c1bd1f629dffe5e2d2ab71562f116afe117ace71284093\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7ccfb58f8-5k2tw" Mar 2 12:54:11.703850 kubelet[2906]: E0302 12:54:11.702829 2906 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"970419ef065e640f30c1bd1f629dffe5e2d2ab71562f116afe117ace71284093\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-7ccfb58f8-5k2tw" Mar 2 12:54:11.704097 kubelet[2906]: E0302 12:54:11.702915 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7ccfb58f8-5k2tw_calico-system(f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7ccfb58f8-5k2tw_calico-system(f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"970419ef065e640f30c1bd1f629dffe5e2d2ab71562f116afe117ace71284093\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7ccfb58f8-5k2tw" podUID="f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24" Mar 2 12:54:11.747745 containerd[1566]: time="2026-03-02T12:54:11.745477225Z" level=error msg="Failed to destroy network for sandbox \"9edad8943f25b3d61686755d0e160f11b2375e0aa16fb0055f0d6c737f825a08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.752037 systemd[1]: run-netns-cni\x2de80cc7c3\x2dba21\x2d643f\x2dd066\x2d23e1104573f0.mount: Deactivated successfully. 
Mar 2 12:54:11.784977 containerd[1566]: time="2026-03-02T12:54:11.784790523Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfkbm,Uid:97350222-5e6a-4fc2-9dd6-1df0b4873374,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9edad8943f25b3d61686755d0e160f11b2375e0aa16fb0055f0d6c737f825a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.786471 kubelet[2906]: E0302 12:54:11.785469 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9edad8943f25b3d61686755d0e160f11b2375e0aa16fb0055f0d6c737f825a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.786471 kubelet[2906]: E0302 12:54:11.785588 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9edad8943f25b3d61686755d0e160f11b2375e0aa16fb0055f0d6c737f825a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vfkbm" Mar 2 12:54:11.786471 kubelet[2906]: E0302 12:54:11.785638 2906 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9edad8943f25b3d61686755d0e160f11b2375e0aa16fb0055f0d6c737f825a08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-vfkbm" 
Mar 2 12:54:11.786656 kubelet[2906]: E0302 12:54:11.785716 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-vfkbm_kube-system(97350222-5e6a-4fc2-9dd6-1df0b4873374)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-vfkbm_kube-system(97350222-5e6a-4fc2-9dd6-1df0b4873374)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9edad8943f25b3d61686755d0e160f11b2375e0aa16fb0055f0d6c737f825a08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-vfkbm" podUID="97350222-5e6a-4fc2-9dd6-1df0b4873374" Mar 2 12:54:11.788673 containerd[1566]: time="2026-03-02T12:54:11.788600971Z" level=error msg="Failed to destroy network for sandbox \"0cd0b9a115bda1153977ff28124d911233eec127965a518ee6375fbb9440e136\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.796613 containerd[1566]: time="2026-03-02T12:54:11.791942528Z" level=error msg="Failed to destroy network for sandbox \"d314f167c93f7ef82d1c12849c9dcd7387c878c96789ccbb04fd84e533458ba6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.794905 systemd[1]: run-netns-cni\x2d3a3ab7b2\x2ddd52\x2dadc4\x2d54c5\x2d6025f35f1d50.mount: Deactivated successfully. Mar 2 12:54:11.807156 systemd[1]: run-netns-cni\x2d09a2e89d\x2d17ec\x2de366\x2df8b4\x2d068e6e198fff.mount: Deactivated successfully. 
Mar 2 12:54:11.813735 containerd[1566]: time="2026-03-02T12:54:11.813652185Z" level=error msg="Failed to destroy network for sandbox \"f30213512a4e4ce51eea19c581e3489aebdca793bb692d7f95b1308f6587656e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.816704 containerd[1566]: time="2026-03-02T12:54:11.816646861Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7654b5d44c-kls9b,Uid:fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d314f167c93f7ef82d1c12849c9dcd7387c878c96789ccbb04fd84e533458ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.817359 kubelet[2906]: E0302 12:54:11.817283 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d314f167c93f7ef82d1c12849c9dcd7387c878c96789ccbb04fd84e533458ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.819893 kubelet[2906]: E0302 12:54:11.817571 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d314f167c93f7ef82d1c12849c9dcd7387c878c96789ccbb04fd84e533458ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7654b5d44c-kls9b" Mar 2 12:54:11.819893 kubelet[2906]: E0302 12:54:11.817656 2906 kuberuntime_manager.go:1252] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d314f167c93f7ef82d1c12849c9dcd7387c878c96789ccbb04fd84e533458ba6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7654b5d44c-kls9b" Mar 2 12:54:11.819893 kubelet[2906]: E0302 12:54:11.817804 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7654b5d44c-kls9b_calico-system(fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7654b5d44c-kls9b_calico-system(fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d314f167c93f7ef82d1c12849c9dcd7387c878c96789ccbb04fd84e533458ba6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7654b5d44c-kls9b" podUID="fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3" Mar 2 12:54:11.826361 containerd[1566]: time="2026-03-02T12:54:11.826196028Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-h9nvt,Uid:d0fb2ace-0bce-4453-8e74-079c4a488b9c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd0b9a115bda1153977ff28124d911233eec127965a518ee6375fbb9440e136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.830180 kubelet[2906]: E0302 12:54:11.827905 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0cd0b9a115bda1153977ff28124d911233eec127965a518ee6375fbb9440e136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.830180 kubelet[2906]: E0302 12:54:11.827988 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd0b9a115bda1153977ff28124d911233eec127965a518ee6375fbb9440e136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9566f57b5-h9nvt" Mar 2 12:54:11.830180 kubelet[2906]: E0302 12:54:11.828021 2906 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0cd0b9a115bda1153977ff28124d911233eec127965a518ee6375fbb9440e136\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9566f57b5-h9nvt" Mar 2 12:54:11.830468 kubelet[2906]: E0302 12:54:11.828099 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9566f57b5-h9nvt_calico-system(d0fb2ace-0bce-4453-8e74-079c4a488b9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9566f57b5-h9nvt_calico-system(d0fb2ace-0bce-4453-8e74-079c4a488b9c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0cd0b9a115bda1153977ff28124d911233eec127965a518ee6375fbb9440e136\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9566f57b5-h9nvt" 
podUID="d0fb2ace-0bce-4453-8e74-079c4a488b9c" Mar 2 12:54:11.833572 containerd[1566]: time="2026-03-02T12:54:11.832868414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-hkzqj,Uid:bbd5153d-dba2-4511-a4b0-736593de7fc7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f30213512a4e4ce51eea19c581e3489aebdca793bb692d7f95b1308f6587656e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.834257 kubelet[2906]: E0302 12:54:11.834160 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f30213512a4e4ce51eea19c581e3489aebdca793bb692d7f95b1308f6587656e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.835747 kubelet[2906]: E0302 12:54:11.835547 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f30213512a4e4ce51eea19c581e3489aebdca793bb692d7f95b1308f6587656e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7ccfb58f8-hkzqj" Mar 2 12:54:11.835747 kubelet[2906]: E0302 12:54:11.835594 2906 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f30213512a4e4ce51eea19c581e3489aebdca793bb692d7f95b1308f6587656e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7ccfb58f8-hkzqj" Mar 2 12:54:11.835747 kubelet[2906]: E0302 12:54:11.835675 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7ccfb58f8-hkzqj_calico-system(bbd5153d-dba2-4511-a4b0-736593de7fc7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7ccfb58f8-hkzqj_calico-system(bbd5153d-dba2-4511-a4b0-736593de7fc7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f30213512a4e4ce51eea19c581e3489aebdca793bb692d7f95b1308f6587656e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7ccfb58f8-hkzqj" podUID="bbd5153d-dba2-4511-a4b0-736593de7fc7" Mar 2 12:54:11.850182 containerd[1566]: time="2026-03-02T12:54:11.850110108Z" level=error msg="Failed to destroy network for sandbox \"4226c8e690df24e7f0a71074b593ad18b599bb967505184ce055c40a547eddcc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.855144 containerd[1566]: time="2026-03-02T12:54:11.855055848Z" level=error msg="Failed to destroy network for sandbox \"34ff96390cfd7ba308abac6057e26e3e7897eaf45a55e76d6685bb23bca749e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.873564 containerd[1566]: time="2026-03-02T12:54:11.872407026Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-56k4g,Uid:abbdfe3c-56e5-4932-b650-1489a1c6d2bc,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"4226c8e690df24e7f0a71074b593ad18b599bb967505184ce055c40a547eddcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.873564 containerd[1566]: time="2026-03-02T12:54:11.874161206Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nzwgb,Uid:92e54f41-dc74-4b73-a361-cd79b67e3382,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ff96390cfd7ba308abac6057e26e3e7897eaf45a55e76d6685bb23bca749e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.875765 kubelet[2906]: E0302 12:54:11.875215 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ff96390cfd7ba308abac6057e26e3e7897eaf45a55e76d6685bb23bca749e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.875765 kubelet[2906]: E0302 12:54:11.875299 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34ff96390cfd7ba308abac6057e26e3e7897eaf45a55e76d6685bb23bca749e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-nzwgb" Mar 2 12:54:11.875765 kubelet[2906]: E0302 12:54:11.875339 2906 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"34ff96390cfd7ba308abac6057e26e3e7897eaf45a55e76d6685bb23bca749e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-nzwgb" Mar 2 12:54:11.877122 kubelet[2906]: E0302 12:54:11.875408 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-nzwgb_kube-system(92e54f41-dc74-4b73-a361-cd79b67e3382)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-nzwgb_kube-system(92e54f41-dc74-4b73-a361-cd79b67e3382)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"34ff96390cfd7ba308abac6057e26e3e7897eaf45a55e76d6685bb23bca749e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-nzwgb" podUID="92e54f41-dc74-4b73-a361-cd79b67e3382" Mar 2 12:54:11.877122 kubelet[2906]: E0302 12:54:11.876839 2906 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4226c8e690df24e7f0a71074b593ad18b599bb967505184ce055c40a547eddcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 2 12:54:11.877122 kubelet[2906]: E0302 12:54:11.877002 2906 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4226c8e690df24e7f0a71074b593ad18b599bb967505184ce055c40a547eddcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-56k4g" Mar 2 12:54:11.878405 kubelet[2906]: E0302 12:54:11.877037 2906 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4226c8e690df24e7f0a71074b593ad18b599bb967505184ce055c40a547eddcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-56k4g" Mar 2 12:54:11.878405 kubelet[2906]: E0302 12:54:11.877302 2906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-56k4g_calico-system(abbdfe3c-56e5-4932-b650-1489a1c6d2bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-56k4g_calico-system(abbdfe3c-56e5-4932-b650-1489a1c6d2bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4226c8e690df24e7f0a71074b593ad18b599bb967505184ce055c40a547eddcc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-56k4g" podUID="abbdfe3c-56e5-4932-b650-1489a1c6d2bc" Mar 2 12:54:11.886343 containerd[1566]: time="2026-03-02T12:54:11.886277126Z" level=info msg="StartContainer for \"0fc58da3e305088a551ef0df1ad670f2803c03d36857602537e9571e4e1735ae\" returns successfully" Mar 2 12:54:12.297933 kubelet[2906]: I0302 12:54:12.297760 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jzm77" podStartSLOduration=4.457884141 podStartE2EDuration="30.296417855s" podCreationTimestamp="2026-03-02 12:53:42 +0000 UTC" firstStartedPulling="2026-03-02 12:53:43.161298296 +0000 UTC m=+23.585105331" lastFinishedPulling="2026-03-02 12:54:08.999832018 +0000 UTC m=+49.423639045" 
observedRunningTime="2026-03-02 12:54:12.295008322 +0000 UTC m=+52.718815368" watchObservedRunningTime="2026-03-02 12:54:12.296417855 +0000 UTC m=+52.720224897" Mar 2 12:54:12.642405 systemd[1]: run-netns-cni\x2d4fde499a\x2dfc58\x2dbbfb\x2ddc75\x2d06ac32b1c714.mount: Deactivated successfully. Mar 2 12:54:12.644825 systemd[1]: run-netns-cni\x2dd679b342\x2dccd8\x2da66c\x2df010\x2d341bf5573c93.mount: Deactivated successfully. Mar 2 12:54:12.644932 systemd[1]: run-netns-cni\x2ddab0f7da\x2d4d2a\x2d227b\x2d44fc\x2d038b88334011.mount: Deactivated successfully. Mar 2 12:54:12.679343 kubelet[2906]: I0302 12:54:12.679188 2906 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-ca-bundle\") pod \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " Mar 2 12:54:12.682109 kubelet[2906]: I0302 12:54:12.681526 2906 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7pg\" (UniqueName: \"kubernetes.io/projected/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-kube-api-access-cv7pg\") pod \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " Mar 2 12:54:12.683023 kubelet[2906]: I0302 12:54:12.682692 2906 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-nginx-config\") pod \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " Mar 2 12:54:12.683257 kubelet[2906]: I0302 12:54:12.683233 2906 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-backend-key-pair\") pod \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\" (UID: \"fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3\") " Mar 2 
12:54:12.685426 kubelet[2906]: I0302 12:54:12.682607 2906 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3" (UID: "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 2 12:54:12.685826 kubelet[2906]: I0302 12:54:12.683757 2906 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3" (UID: "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 2 12:54:12.695933 systemd[1]: var-lib-kubelet-pods-fb2f2c77\x2de1c5\x2d400d\x2da6e1\x2df2a0ef3b0ba3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcv7pg.mount: Deactivated successfully. Mar 2 12:54:12.696085 systemd[1]: var-lib-kubelet-pods-fb2f2c77\x2de1c5\x2d400d\x2da6e1\x2df2a0ef3b0ba3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 2 12:54:12.699374 kubelet[2906]: I0302 12:54:12.699312 2906 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3" (UID: "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 2 12:54:12.699701 kubelet[2906]: I0302 12:54:12.699673 2906 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-kube-api-access-cv7pg" (OuterVolumeSpecName: "kube-api-access-cv7pg") pod "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3" (UID: "fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3"). InnerVolumeSpecName "kube-api-access-cv7pg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 2 12:54:12.785781 kubelet[2906]: I0302 12:54:12.785654 2906 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-ca-bundle\") on node \"srv-zvfam.gb1.brightbox.com\" DevicePath \"\"" Mar 2 12:54:12.785781 kubelet[2906]: I0302 12:54:12.785708 2906 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cv7pg\" (UniqueName: \"kubernetes.io/projected/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-kube-api-access-cv7pg\") on node \"srv-zvfam.gb1.brightbox.com\" DevicePath \"\"" Mar 2 12:54:12.785781 kubelet[2906]: I0302 12:54:12.785730 2906 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-nginx-config\") on node \"srv-zvfam.gb1.brightbox.com\" DevicePath \"\"" Mar 2 12:54:12.785781 kubelet[2906]: I0302 12:54:12.785747 2906 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3-whisker-backend-key-pair\") on node \"srv-zvfam.gb1.brightbox.com\" DevicePath \"\"" Mar 2 12:54:13.279767 systemd[1]: Removed slice kubepods-besteffort-podfb2f2c77_e1c5_400d_a6e1_f2a0ef3b0ba3.slice - libcontainer container kubepods-besteffort-podfb2f2c77_e1c5_400d_a6e1_f2a0ef3b0ba3.slice. 
Mar 2 12:54:13.469867 systemd[1]: Created slice kubepods-besteffort-pod909a5135_5501_4138_bc10_f5b5a4358347.slice - libcontainer container kubepods-besteffort-pod909a5135_5501_4138_bc10_f5b5a4358347.slice.
Mar 2 12:54:13.491861 kubelet[2906]: I0302 12:54:13.491748 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909a5135-5501-4138-bc10-f5b5a4358347-whisker-ca-bundle\") pod \"whisker-79bd9b64d9-ncst9\" (UID: \"909a5135-5501-4138-bc10-f5b5a4358347\") " pod="calico-system/whisker-79bd9b64d9-ncst9"
Mar 2 12:54:13.491861 kubelet[2906]: I0302 12:54:13.491879 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/909a5135-5501-4138-bc10-f5b5a4358347-nginx-config\") pod \"whisker-79bd9b64d9-ncst9\" (UID: \"909a5135-5501-4138-bc10-f5b5a4358347\") " pod="calico-system/whisker-79bd9b64d9-ncst9"
Mar 2 12:54:13.492164 kubelet[2906]: I0302 12:54:13.491918 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/909a5135-5501-4138-bc10-f5b5a4358347-whisker-backend-key-pair\") pod \"whisker-79bd9b64d9-ncst9\" (UID: \"909a5135-5501-4138-bc10-f5b5a4358347\") " pod="calico-system/whisker-79bd9b64d9-ncst9"
Mar 2 12:54:13.492164 kubelet[2906]: I0302 12:54:13.491960 2906 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkgp\" (UniqueName: \"kubernetes.io/projected/909a5135-5501-4138-bc10-f5b5a4358347-kube-api-access-fbkgp\") pod \"whisker-79bd9b64d9-ncst9\" (UID: \"909a5135-5501-4138-bc10-f5b5a4358347\") " pod="calico-system/whisker-79bd9b64d9-ncst9"
Mar 2 12:54:13.779437 containerd[1566]: time="2026-03-02T12:54:13.779351644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79bd9b64d9-ncst9,Uid:909a5135-5501-4138-bc10-f5b5a4358347,Namespace:calico-system,Attempt:0,}"
Mar 2 12:54:13.903500 kubelet[2906]: I0302 12:54:13.901626 2906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3" path="/var/lib/kubelet/pods/fb2f2c77-e1c5-400d-a6e1-f2a0ef3b0ba3/volumes"
Mar 2 12:54:14.018716 systemd-networkd[1501]: cali6b3826aee45: Link UP
Mar 2 12:54:14.019630 systemd-networkd[1501]: cali6b3826aee45: Gained carrier
Mar 2 12:54:14.056774 containerd[1566]: 2026-03-02 12:54:13.822 [ERROR][4036] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu"
Mar 2 12:54:14.056774 containerd[1566]: 2026-03-02 12:54:13.865 [INFO][4036] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0 whisker-79bd9b64d9- calico-system 909a5135-5501-4138-bc10-f5b5a4358347 925 0 2026-03-02 12:54:13 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:79bd9b64d9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com whisker-79bd9b64d9-ncst9 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali6b3826aee45 [] [] }} ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-"
Mar 2 12:54:14.056774 containerd[1566]: 2026-03-02 12:54:13.866 [INFO][4036] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.056774 containerd[1566]: 2026-03-02 12:54:13.931 [INFO][4048] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" HandleID="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Workload="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.945 [INFO][4048] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" HandleID="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Workload="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000380d70), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"whisker-79bd9b64d9-ncst9", "timestamp":"2026-03-02 12:54:13.931835693 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188dc0)}
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.945 [INFO][4048] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.945 [INFO][4048] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.945 [INFO][4048] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com'
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.949 [INFO][4048] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.956 [INFO][4048] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.964 [INFO][4048] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.967 [INFO][4048] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.058689 containerd[1566]: 2026-03-02 12:54:13.970 [INFO][4048] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.970 [INFO][4048] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.972 [INFO][4048] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.978 [INFO][4048] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.984 [INFO][4048] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.1/26] block=192.168.69.0/26 handle="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.984 [INFO][4048] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.1/26] handle="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.984 [INFO][4048] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 2 12:54:14.059054 containerd[1566]: 2026-03-02 12:54:13.984 [INFO][4048] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.1/26] IPv6=[] ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" HandleID="k8s-pod-network.97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Workload="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.059653 containerd[1566]: 2026-03-02 12:54:13.991 [INFO][4036] cni-plugin/k8s.go 418: Populated endpoint ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0", GenerateName:"whisker-79bd9b64d9-", Namespace:"calico-system", SelfLink:"", UID:"909a5135-5501-4138-bc10-f5b5a4358347", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 54, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79bd9b64d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"whisker-79bd9b64d9-ncst9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.69.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6b3826aee45", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:54:14.059653 containerd[1566]: 2026-03-02 12:54:13.991 [INFO][4036] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.1/32] ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.060038 containerd[1566]: 2026-03-02 12:54:13.991 [INFO][4036] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6b3826aee45 ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.060038 containerd[1566]: 2026-03-02 12:54:14.021 [INFO][4036] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.062521 containerd[1566]: 2026-03-02 12:54:14.021 [INFO][4036] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0", GenerateName:"whisker-79bd9b64d9-", Namespace:"calico-system", SelfLink:"", UID:"909a5135-5501-4138-bc10-f5b5a4358347", ResourceVersion:"925", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 54, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"79bd9b64d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda", Pod:"whisker-79bd9b64d9-ncst9", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.69.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali6b3826aee45", MAC:"3a:9f:76:fe:3a:8a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:54:14.062756 containerd[1566]: 2026-03-02 12:54:14.045 [INFO][4036] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" Namespace="calico-system" Pod="whisker-79bd9b64d9-ncst9" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-whisker--79bd9b64d9--ncst9-eth0"
Mar 2 12:54:14.199740 containerd[1566]: time="2026-03-02T12:54:14.199652259Z" level=info msg="connecting to shim 97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda" address="unix:///run/containerd/s/460bd06451e78cddb9fd18af207cae93a167bc4fd759c2ca05c3db100538b343" namespace=k8s.io protocol=ttrpc version=3
Mar 2 12:54:14.270023 systemd[1]: Started cri-containerd-97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda.scope - libcontainer container 97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda.
Mar 2 12:54:14.463150 containerd[1566]: time="2026-03-02T12:54:14.462375390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79bd9b64d9-ncst9,Uid:909a5135-5501-4138-bc10-f5b5a4358347,Namespace:calico-system,Attempt:0,} returns sandbox id \"97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda\""
Mar 2 12:54:14.471127 containerd[1566]: time="2026-03-02T12:54:14.471073075Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.3\""
Mar 2 12:54:15.881853 systemd-networkd[1501]: cali6b3826aee45: Gained IPv6LL
Mar 2 12:54:16.111182 systemd-networkd[1501]: vxlan.calico: Link UP
Mar 2 12:54:16.111197 systemd-networkd[1501]: vxlan.calico: Gained carrier
Mar 2 12:54:17.248935 containerd[1566]: time="2026-03-02T12:54:17.219214674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.3: active requests=0, bytes read=6036825"
Mar 2 12:54:17.249741 containerd[1566]: time="2026-03-02T12:54:17.226657303Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:17.252108 containerd[1566]: time="2026-03-02T12:54:17.251140103Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.3\" with image id \"sha256:a4bcedf3b244f5fd0077952f436fd9486e0e6b974a358c85a962b60303e94c02\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:3a388b567fff5cc31c64399d4af0fd03d2f4d243ef26e6f6b77a49386dbadeca\", size \"7592862\" in 2.7697549s"
Mar 2 12:54:17.252108 containerd[1566]: time="2026-03-02T12:54:17.251201672Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.3\" returns image reference \"sha256:a4bcedf3b244f5fd0077952f436fd9486e0e6b974a358c85a962b60303e94c02\""
Mar 2 12:54:17.267180 containerd[1566]: time="2026-03-02T12:54:17.267109382Z" level=info msg="ImageCreate event name:\"sha256:a4bcedf3b244f5fd0077952f436fd9486e0e6b974a358c85a962b60303e94c02\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:17.268280 containerd[1566]: time="2026-03-02T12:54:17.268248610Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:3a388b567fff5cc31c64399d4af0fd03d2f4d243ef26e6f6b77a49386dbadeca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:17.317572 containerd[1566]: time="2026-03-02T12:54:17.317508390Z" level=info msg="CreateContainer within sandbox \"97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Mar 2 12:54:17.352255 containerd[1566]: time="2026-03-02T12:54:17.351710368Z" level=info msg="Container e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:54:17.355314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1328976077.mount: Deactivated successfully.
Mar 2 12:54:17.406514 containerd[1566]: time="2026-03-02T12:54:17.405714980Z" level=info msg="CreateContainer within sandbox \"97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209\""
Mar 2 12:54:17.409566 containerd[1566]: time="2026-03-02T12:54:17.409497259Z" level=info msg="StartContainer for \"e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209\""
Mar 2 12:54:17.415087 containerd[1566]: time="2026-03-02T12:54:17.415050269Z" level=info msg="connecting to shim e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209" address="unix:///run/containerd/s/460bd06451e78cddb9fd18af207cae93a167bc4fd759c2ca05c3db100538b343" protocol=ttrpc version=3
Mar 2 12:54:17.634686 systemd[1]: Started cri-containerd-e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209.scope - libcontainer container e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209.
Mar 2 12:54:17.725199 systemd-networkd[1501]: vxlan.calico: Gained IPv6LL
Mar 2 12:54:17.827744 containerd[1566]: time="2026-03-02T12:54:17.827685119Z" level=info msg="StartContainer for \"e793e3937fbb213460c0a08ec009fdfd34260a850a1c25ee76929ab408fc2209\" returns successfully"
Mar 2 12:54:17.838316 containerd[1566]: time="2026-03-02T12:54:17.838243650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\""
Mar 2 12:54:20.441819 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4261538100.mount: Deactivated successfully.
Mar 2 12:54:20.465474 containerd[1566]: time="2026-03-02T12:54:20.465385464Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:20.466730 containerd[1566]: time="2026-03-02T12:54:20.466556171Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.3: active requests=0, bytes read=17599119"
Mar 2 12:54:20.467505 containerd[1566]: time="2026-03-02T12:54:20.467446442Z" level=info msg="ImageCreate event name:\"sha256:fd911f8f9ea58b19b827b1f51a4c19e899291759aca4ed03c388788897668b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:20.470429 containerd[1566]: time="2026-03-02T12:54:20.470395042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:359cb5c751e049ac0bb62c4f7e49b1ac81c59935c70715f5ff4c39a757bf9f38\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:20.473882 containerd[1566]: time="2026-03-02T12:54:20.473645016Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\" with image id \"sha256:fd911f8f9ea58b19b827b1f51a4c19e899291759aca4ed03c388788897668b8f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:359cb5c751e049ac0bb62c4f7e49b1ac81c59935c70715f5ff4c39a757bf9f38\", size \"17598949\" in 2.63530986s"
Mar 2 12:54:20.473882 containerd[1566]: time="2026-03-02T12:54:20.473691948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.3\" returns image reference \"sha256:fd911f8f9ea58b19b827b1f51a4c19e899291759aca4ed03c388788897668b8f\""
Mar 2 12:54:20.482431 containerd[1566]: time="2026-03-02T12:54:20.481684916Z" level=info msg="CreateContainer within sandbox \"97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Mar 2 12:54:20.496872 containerd[1566]: time="2026-03-02T12:54:20.496804265Z" level=info msg="Container e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:54:20.503862 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3927499990.mount: Deactivated successfully.
Mar 2 12:54:20.511591 containerd[1566]: time="2026-03-02T12:54:20.511496544Z" level=info msg="CreateContainer within sandbox \"97c84252ab0c36accfb468874689aeabeee8bfb086bf9cf77967c4bd3b361cda\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282\""
Mar 2 12:54:20.512756 containerd[1566]: time="2026-03-02T12:54:20.512725729Z" level=info msg="StartContainer for \"e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282\""
Mar 2 12:54:20.514965 containerd[1566]: time="2026-03-02T12:54:20.514922352Z" level=info msg="connecting to shim e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282" address="unix:///run/containerd/s/460bd06451e78cddb9fd18af207cae93a167bc4fd759c2ca05c3db100538b343" protocol=ttrpc version=3
Mar 2 12:54:20.561733 systemd[1]: Started cri-containerd-e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282.scope - libcontainer container e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282.
Mar 2 12:54:20.645818 containerd[1566]: time="2026-03-02T12:54:20.645670091Z" level=info msg="StartContainer for \"e082f84cb945b2e2e90649bf8d3812d16760684bd2d18e48edebb87595824282\" returns successfully"
Mar 2 12:54:21.487975 kubelet[2906]: I0302 12:54:21.485719 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-79bd9b64d9-ncst9" podStartSLOduration=2.471397462 podStartE2EDuration="8.482327847s" podCreationTimestamp="2026-03-02 12:54:13 +0000 UTC" firstStartedPulling="2026-03-02 12:54:14.466178059 +0000 UTC m=+54.889985085" lastFinishedPulling="2026-03-02 12:54:20.477108431 +0000 UTC m=+60.900915470" observedRunningTime="2026-03-02 12:54:21.479378432 +0000 UTC m=+61.903185508" watchObservedRunningTime="2026-03-02 12:54:21.482327847 +0000 UTC m=+61.906134885"
Mar 2 12:54:23.898050 containerd[1566]: time="2026-03-02T12:54:23.897693892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-5k2tw,Uid:f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24,Namespace:calico-system,Attempt:0,}"
Mar 2 12:54:24.182647 systemd-networkd[1501]: cali1f3ee9c9d23: Link UP
Mar 2 12:54:24.182948 systemd-networkd[1501]: cali1f3ee9c9d23: Gained carrier
Mar 2 12:54:24.204139 sshd[3672]: Connection closed by 101.204.251.230 port 12447 [preauth]
Mar 2 12:54:24.206874 systemd[1]: sshd@10-10.243.74.166:22-101.204.251.230:12447.service: Deactivated successfully.
Mar 2 12:54:24.220155 containerd[1566]: 2026-03-02 12:54:23.998 [INFO][4429] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0 calico-apiserver-7ccfb58f8- calico-system f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24 867 0 2026-03-02 12:53:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7ccfb58f8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com calico-apiserver-7ccfb58f8-5k2tw eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1f3ee9c9d23 [] [] }} ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-"
Mar 2 12:54:24.220155 containerd[1566]: 2026-03-02 12:54:23.999 [INFO][4429] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.220155 containerd[1566]: 2026-03-02 12:54:24.113 [INFO][4441] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" HandleID="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.127 [INFO][4441] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" HandleID="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000312580), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"calico-apiserver-7ccfb58f8-5k2tw", "timestamp":"2026-03-02 12:54:24.11371438 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002fe2c0)}
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.127 [INFO][4441] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.127 [INFO][4441] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.127 [INFO][4441] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com'
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.132 [INFO][4441] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.140 [INFO][4441] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.148 [INFO][4441] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.151 [INFO][4441] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.220752 containerd[1566]: 2026-03-02 12:54:24.155 [INFO][4441] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.155 [INFO][4441] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.157 [INFO][4441] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.163 [INFO][4441] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.172 [INFO][4441] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.2/26] block=192.168.69.0/26 handle="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.173 [INFO][4441] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.2/26] handle="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" host="srv-zvfam.gb1.brightbox.com"
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.173 [INFO][4441] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 2 12:54:24.222049 containerd[1566]: 2026-03-02 12:54:24.173 [INFO][4441] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.2/26] IPv6=[] ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" HandleID="k8s-pod-network.140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.223869 containerd[1566]: 2026-03-02 12:54:24.177 [INFO][4429] cni-plugin/k8s.go 418: Populated endpoint ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0", GenerateName:"calico-apiserver-7ccfb58f8-", Namespace:"calico-system", SelfLink:"", UID:"f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ccfb58f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-7ccfb58f8-5k2tw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1f3ee9c9d23", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:54:24.224190 containerd[1566]: 2026-03-02 12:54:24.177 [INFO][4429] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.2/32] ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.224190 containerd[1566]: 2026-03-02 12:54:24.177 [INFO][4429] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f3ee9c9d23 ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.224190 containerd[1566]: 2026-03-02 12:54:24.181 [INFO][4429] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.224508 containerd[1566]: 2026-03-02 12:54:24.181 [INFO][4429] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0", GenerateName:"calico-apiserver-7ccfb58f8-", Namespace:"calico-system", SelfLink:"", UID:"f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24", ResourceVersion:"867", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ccfb58f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2", Pod:"calico-apiserver-7ccfb58f8-5k2tw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1f3ee9c9d23", MAC:"d2:42:4c:1e:b8:38", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 2 12:54:24.224621 containerd[1566]: 2026-03-02 12:54:24.202 [INFO][4429] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-5k2tw" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--5k2tw-eth0"
Mar 2 12:54:24.294202 containerd[1566]: time="2026-03-02T12:54:24.294126906Z" level=info msg="connecting to shim 140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2" address="unix:///run/containerd/s/cc688b100dd1e500eddd3c4aca0c7463ddf8baf6de96874924ae93e3a28e82e7" namespace=k8s.io protocol=ttrpc version=3
Mar 2 12:54:24.345652 systemd[1]: Started cri-containerd-140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2.scope - libcontainer container 140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2.
Mar 2 12:54:24.424024 containerd[1566]: time="2026-03-02T12:54:24.423928356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-5k2tw,Uid:f421e0e8-37b5-4f11-a6ed-d6be8a3f5f24,Namespace:calico-system,Attempt:0,} returns sandbox id \"140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2\"" Mar 2 12:54:24.427557 containerd[1566]: time="2026-03-02T12:54:24.427136780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\"" Mar 2 12:54:24.896247 containerd[1566]: time="2026-03-02T12:54:24.896122384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-56k4g,Uid:abbdfe3c-56e5-4932-b650-1489a1c6d2bc,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:24.896786 containerd[1566]: time="2026-03-02T12:54:24.896715389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b77d7d8f9-wth9b,Uid:a2bcb1e0-0c20-4528-b0fb-910750f04440,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:24.897704 containerd[1566]: time="2026-03-02T12:54:24.897661855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-hkzqj,Uid:bbd5153d-dba2-4511-a4b0-736593de7fc7,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:25.187321 systemd-networkd[1501]: cali7d7bdf25906: Link UP Mar 2 12:54:25.189301 systemd-networkd[1501]: cali7d7bdf25906: Gained carrier Mar 2 12:54:25.216905 containerd[1566]: 2026-03-02 12:54:24.998 [INFO][4520] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0 csi-node-driver- calico-system abbdfe3c-56e5-4932-b650-1489a1c6d2bc 707 0 2026-03-02 12:53:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:7494d65b57 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com csi-node-driver-56k4g eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7d7bdf25906 [] [] }} ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-" Mar 2 12:54:25.216905 containerd[1566]: 2026-03-02 12:54:24.999 [INFO][4520] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.216905 containerd[1566]: 2026-03-02 12:54:25.086 [INFO][4554] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" HandleID="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Workload="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.107 [INFO][4554] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" HandleID="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Workload="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e8170), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"csi-node-driver-56k4g", "timestamp":"2026-03-02 12:54:25.08627149 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002c8000)} Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.107 [INFO][4554] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.107 [INFO][4554] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.108 [INFO][4554] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com' Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.113 [INFO][4554] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.126 [INFO][4554] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.134 [INFO][4554] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.139 [INFO][4554] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.218900 containerd[1566]: 2026-03-02 12:54:25.144 [INFO][4554] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.145 [INFO][4554] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.154 [INFO][4554] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45 Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.164 [INFO][4554] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.175 [INFO][4554] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.3/26] block=192.168.69.0/26 handle="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.175 [INFO][4554] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.3/26] handle="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.176 [INFO][4554] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 2 12:54:25.219264 containerd[1566]: 2026-03-02 12:54:25.176 [INFO][4554] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.3/26] IPv6=[] ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" HandleID="k8s-pod-network.f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Workload="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.220372 containerd[1566]: 2026-03-02 12:54:25.182 [INFO][4520] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"abbdfe3c-56e5-4932-b650-1489a1c6d2bc", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7494d65b57", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-56k4g", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.69.3/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7d7bdf25906", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:25.220703 containerd[1566]: 2026-03-02 12:54:25.182 [INFO][4520] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.3/32] ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.220703 containerd[1566]: 2026-03-02 12:54:25.183 [INFO][4520] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7d7bdf25906 ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.220703 containerd[1566]: 2026-03-02 12:54:25.190 [INFO][4520] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.221021 containerd[1566]: 2026-03-02 12:54:25.191 [INFO][4520] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"abbdfe3c-56e5-4932-b650-1489a1c6d2bc", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7494d65b57", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45", Pod:"csi-node-driver-56k4g", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.69.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7d7bdf25906", MAC:"d6:6b:63:3e:2b:08", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:25.221156 containerd[1566]: 2026-03-02 12:54:25.211 [INFO][4520] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" Namespace="calico-system" Pod="csi-node-driver-56k4g" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-csi--node--driver--56k4g-eth0" Mar 2 12:54:25.277862 systemd-networkd[1501]: cali1f3ee9c9d23: Gained IPv6LL Mar 2 12:54:25.283351 containerd[1566]: time="2026-03-02T12:54:25.283287147Z" level=info msg="connecting to shim f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45" 
address="unix:///run/containerd/s/aa13474d37ba1056157b9beb1b94740c0ed3900a4d57370cab155e930496f9f6" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:54:25.302850 systemd-networkd[1501]: cali79217987749: Link UP Mar 2 12:54:25.304840 systemd-networkd[1501]: cali79217987749: Gained carrier Mar 2 12:54:25.368180 containerd[1566]: 2026-03-02 12:54:25.030 [INFO][4526] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0 calico-kube-controllers-6b77d7d8f9- calico-system a2bcb1e0-0c20-4528-b0fb-910750f04440 863 0 2026-03-02 12:53:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b77d7d8f9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com calico-kube-controllers-6b77d7d8f9-wth9b eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali79217987749 [] [] }} ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-" Mar 2 12:54:25.368180 containerd[1566]: 2026-03-02 12:54:25.031 [INFO][4526] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.368180 containerd[1566]: 2026-03-02 12:54:25.112 [INFO][4560] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" HandleID="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.126 [INFO][4560] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" HandleID="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00040fb70), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"calico-kube-controllers-6b77d7d8f9-wth9b", "timestamp":"2026-03-02 12:54:25.112446456 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00036a580)} Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.126 [INFO][4560] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.176 [INFO][4560] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.176 [INFO][4560] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com' Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.220 [INFO][4560] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.231 [INFO][4560] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.242 [INFO][4560] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.247 [INFO][4560] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.368616 containerd[1566]: 2026-03-02 12:54:25.252 [INFO][4560] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.253 [INFO][4560] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.257 [INFO][4560] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2 Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.269 [INFO][4560] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.283 [INFO][4560] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.69.4/26] block=192.168.69.0/26 handle="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.283 [INFO][4560] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.4/26] handle="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.284 [INFO][4560] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:54:25.369026 containerd[1566]: 2026-03-02 12:54:25.284 [INFO][4560] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.4/26] IPv6=[] ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" HandleID="k8s-pod-network.de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.369277 containerd[1566]: 2026-03-02 12:54:25.292 [INFO][4526] cni-plugin/k8s.go 418: Populated endpoint ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0", GenerateName:"calico-kube-controllers-6b77d7d8f9-", Namespace:"calico-system", SelfLink:"", UID:"a2bcb1e0-0c20-4528-b0fb-910750f04440", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b77d7d8f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-6b77d7d8f9-wth9b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.69.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali79217987749", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:25.369365 containerd[1566]: 2026-03-02 12:54:25.293 [INFO][4526] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.4/32] ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.369365 containerd[1566]: 2026-03-02 12:54:25.293 [INFO][4526] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79217987749 ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.369365 containerd[1566]: 2026-03-02 12:54:25.306 [INFO][4526] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.371934 containerd[1566]: 2026-03-02 12:54:25.311 [INFO][4526] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0", GenerateName:"calico-kube-controllers-6b77d7d8f9-", Namespace:"calico-system", SelfLink:"", UID:"a2bcb1e0-0c20-4528-b0fb-910750f04440", ResourceVersion:"863", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b77d7d8f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2", Pod:"calico-kube-controllers-6b77d7d8f9-wth9b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.69.4/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali79217987749", MAC:"c2:e0:33:1e:59:80", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:25.372047 containerd[1566]: 2026-03-02 12:54:25.355 [INFO][4526] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" Namespace="calico-system" Pod="calico-kube-controllers-6b77d7d8f9-wth9b" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--kube--controllers--6b77d7d8f9--wth9b-eth0" Mar 2 12:54:25.392969 systemd[1]: Started cri-containerd-f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45.scope - libcontainer container f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45. Mar 2 12:54:25.450763 containerd[1566]: time="2026-03-02T12:54:25.450302562Z" level=info msg="connecting to shim de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2" address="unix:///run/containerd/s/da35c1ba5a0d47b7168543eeca72395503a61eb620ecff8766add70cab100520" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:54:25.463874 systemd-networkd[1501]: cali5b2f3efe998: Link UP Mar 2 12:54:25.468303 systemd-networkd[1501]: cali5b2f3efe998: Gained carrier Mar 2 12:54:25.520552 containerd[1566]: 2026-03-02 12:54:25.040 [INFO][4537] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0 calico-apiserver-7ccfb58f8- calico-system bbd5153d-dba2-4511-a4b0-736593de7fc7 870 0 2026-03-02 12:53:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7ccfb58f8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com calico-apiserver-7ccfb58f8-hkzqj eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali5b2f3efe998 [] [] }} ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-" Mar 2 12:54:25.520552 containerd[1566]: 2026-03-02 12:54:25.041 [INFO][4537] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.520552 containerd[1566]: 2026-03-02 12:54:25.141 [INFO][4565] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" HandleID="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.157 [INFO][4565] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" HandleID="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fde80), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"calico-apiserver-7ccfb58f8-hkzqj", "timestamp":"2026-03-02 12:54:25.141771077 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001886e0)} Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.157 [INFO][4565] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.284 [INFO][4565] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.284 [INFO][4565] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com' Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.323 [INFO][4565] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.347 [INFO][4565] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.376 [INFO][4565] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.392 [INFO][4565] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.521083 containerd[1566]: 2026-03-02 12:54:25.404 [INFO][4565] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.404 [INFO][4565] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.409 [INFO][4565] ipam/ipam.go 1806: Creating new 
handle: k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.427 [INFO][4565] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.443 [INFO][4565] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.5/26] block=192.168.69.0/26 handle="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.443 [INFO][4565] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.5/26] handle="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.443 [INFO][4565] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 2 12:54:25.522036 containerd[1566]: 2026-03-02 12:54:25.443 [INFO][4565] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.5/26] IPv6=[] ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" HandleID="k8s-pod-network.01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Workload="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.522403 containerd[1566]: 2026-03-02 12:54:25.449 [INFO][4537] cni-plugin/k8s.go 418: Populated endpoint ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0", GenerateName:"calico-apiserver-7ccfb58f8-", Namespace:"calico-system", SelfLink:"", UID:"bbd5153d-dba2-4511-a4b0-736593de7fc7", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ccfb58f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-7ccfb58f8-hkzqj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.5/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5b2f3efe998", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:25.522545 containerd[1566]: 2026-03-02 12:54:25.450 [INFO][4537] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.5/32] ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.522545 containerd[1566]: 2026-03-02 12:54:25.450 [INFO][4537] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5b2f3efe998 ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.522545 containerd[1566]: 2026-03-02 12:54:25.468 [INFO][4537] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.522677 containerd[1566]: 2026-03-02 12:54:25.479 [INFO][4537] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0", GenerateName:"calico-apiserver-7ccfb58f8-", Namespace:"calico-system", SelfLink:"", UID:"bbd5153d-dba2-4511-a4b0-736593de7fc7", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7ccfb58f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa", Pod:"calico-apiserver-7ccfb58f8-hkzqj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5b2f3efe998", MAC:"0a:60:5d:ee:95:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:25.522767 containerd[1566]: 2026-03-02 12:54:25.512 [INFO][4537] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" Namespace="calico-system" Pod="calico-apiserver-7ccfb58f8-hkzqj" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-calico--apiserver--7ccfb58f8--hkzqj-eth0" Mar 2 12:54:25.551747 systemd[1]: Started cri-containerd-de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2.scope - libcontainer 
container de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2. Mar 2 12:54:25.590355 containerd[1566]: time="2026-03-02T12:54:25.589962996Z" level=info msg="connecting to shim 01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa" address="unix:///run/containerd/s/b82f2572f50270b21a2ae77911604ca6a109885c38b0e83297cc2ccb91f2fa58" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:54:25.693867 containerd[1566]: time="2026-03-02T12:54:25.693163757Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-56k4g,Uid:abbdfe3c-56e5-4932-b650-1489a1c6d2bc,Namespace:calico-system,Attempt:0,} returns sandbox id \"f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45\"" Mar 2 12:54:25.701830 systemd[1]: Started cri-containerd-01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa.scope - libcontainer container 01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa. Mar 2 12:54:25.794360 containerd[1566]: time="2026-03-02T12:54:25.794288337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b77d7d8f9-wth9b,Uid:a2bcb1e0-0c20-4528-b0fb-910750f04440,Namespace:calico-system,Attempt:0,} returns sandbox id \"de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2\"" Mar 2 12:54:25.872208 containerd[1566]: time="2026-03-02T12:54:25.872141092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7ccfb58f8-hkzqj,Uid:bbd5153d-dba2-4511-a4b0-736593de7fc7,Namespace:calico-system,Attempt:0,} returns sandbox id \"01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa\"" Mar 2 12:54:25.899143 containerd[1566]: time="2026-03-02T12:54:25.898920938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nzwgb,Uid:92e54f41-dc74-4b73-a361-cd79b67e3382,Namespace:kube-system,Attempt:0,}" Mar 2 12:54:26.155005 systemd-networkd[1501]: calidf6237a57bc: Link UP Mar 2 12:54:26.159252 systemd-networkd[1501]: calidf6237a57bc: Gained 
carrier Mar 2 12:54:26.198028 containerd[1566]: 2026-03-02 12:54:25.981 [INFO][4757] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0 coredns-674b8bbfcf- kube-system 92e54f41-dc74-4b73-a361-cd79b67e3382 859 0 2026-03-02 12:53:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com coredns-674b8bbfcf-nzwgb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calidf6237a57bc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-" Mar 2 12:54:26.198028 containerd[1566]: 2026-03-02 12:54:25.981 [INFO][4757] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.198028 containerd[1566]: 2026-03-02 12:54:26.053 [INFO][4771] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" HandleID="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Workload="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.066 [INFO][4771] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" HandleID="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" 
Workload="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fb20), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-nzwgb", "timestamp":"2026-03-02 12:54:26.053212145 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000376420)} Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.066 [INFO][4771] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.066 [INFO][4771] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.066 [INFO][4771] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com' Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.072 [INFO][4771] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.087 [INFO][4771] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.102 [INFO][4771] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.107 [INFO][4771] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198398 containerd[1566]: 2026-03-02 12:54:26.112 [INFO][4771] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 
host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.112 [INFO][4771] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.116 [INFO][4771] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2 Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.127 [INFO][4771] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.142 [INFO][4771] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.6/26] block=192.168.69.0/26 handle="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.143 [INFO][4771] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.6/26] handle="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.144 [INFO][4771] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 2 12:54:26.198986 containerd[1566]: 2026-03-02 12:54:26.144 [INFO][4771] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.6/26] IPv6=[] ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" HandleID="k8s-pod-network.8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Workload="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.201084 containerd[1566]: 2026-03-02 12:54:26.149 [INFO][4757] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"92e54f41-dc74-4b73-a361-cd79b67e3382", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-nzwgb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calidf6237a57bc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:26.201084 containerd[1566]: 2026-03-02 12:54:26.150 [INFO][4757] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.6/32] ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.201084 containerd[1566]: 2026-03-02 12:54:26.150 [INFO][4757] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidf6237a57bc ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.201084 containerd[1566]: 2026-03-02 12:54:26.158 [INFO][4757] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.201084 containerd[1566]: 2026-03-02 12:54:26.160 [INFO][4757] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" 
WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"92e54f41-dc74-4b73-a361-cd79b67e3382", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2", Pod:"coredns-674b8bbfcf-nzwgb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidf6237a57bc", MAC:"56:1d:01:05:7d:76", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:26.201084 
containerd[1566]: 2026-03-02 12:54:26.192 [INFO][4757] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" Namespace="kube-system" Pod="coredns-674b8bbfcf-nzwgb" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--nzwgb-eth0" Mar 2 12:54:26.283290 containerd[1566]: time="2026-03-02T12:54:26.282566850Z" level=info msg="connecting to shim 8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2" address="unix:///run/containerd/s/ce608099ccf59a852226602bba787e02b3feb843e5ce7ea2d00c4e6779d20f46" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:54:26.334840 systemd[1]: Started cri-containerd-8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2.scope - libcontainer container 8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2. Mar 2 12:54:26.464033 containerd[1566]: time="2026-03-02T12:54:26.462518671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-nzwgb,Uid:92e54f41-dc74-4b73-a361-cd79b67e3382,Namespace:kube-system,Attempt:0,} returns sandbox id \"8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2\"" Mar 2 12:54:26.473423 containerd[1566]: time="2026-03-02T12:54:26.473358264Z" level=info msg="CreateContainer within sandbox \"8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 2 12:54:26.514644 containerd[1566]: time="2026-03-02T12:54:26.514572596Z" level=info msg="Container 85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:26.516370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount115806460.mount: Deactivated successfully. 
Mar 2 12:54:26.527589 containerd[1566]: time="2026-03-02T12:54:26.527535089Z" level=info msg="CreateContainer within sandbox \"8e557bfc93bb634f0073adfaf30dd7ac0c0af1a7f3e4edc0f22ca7e7b1810fe2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e\"" Mar 2 12:54:26.529001 containerd[1566]: time="2026-03-02T12:54:26.528888601Z" level=info msg="StartContainer for \"85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e\"" Mar 2 12:54:26.534636 containerd[1566]: time="2026-03-02T12:54:26.534562165Z" level=info msg="connecting to shim 85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e" address="unix:///run/containerd/s/ce608099ccf59a852226602bba787e02b3feb843e5ce7ea2d00c4e6779d20f46" protocol=ttrpc version=3 Mar 2 12:54:26.562755 systemd[1]: Started cri-containerd-85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e.scope - libcontainer container 85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e. 
Mar 2 12:54:26.630928 containerd[1566]: time="2026-03-02T12:54:26.630833148Z" level=info msg="StartContainer for \"85bb882c51b67fa0db7901798d4b6d4bc9d2d3edd0c63eb86c8e0dc2ae9dd55e\" returns successfully" Mar 2 12:54:26.902311 containerd[1566]: time="2026-03-02T12:54:26.902155474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfkbm,Uid:97350222-5e6a-4fc2-9dd6-1df0b4873374,Namespace:kube-system,Attempt:0,}" Mar 2 12:54:26.904258 containerd[1566]: time="2026-03-02T12:54:26.902155468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-h9nvt,Uid:d0fb2ace-0bce-4453-8e74-079c4a488b9c,Namespace:calico-system,Attempt:0,}" Mar 2 12:54:27.005023 systemd-networkd[1501]: cali7d7bdf25906: Gained IPv6LL Mar 2 12:54:27.071238 systemd-networkd[1501]: cali5b2f3efe998: Gained IPv6LL Mar 2 12:54:27.324696 systemd-networkd[1501]: cali79217987749: Gained IPv6LL Mar 2 12:54:27.432782 systemd-networkd[1501]: calia76c824dd11: Link UP Mar 2 12:54:27.435722 systemd-networkd[1501]: calia76c824dd11: Gained carrier Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.073 [INFO][4879] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0 coredns-674b8bbfcf- kube-system 97350222-5e6a-4fc2-9dd6-1df0b4873374 869 0 2026-03-02 12:53:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com coredns-674b8bbfcf-vfkbm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia76c824dd11 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" 
WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.073 [INFO][4879] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.205 [INFO][4910] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" HandleID="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Workload="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.227 [INFO][4910] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" HandleID="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Workload="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fb730), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-vfkbm", "timestamp":"2026-03-02 12:54:27.205898448 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00041f080)} Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.227 [INFO][4910] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.227 [INFO][4910] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.227 [INFO][4910] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com' Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.231 [INFO][4910] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.328 [INFO][4910] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.352 [INFO][4910] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.360 [INFO][4910] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.368 [INFO][4910] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.369 [INFO][4910] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.69.0/26 handle="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.375 [INFO][4910] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93 Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.388 [INFO][4910] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" 
host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.403 [INFO][4910] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.7/26] block=192.168.69.0/26 handle="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.403 [INFO][4910] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.7/26] handle="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.405 [INFO][4910] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 2 12:54:27.488310 containerd[1566]: 2026-03-02 12:54:27.405 [INFO][4910] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.7/26] IPv6=[] ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" HandleID="k8s-pod-network.4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Workload="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.492308 containerd[1566]: 2026-03-02 12:54:27.415 [INFO][4879] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"97350222-5e6a-4fc2-9dd6-1df0b4873374", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-vfkbm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia76c824dd11", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:27.492308 containerd[1566]: 2026-03-02 12:54:27.415 [INFO][4879] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.7/32] ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.492308 containerd[1566]: 2026-03-02 12:54:27.415 [INFO][4879] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia76c824dd11 ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" 
WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.492308 containerd[1566]: 2026-03-02 12:54:27.439 [INFO][4879] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.492308 containerd[1566]: 2026-03-02 12:54:27.443 [INFO][4879] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"97350222-5e6a-4fc2-9dd6-1df0b4873374", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93", Pod:"coredns-674b8bbfcf-vfkbm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.7/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia76c824dd11", MAC:"12:30:c2:b4:bf:18", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:27.492308 containerd[1566]: 2026-03-02 12:54:27.469 [INFO][4879] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" Namespace="kube-system" Pod="coredns-674b8bbfcf-vfkbm" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-coredns--674b8bbfcf--vfkbm-eth0" Mar 2 12:54:27.548962 systemd-networkd[1501]: cali4e28b60cd1a: Link UP Mar 2 12:54:27.550701 systemd-networkd[1501]: cali4e28b60cd1a: Gained carrier Mar 2 12:54:27.592605 kubelet[2906]: I0302 12:54:27.591991 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-nzwgb" podStartSLOduration=62.591962038 podStartE2EDuration="1m2.591962038s" podCreationTimestamp="2026-03-02 12:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:54:27.562394693 +0000 UTC m=+67.986201749" watchObservedRunningTime="2026-03-02 12:54:27.591962038 +0000 UTC m=+68.015769100" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.099 [INFO][4889] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0 goldmane-9566f57b5- calico-system d0fb2ace-0bce-4453-8e74-079c4a488b9c 871 0 2026-03-02 12:53:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9566f57b5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-zvfam.gb1.brightbox.com goldmane-9566f57b5-h9nvt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4e28b60cd1a [] [] }} ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.099 [INFO][4889] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.229 [INFO][4916] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" HandleID="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Workload="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.240 [INFO][4916] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" HandleID="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Workload="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f810), 
Attrs:map[string]string{"namespace":"calico-system", "node":"srv-zvfam.gb1.brightbox.com", "pod":"goldmane-9566f57b5-h9nvt", "timestamp":"2026-03-02 12:54:27.229814354 +0000 UTC"}, Hostname:"srv-zvfam.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188dc0)} Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.240 [INFO][4916] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.406 [INFO][4916] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.406 [INFO][4916] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-zvfam.gb1.brightbox.com' Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.415 [INFO][4916] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.439 [INFO][4916] ipam/ipam.go 409: Looking up existing affinities for host host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.464 [INFO][4916] ipam/ipam.go 526: Trying affinity for 192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.473 [INFO][4916] ipam/ipam.go 160: Attempting to load block cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.482 [INFO][4916] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.69.0/26 host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.482 [INFO][4916] ipam/ipam.go 1245: Attempting to assign 1 
addresses from block block=192.168.69.0/26 handle="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.485 [INFO][4916] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49 Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.494 [INFO][4916] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.69.0/26 handle="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.517 [INFO][4916] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.69.8/26] block=192.168.69.0/26 handle="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.518 [INFO][4916] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.69.8/26] handle="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" host="srv-zvfam.gb1.brightbox.com" Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.518 [INFO][4916] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 2 12:54:27.608994 containerd[1566]: 2026-03-02 12:54:27.521 [INFO][4916] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.69.8/26] IPv6=[] ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" HandleID="k8s-pod-network.517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Workload="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.613755 containerd[1566]: 2026-03-02 12:54:27.532 [INFO][4889] cni-plugin/k8s.go 418: Populated endpoint ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0", GenerateName:"goldmane-9566f57b5-", Namespace:"calico-system", SelfLink:"", UID:"d0fb2ace-0bce-4453-8e74-079c4a488b9c", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9566f57b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-9566f57b5-h9nvt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.69.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"cali4e28b60cd1a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:27.613755 containerd[1566]: 2026-03-02 12:54:27.533 [INFO][4889] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.69.8/32] ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.613755 containerd[1566]: 2026-03-02 12:54:27.533 [INFO][4889] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e28b60cd1a ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.613755 containerd[1566]: 2026-03-02 12:54:27.555 [INFO][4889] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.613755 containerd[1566]: 2026-03-02 12:54:27.561 [INFO][4889] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0", GenerateName:"goldmane-9566f57b5-", Namespace:"calico-system", SelfLink:"", 
UID:"d0fb2ace-0bce-4453-8e74-079c4a488b9c", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2026, time.March, 2, 12, 53, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9566f57b5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-zvfam.gb1.brightbox.com", ContainerID:"517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49", Pod:"goldmane-9566f57b5-h9nvt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.69.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e28b60cd1a", MAC:"5a:f2:27:79:02:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 2 12:54:27.613755 containerd[1566]: 2026-03-02 12:54:27.586 [INFO][4889] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" Namespace="calico-system" Pod="goldmane-9566f57b5-h9nvt" WorkloadEndpoint="srv--zvfam.gb1.brightbox.com-k8s-goldmane--9566f57b5--h9nvt-eth0" Mar 2 12:54:27.631987 containerd[1566]: time="2026-03-02T12:54:27.631838884Z" level=info msg="connecting to shim 4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93" address="unix:///run/containerd/s/8348f20d149f6b4caaef5f8915a96bf661279d96a1bd3e9bed65de594968c98d" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:54:27.731931 containerd[1566]: time="2026-03-02T12:54:27.731859143Z" level=info 
msg="connecting to shim 517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49" address="unix:///run/containerd/s/27cbd51966a8f10c41af4da386b6d18f1de0b984a59f276828a376a423280114" namespace=k8s.io protocol=ttrpc version=3 Mar 2 12:54:27.791321 systemd[1]: Started cri-containerd-4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93.scope - libcontainer container 4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93. Mar 2 12:54:27.867992 systemd[1]: Started cri-containerd-517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49.scope - libcontainer container 517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49. Mar 2 12:54:28.002163 containerd[1566]: time="2026-03-02T12:54:28.002094779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-vfkbm,Uid:97350222-5e6a-4fc2-9dd6-1df0b4873374,Namespace:kube-system,Attempt:0,} returns sandbox id \"4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93\"" Mar 2 12:54:28.021562 containerd[1566]: time="2026-03-02T12:54:28.021389842Z" level=info msg="CreateContainer within sandbox \"4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 2 12:54:28.049774 containerd[1566]: time="2026-03-02T12:54:28.049712986Z" level=info msg="Container 3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:28.075005 containerd[1566]: time="2026-03-02T12:54:28.074875552Z" level=info msg="CreateContainer within sandbox \"4d0678108526249ea153d1dcd564a8cff4d3ecd751a94d86b42827e2ac9bcf93\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6\"" Mar 2 12:54:28.078803 containerd[1566]: time="2026-03-02T12:54:28.078763935Z" level=info msg="StartContainer for \"3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6\"" Mar 2 
12:54:28.081466 containerd[1566]: time="2026-03-02T12:54:28.081360851Z" level=info msg="connecting to shim 3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6" address="unix:///run/containerd/s/8348f20d149f6b4caaef5f8915a96bf661279d96a1bd3e9bed65de594968c98d" protocol=ttrpc version=3 Mar 2 12:54:28.157996 systemd-networkd[1501]: calidf6237a57bc: Gained IPv6LL Mar 2 12:54:28.163717 systemd[1]: Started cri-containerd-3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6.scope - libcontainer container 3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6. Mar 2 12:54:28.195765 containerd[1566]: time="2026-03-02T12:54:28.195709867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9566f57b5-h9nvt,Uid:d0fb2ace-0bce-4453-8e74-079c4a488b9c,Namespace:calico-system,Attempt:0,} returns sandbox id \"517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49\"" Mar 2 12:54:28.281197 containerd[1566]: time="2026-03-02T12:54:28.281116043Z" level=info msg="StartContainer for \"3c039cbe967add58e9fc3dd03261ebe0112ef8b44d38b7703ae2f974909e75b6\" returns successfully" Mar 2 12:54:28.574433 kubelet[2906]: I0302 12:54:28.574242 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-vfkbm" podStartSLOduration=63.574216868 podStartE2EDuration="1m3.574216868s" podCreationTimestamp="2026-03-02 12:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-02 12:54:28.54453608 +0000 UTC m=+68.968343151" watchObservedRunningTime="2026-03-02 12:54:28.574216868 +0000 UTC m=+68.998023909" Mar 2 12:54:28.924749 systemd-networkd[1501]: cali4e28b60cd1a: Gained IPv6LL Mar 2 12:54:29.180733 systemd-networkd[1501]: calia76c824dd11: Gained IPv6LL Mar 2 12:54:29.556187 containerd[1566]: time="2026-03-02T12:54:29.555793263Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:29.558301 containerd[1566]: time="2026-03-02T12:54:29.558243486Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.3: active requests=0, bytes read=48403149" Mar 2 12:54:29.564212 containerd[1566]: time="2026-03-02T12:54:29.563094949Z" level=info msg="ImageCreate event name:\"sha256:ac46eecb3d7f840a860cf32547a175e8efb0ec76cc6ff942e75d49177b70c694\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:29.568419 containerd[1566]: time="2026-03-02T12:54:29.568358870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:c2def03be7412561bd678df17fcf2467cac990dbb42278dcfe193aa5a43128d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:29.570439 containerd[1566]: time="2026-03-02T12:54:29.569653092Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" with image id \"sha256:ac46eecb3d7f840a860cf32547a175e8efb0ec76cc6ff942e75d49177b70c694\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:c2def03be7412561bd678df17fcf2467cac990dbb42278dcfe193aa5a43128d4\", size \"49959210\" in 5.14247142s" Mar 2 12:54:29.570439 containerd[1566]: time="2026-03-02T12:54:29.569704106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" returns image reference \"sha256:ac46eecb3d7f840a860cf32547a175e8efb0ec76cc6ff942e75d49177b70c694\"" Mar 2 12:54:29.571334 containerd[1566]: time="2026-03-02T12:54:29.571277548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.3\"" Mar 2 12:54:29.575701 containerd[1566]: time="2026-03-02T12:54:29.575602416Z" level=info msg="CreateContainer within sandbox \"140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 2 12:54:29.586701 containerd[1566]: 
time="2026-03-02T12:54:29.586648030Z" level=info msg="Container e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:29.607352 containerd[1566]: time="2026-03-02T12:54:29.607301619Z" level=info msg="CreateContainer within sandbox \"140cc956723429efff2440d52fcc1578f2c683a7c1df07a9fa29f8d136be75b2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1\"" Mar 2 12:54:29.608591 containerd[1566]: time="2026-03-02T12:54:29.608562300Z" level=info msg="StartContainer for \"e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1\"" Mar 2 12:54:29.610938 containerd[1566]: time="2026-03-02T12:54:29.610877530Z" level=info msg="connecting to shim e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1" address="unix:///run/containerd/s/cc688b100dd1e500eddd3c4aca0c7463ddf8baf6de96874924ae93e3a28e82e7" protocol=ttrpc version=3 Mar 2 12:54:29.669760 systemd[1]: Started cri-containerd-e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1.scope - libcontainer container e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1. 
Mar 2 12:54:29.755395 containerd[1566]: time="2026-03-02T12:54:29.755324823Z" level=info msg="StartContainer for \"e33cc959dafddcf9f32034e3d46cf1eb77fb171192c66bd0c639bcda874e43c1\" returns successfully" Mar 2 12:54:31.543978 kubelet[2906]: I0302 12:54:31.543028 2906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 2 12:54:31.599496 containerd[1566]: time="2026-03-02T12:54:31.599050708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:31.600941 containerd[1566]: time="2026-03-02T12:54:31.600493020Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.3: active requests=0, bytes read=8793087" Mar 2 12:54:31.603233 containerd[1566]: time="2026-03-02T12:54:31.603042978Z" level=info msg="ImageCreate event name:\"sha256:6f60b868a297033aea2daba09eb6f77fb2390c659bbc8dfaaac24f32f5b84e27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:31.608492 containerd[1566]: time="2026-03-02T12:54:31.607293047Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:3d04cd6265f850f0420b413351275ebfd244991b1b9e69c64efe8b4eff45b53f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:31.608492 containerd[1566]: time="2026-03-02T12:54:31.608410235Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.3\" with image id \"sha256:6f60b868a297033aea2daba09eb6f77fb2390c659bbc8dfaaac24f32f5b84e27\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:3d04cd6265f850f0420b413351275ebfd244991b1b9e69c64efe8b4eff45b53f\", size \"10349132\" in 2.03708313s" Mar 2 12:54:31.609113 containerd[1566]: time="2026-03-02T12:54:31.608780254Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.3\" returns image reference \"sha256:6f60b868a297033aea2daba09eb6f77fb2390c659bbc8dfaaac24f32f5b84e27\"" Mar 2 12:54:31.610262 containerd[1566]: 
time="2026-03-02T12:54:31.610226890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\"" Mar 2 12:54:31.618285 containerd[1566]: time="2026-03-02T12:54:31.618244131Z" level=info msg="CreateContainer within sandbox \"f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 2 12:54:31.644478 containerd[1566]: time="2026-03-02T12:54:31.644006952Z" level=info msg="Container 49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:31.652154 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1288640237.mount: Deactivated successfully. Mar 2 12:54:31.666824 containerd[1566]: time="2026-03-02T12:54:31.666724147Z" level=info msg="CreateContainer within sandbox \"f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201\"" Mar 2 12:54:31.668056 containerd[1566]: time="2026-03-02T12:54:31.668014322Z" level=info msg="StartContainer for \"49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201\"" Mar 2 12:54:31.670743 containerd[1566]: time="2026-03-02T12:54:31.670667928Z" level=info msg="connecting to shim 49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201" address="unix:///run/containerd/s/aa13474d37ba1056157b9beb1b94740c0ed3900a4d57370cab155e930496f9f6" protocol=ttrpc version=3 Mar 2 12:54:31.716766 systemd[1]: Started cri-containerd-49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201.scope - libcontainer container 49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201. 
Mar 2 12:54:31.841386 containerd[1566]: time="2026-03-02T12:54:31.841118307Z" level=info msg="StartContainer for \"49fad9eddb84b63c3197ee27ee429b061081199900b13333429caaf09e605201\" returns successfully" Mar 2 12:54:35.820759 containerd[1566]: time="2026-03-02T12:54:35.820213792Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:35.823333 containerd[1566]: time="2026-03-02T12:54:35.822972106Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.3: active requests=0, bytes read=52396348" Mar 2 12:54:35.826626 containerd[1566]: time="2026-03-02T12:54:35.826545298Z" level=info msg="ImageCreate event name:\"sha256:95bc8e4bc61e762d7451304ff00b4ebc2aed857d8698340cb94b885328290dfe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:35.835950 containerd[1566]: time="2026-03-02T12:54:35.835852130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:081fd6c3de7754ba9892532b2c7c6cae9ba7bd1cca4c42e4590ee8d0f5a5696b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:35.837544 containerd[1566]: time="2026-03-02T12:54:35.837498796Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\" with image id \"sha256:95bc8e4bc61e762d7451304ff00b4ebc2aed857d8698340cb94b885328290dfe\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:081fd6c3de7754ba9892532b2c7c6cae9ba7bd1cca4c42e4590ee8d0f5a5696b\", size \"53952361\" in 4.227218859s" Mar 2 12:54:35.837857 containerd[1566]: time="2026-03-02T12:54:35.837695453Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.3\" returns image reference \"sha256:95bc8e4bc61e762d7451304ff00b4ebc2aed857d8698340cb94b885328290dfe\"" Mar 2 12:54:35.860839 containerd[1566]: time="2026-03-02T12:54:35.860698232Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\"" Mar 2 12:54:35.979410 containerd[1566]: time="2026-03-02T12:54:35.979196193Z" level=info msg="CreateContainer within sandbox \"de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 2 12:54:36.032903 containerd[1566]: time="2026-03-02T12:54:36.032841690Z" level=info msg="Container 5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:36.042050 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount5057949.mount: Deactivated successfully. Mar 2 12:54:36.090616 containerd[1566]: time="2026-03-02T12:54:36.089854866Z" level=info msg="CreateContainer within sandbox \"de09f6dd337dc8acc92fefceda5f18cf1d5abcfcdbe85df466e12754a49e82b2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948\"" Mar 2 12:54:36.102988 containerd[1566]: time="2026-03-02T12:54:36.102928371Z" level=info msg="StartContainer for \"5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948\"" Mar 2 12:54:36.114373 containerd[1566]: time="2026-03-02T12:54:36.114227445Z" level=info msg="connecting to shim 5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948" address="unix:///run/containerd/s/da35c1ba5a0d47b7168543eeca72395503a61eb620ecff8766add70cab100520" protocol=ttrpc version=3 Mar 2 12:54:36.194800 systemd[1]: Started cri-containerd-5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948.scope - libcontainer container 5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948. 
Mar 2 12:54:36.390891 containerd[1566]: time="2026-03-02T12:54:36.389235457Z" level=info msg="StartContainer for \"5f4af2d37c7277ef49552d6ea52f8af6bddcc3eaedd8ee9121fe3d5656594948\" returns successfully" Mar 2 12:54:36.473476 containerd[1566]: time="2026-03-02T12:54:36.473375545Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 2 12:54:36.477162 containerd[1566]: time="2026-03-02T12:54:36.476843682Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.3: active requests=0, bytes read=77" Mar 2 12:54:36.482583 containerd[1566]: time="2026-03-02T12:54:36.482293283Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" with image id \"sha256:ac46eecb3d7f840a860cf32547a175e8efb0ec76cc6ff942e75d49177b70c694\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:c2def03be7412561bd678df17fcf2467cac990dbb42278dcfe193aa5a43128d4\", size \"49959210\" in 621.204222ms" Mar 2 12:54:36.482583 containerd[1566]: time="2026-03-02T12:54:36.482376420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.3\" returns image reference \"sha256:ac46eecb3d7f840a860cf32547a175e8efb0ec76cc6ff942e75d49177b70c694\"" Mar 2 12:54:36.496185 containerd[1566]: time="2026-03-02T12:54:36.496036850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.3\"" Mar 2 12:54:36.506817 containerd[1566]: time="2026-03-02T12:54:36.506755125Z" level=info msg="CreateContainer within sandbox \"01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 2 12:54:36.538740 containerd[1566]: time="2026-03-02T12:54:36.538678206Z" level=info msg="Container 0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15: CDI devices from CRI Config.CDIDevices: []" Mar 2 12:54:36.573316 containerd[1566]: 
time="2026-03-02T12:54:36.573223443Z" level=info msg="CreateContainer within sandbox \"01880c29bc6c2f74112e6efe87f7e8f33f68b3c04878d2a1a9245e694d330daa\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15\"" Mar 2 12:54:36.607313 containerd[1566]: time="2026-03-02T12:54:36.603905609Z" level=info msg="StartContainer for \"0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15\"" Mar 2 12:54:36.612322 containerd[1566]: time="2026-03-02T12:54:36.612246195Z" level=info msg="connecting to shim 0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15" address="unix:///run/containerd/s/b82f2572f50270b21a2ae77911604ca6a109885c38b0e83297cc2ccb91f2fa58" protocol=ttrpc version=3 Mar 2 12:54:36.662240 systemd[1]: Started cri-containerd-0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15.scope - libcontainer container 0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15. 
Mar 2 12:54:36.901737 kubelet[2906]: I0302 12:54:36.901574 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6b77d7d8f9-wth9b" podStartSLOduration=44.826299736 podStartE2EDuration="54.882819328s" podCreationTimestamp="2026-03-02 12:53:42 +0000 UTC" firstStartedPulling="2026-03-02 12:54:25.797844329 +0000 UTC m=+66.221651356" lastFinishedPulling="2026-03-02 12:54:35.854363908 +0000 UTC m=+76.278170948" observedRunningTime="2026-03-02 12:54:36.752999626 +0000 UTC m=+77.176806685" watchObservedRunningTime="2026-03-02 12:54:36.882819328 +0000 UTC m=+77.306626370"
Mar 2 12:54:36.902572 kubelet[2906]: I0302 12:54:36.901979 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7ccfb58f8-5k2tw" podStartSLOduration=50.757956329 podStartE2EDuration="55.901957236s" podCreationTimestamp="2026-03-02 12:53:41 +0000 UTC" firstStartedPulling="2026-03-02 12:54:24.426855724 +0000 UTC m=+64.850662757" lastFinishedPulling="2026-03-02 12:54:29.570856629 +0000 UTC m=+69.994663664" observedRunningTime="2026-03-02 12:54:30.565050237 +0000 UTC m=+70.988857321" watchObservedRunningTime="2026-03-02 12:54:36.901957236 +0000 UTC m=+77.325764278"
Mar 2 12:54:37.058892 containerd[1566]: time="2026-03-02T12:54:37.058676906Z" level=info msg="StartContainer for \"0ddc098b8242f746376e4a80cb706bbba36fff9e5e15b27cb5d1a720d7f70d15\" returns successfully"
Mar 2 12:54:38.676811 kubelet[2906]: I0302 12:54:38.676644 2906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 2 12:54:40.358112 kubelet[2906]: I0302 12:54:40.357834 2906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 2 12:54:40.433567 kubelet[2906]: I0302 12:54:40.433125 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7ccfb58f8-hkzqj" podStartSLOduration=48.8143156 podStartE2EDuration="59.432899353s" podCreationTimestamp="2026-03-02 12:53:41 +0000 UTC" firstStartedPulling="2026-03-02 12:54:25.874169091 +0000 UTC m=+66.297976130" lastFinishedPulling="2026-03-02 12:54:36.492752848 +0000 UTC m=+76.916559883" observedRunningTime="2026-03-02 12:54:37.688705435 +0000 UTC m=+78.112512497" watchObservedRunningTime="2026-03-02 12:54:40.432899353 +0000 UTC m=+80.856706394"
Mar 2 12:54:40.678242 kubelet[2906]: I0302 12:54:40.677326 2906 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 2 12:54:42.724138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1036968034.mount: Deactivated successfully.
Mar 2 12:54:44.058411 containerd[1566]: time="2026-03-02T12:54:44.057634719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:44.095098 containerd[1566]: time="2026-03-02T12:54:44.095015877Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.3: active requests=0, bytes read=55607954"
Mar 2 12:54:44.169158 containerd[1566]: time="2026-03-02T12:54:44.168696600Z" level=info msg="ImageCreate event name:\"sha256:6eaae458d5f115c04bbd6cd0facdbc393958d24af9934b90825fea68960a2f1a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:44.178546 containerd[1566]: time="2026-03-02T12:54:44.178426349Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:e85ffa1d9468908b0bd44664de0d023da6669faefb3e1013b3a15b63dfa1f9a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:44.181207 containerd[1566]: time="2026-03-02T12:54:44.181167550Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.3\" with image id \"sha256:6eaae458d5f115c04bbd6cd0facdbc393958d24af9934b90825fea68960a2f1a\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:e85ffa1d9468908b0bd44664de0d023da6669faefb3e1013b3a15b63dfa1f9a9\", size \"55607800\" in 7.685063148s"
Mar 2 12:54:44.182571 containerd[1566]: time="2026-03-02T12:54:44.181220226Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.3\" returns image reference \"sha256:6eaae458d5f115c04bbd6cd0facdbc393958d24af9934b90825fea68960a2f1a\""
Mar 2 12:54:44.304322 containerd[1566]: time="2026-03-02T12:54:44.302687107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\""
Mar 2 12:54:44.443315 containerd[1566]: time="2026-03-02T12:54:44.443237774Z" level=info msg="CreateContainer within sandbox \"517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 2 12:54:44.566543 containerd[1566]: time="2026-03-02T12:54:44.563851162Z" level=info msg="Container cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:54:44.626929 containerd[1566]: time="2026-03-02T12:54:44.626860970Z" level=info msg="CreateContainer within sandbox \"517c3e109da507f501f148b4b22ae972538df45a74e61c83d240f83dddc0bf49\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092\""
Mar 2 12:54:44.702582 containerd[1566]: time="2026-03-02T12:54:44.702229732Z" level=info msg="StartContainer for \"cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092\""
Mar 2 12:54:44.708174 containerd[1566]: time="2026-03-02T12:54:44.708129598Z" level=info msg="connecting to shim cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092" address="unix:///run/containerd/s/27cbd51966a8f10c41af4da386b6d18f1de0b984a59f276828a376a423280114" protocol=ttrpc version=3
Mar 2 12:54:44.826718 systemd[1]: Started cri-containerd-cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092.scope - libcontainer container cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092.
Mar 2 12:54:44.957685 containerd[1566]: time="2026-03-02T12:54:44.957545734Z" level=info msg="StartContainer for \"cddaf14b9045385ea414626272cc81f817685a357fb2b61ec536d98b9a2b4092\" returns successfully"
Mar 2 12:54:45.906066 kubelet[2906]: I0302 12:54:45.872500 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-9566f57b5-h9nvt" podStartSLOduration=48.756949402000004 podStartE2EDuration="1m4.85423407s" podCreationTimestamp="2026-03-02 12:53:41 +0000 UTC" firstStartedPulling="2026-03-02 12:54:28.200020569 +0000 UTC m=+68.623827614" lastFinishedPulling="2026-03-02 12:54:44.297305248 +0000 UTC m=+84.721112282" observedRunningTime="2026-03-02 12:54:45.851562265 +0000 UTC m=+86.275369324" watchObservedRunningTime="2026-03-02 12:54:45.85423407 +0000 UTC m=+86.278041116"
Mar 2 12:54:46.722130 containerd[1566]: time="2026-03-02T12:54:46.721969942Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:46.725752 containerd[1566]: time="2026-03-02T12:54:46.725064079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3: active requests=0, bytes read=14702266"
Mar 2 12:54:46.726844 containerd[1566]: time="2026-03-02T12:54:46.726410977Z" level=info msg="ImageCreate event name:\"sha256:a06d58cceef55662d827ba735c38dc374717b4fe7115379961a819e177ccc50d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:46.732182 containerd[1566]: time="2026-03-02T12:54:46.732129434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:2bdced3111efc84af5b77534155b084a55a3f839010807e7e83e75faefc8cf33\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 2 12:54:46.733486 containerd[1566]: time="2026-03-02T12:54:46.733113468Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\" with image id \"sha256:a06d58cceef55662d827ba735c38dc374717b4fe7115379961a819e177ccc50d\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:2bdced3111efc84af5b77534155b084a55a3f839010807e7e83e75faefc8cf33\", size \"16258263\" in 2.429120261s"
Mar 2 12:54:46.734052 containerd[1566]: time="2026-03-02T12:54:46.733628800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.3\" returns image reference \"sha256:a06d58cceef55662d827ba735c38dc374717b4fe7115379961a819e177ccc50d\""
Mar 2 12:54:46.745496 containerd[1566]: time="2026-03-02T12:54:46.745237297Z" level=info msg="CreateContainer within sandbox \"f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 2 12:54:46.759790 containerd[1566]: time="2026-03-02T12:54:46.759727543Z" level=info msg="Container 026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f: CDI devices from CRI Config.CDIDevices: []"
Mar 2 12:54:46.769480 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4077034.mount: Deactivated successfully.
Mar 2 12:54:46.775655 containerd[1566]: time="2026-03-02T12:54:46.775590998Z" level=info msg="CreateContainer within sandbox \"f9700f38f9b78b02e64bc16fcabafb742176480312885d06aa24ea2edee8ef45\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f\""
Mar 2 12:54:46.777895 containerd[1566]: time="2026-03-02T12:54:46.777250968Z" level=info msg="StartContainer for \"026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f\""
Mar 2 12:54:46.785657 containerd[1566]: time="2026-03-02T12:54:46.785610301Z" level=info msg="connecting to shim 026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f" address="unix:///run/containerd/s/aa13474d37ba1056157b9beb1b94740c0ed3900a4d57370cab155e930496f9f6" protocol=ttrpc version=3
Mar 2 12:54:46.830909 systemd[1]: Started cri-containerd-026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f.scope - libcontainer container 026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f.
Mar 2 12:54:46.968726 containerd[1566]: time="2026-03-02T12:54:46.968661822Z" level=info msg="StartContainer for \"026292ad36f2dbf7e962f8595afdca76c7682c7bbfc63ba3dec01141249e146f\" returns successfully"
Mar 2 12:54:47.341664 kubelet[2906]: I0302 12:54:47.339499 2906 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 2 12:54:47.346510 kubelet[2906]: I0302 12:54:47.346418 2906 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 2 12:54:47.821485 kubelet[2906]: I0302 12:54:47.820402 2906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-56k4g" podStartSLOduration=44.782673398 podStartE2EDuration="1m5.82037849s" podCreationTimestamp="2026-03-02 12:53:42 +0000 UTC" firstStartedPulling="2026-03-02 12:54:25.698934656 +0000 UTC m=+66.122741693" lastFinishedPulling="2026-03-02 12:54:46.736639746 +0000 UTC m=+87.160446785" observedRunningTime="2026-03-02 12:54:47.819653205 +0000 UTC m=+88.243460279" watchObservedRunningTime="2026-03-02 12:54:47.82037849 +0000 UTC m=+88.244185542"
Mar 2 12:54:49.128653 systemd[1]: Started sshd@11-10.243.74.166:22-68.220.241.50:34684.service - OpenSSH per-connection server daemon (68.220.241.50:34684).
Mar 2 12:54:49.789755 sshd[5480]: Accepted publickey for core from 68.220.241.50 port 34684 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:54:49.793396 sshd-session[5480]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:54:49.808763 systemd-logind[1549]: New session 12 of user core.
Mar 2 12:54:49.825075 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 2 12:54:50.854364 sshd[5483]: Connection closed by 68.220.241.50 port 34684
Mar 2 12:54:50.858151 sshd-session[5480]: pam_unix(sshd:session): session closed for user core
Mar 2 12:54:50.873135 systemd-logind[1549]: Session 12 logged out. Waiting for processes to exit.
Mar 2 12:54:50.878148 systemd[1]: sshd@11-10.243.74.166:22-68.220.241.50:34684.service: Deactivated successfully.
Mar 2 12:54:50.883481 systemd[1]: session-12.scope: Deactivated successfully.
Mar 2 12:54:50.888731 systemd-logind[1549]: Removed session 12.
Mar 2 12:54:55.966152 systemd[1]: Started sshd@12-10.243.74.166:22-68.220.241.50:54330.service - OpenSSH per-connection server daemon (68.220.241.50:54330).
Mar 2 12:54:56.540589 sshd[5524]: Accepted publickey for core from 68.220.241.50 port 54330 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:54:56.542826 sshd-session[5524]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:54:56.553020 systemd-logind[1549]: New session 13 of user core.
Mar 2 12:54:56.559673 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 2 12:54:57.128105 sshd[5529]: Connection closed by 68.220.241.50 port 54330
Mar 2 12:54:57.129386 sshd-session[5524]: pam_unix(sshd:session): session closed for user core
Mar 2 12:54:57.136910 systemd-logind[1549]: Session 13 logged out. Waiting for processes to exit.
Mar 2 12:54:57.137160 systemd[1]: sshd@12-10.243.74.166:22-68.220.241.50:54330.service: Deactivated successfully.
Mar 2 12:54:57.140304 systemd[1]: session-13.scope: Deactivated successfully.
Mar 2 12:54:57.142600 systemd-logind[1549]: Removed session 13.
Mar 2 12:55:02.227509 systemd[1]: Started sshd@13-10.243.74.166:22-68.220.241.50:60468.service - OpenSSH per-connection server daemon (68.220.241.50:60468).
Mar 2 12:55:02.760496 sshd[5552]: Accepted publickey for core from 68.220.241.50 port 60468 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:02.762154 sshd-session[5552]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:02.773627 systemd-logind[1549]: New session 14 of user core.
Mar 2 12:55:02.780710 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 2 12:55:03.163000 sshd[5555]: Connection closed by 68.220.241.50 port 60468
Mar 2 12:55:03.163580 sshd-session[5552]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:03.179731 systemd[1]: sshd@13-10.243.74.166:22-68.220.241.50:60468.service: Deactivated successfully.
Mar 2 12:55:03.183643 systemd[1]: session-14.scope: Deactivated successfully.
Mar 2 12:55:03.195817 systemd-logind[1549]: Session 14 logged out. Waiting for processes to exit.
Mar 2 12:55:03.199228 systemd-logind[1549]: Removed session 14.
Mar 2 12:55:07.475115 systemd[1]: Started sshd@14-10.243.74.166:22-109.123.253.26:50266.service - OpenSSH per-connection server daemon (109.123.253.26:50266).
Mar 2 12:55:07.695088 sshd[5605]: Received disconnect from 109.123.253.26 port 50266:11: Bye Bye [preauth]
Mar 2 12:55:07.695088 sshd[5605]: Disconnected from authenticating user root 109.123.253.26 port 50266 [preauth]
Mar 2 12:55:07.698988 systemd[1]: sshd@14-10.243.74.166:22-109.123.253.26:50266.service: Deactivated successfully.
Mar 2 12:55:08.279217 systemd[1]: Started sshd@15-10.243.74.166:22-68.220.241.50:60482.service - OpenSSH per-connection server daemon (68.220.241.50:60482).
Mar 2 12:55:08.800136 sshd[5617]: Accepted publickey for core from 68.220.241.50 port 60482 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:08.802408 sshd-session[5617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:08.810726 systemd-logind[1549]: New session 15 of user core.
Mar 2 12:55:08.820887 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 2 12:55:09.201998 sshd[5642]: Connection closed by 68.220.241.50 port 60482
Mar 2 12:55:09.202605 sshd-session[5617]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:09.209793 systemd[1]: sshd@15-10.243.74.166:22-68.220.241.50:60482.service: Deactivated successfully.
Mar 2 12:55:09.213356 systemd[1]: session-15.scope: Deactivated successfully.
Mar 2 12:55:09.215561 systemd-logind[1549]: Session 15 logged out. Waiting for processes to exit.
Mar 2 12:55:09.217612 systemd-logind[1549]: Removed session 15.
Mar 2 12:55:14.309442 systemd[1]: Started sshd@16-10.243.74.166:22-68.220.241.50:43162.service - OpenSSH per-connection server daemon (68.220.241.50:43162).
Mar 2 12:55:14.872917 sshd[5699]: Accepted publickey for core from 68.220.241.50 port 43162 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:14.878193 sshd-session[5699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:14.890406 systemd-logind[1549]: New session 16 of user core.
Mar 2 12:55:14.898832 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 2 12:55:15.316061 sshd[5702]: Connection closed by 68.220.241.50 port 43162
Mar 2 12:55:15.316212 sshd-session[5699]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:15.328911 systemd[1]: sshd@16-10.243.74.166:22-68.220.241.50:43162.service: Deactivated successfully.
Mar 2 12:55:15.332813 systemd[1]: session-16.scope: Deactivated successfully.
Mar 2 12:55:15.334780 systemd-logind[1549]: Session 16 logged out. Waiting for processes to exit.
Mar 2 12:55:15.337519 systemd-logind[1549]: Removed session 16.
Mar 2 12:55:15.423237 systemd[1]: Started sshd@17-10.243.74.166:22-68.220.241.50:43170.service - OpenSSH per-connection server daemon (68.220.241.50:43170).
Mar 2 12:55:16.004811 sshd[5715]: Accepted publickey for core from 68.220.241.50 port 43170 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:16.006819 sshd-session[5715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:16.013619 systemd-logind[1549]: New session 17 of user core.
Mar 2 12:55:16.023700 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 2 12:55:16.505070 sshd[5718]: Connection closed by 68.220.241.50 port 43170
Mar 2 12:55:16.506747 sshd-session[5715]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:16.514211 systemd[1]: sshd@17-10.243.74.166:22-68.220.241.50:43170.service: Deactivated successfully.
Mar 2 12:55:16.518399 systemd[1]: session-17.scope: Deactivated successfully.
Mar 2 12:55:16.521728 systemd-logind[1549]: Session 17 logged out. Waiting for processes to exit.
Mar 2 12:55:16.523547 systemd-logind[1549]: Removed session 17.
Mar 2 12:55:16.608443 systemd[1]: Started sshd@18-10.243.74.166:22-68.220.241.50:43172.service - OpenSSH per-connection server daemon (68.220.241.50:43172).
Mar 2 12:55:17.139625 sshd[5728]: Accepted publickey for core from 68.220.241.50 port 43172 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:17.141936 sshd-session[5728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:17.154721 systemd-logind[1549]: New session 18 of user core.
Mar 2 12:55:17.161707 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 2 12:55:17.573549 sshd[5754]: Connection closed by 68.220.241.50 port 43172
Mar 2 12:55:17.572567 sshd-session[5728]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:17.583783 systemd-logind[1549]: Session 18 logged out. Waiting for processes to exit.
Mar 2 12:55:17.586659 systemd[1]: sshd@18-10.243.74.166:22-68.220.241.50:43172.service: Deactivated successfully.
Mar 2 12:55:17.591055 systemd[1]: session-18.scope: Deactivated successfully.
Mar 2 12:55:17.594906 systemd-logind[1549]: Removed session 18.
Mar 2 12:55:22.686397 systemd[1]: Started sshd@19-10.243.74.166:22-68.220.241.50:37788.service - OpenSSH per-connection server daemon (68.220.241.50:37788).
Mar 2 12:55:23.283967 sshd[5767]: Accepted publickey for core from 68.220.241.50 port 37788 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:23.286629 sshd-session[5767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:23.295840 systemd-logind[1549]: New session 19 of user core.
Mar 2 12:55:23.308706 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 2 12:55:23.851210 sshd[5770]: Connection closed by 68.220.241.50 port 37788
Mar 2 12:55:23.850091 sshd-session[5767]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:23.856935 systemd[1]: sshd@19-10.243.74.166:22-68.220.241.50:37788.service: Deactivated successfully.
Mar 2 12:55:23.860428 systemd[1]: session-19.scope: Deactivated successfully.
Mar 2 12:55:23.863711 systemd-logind[1549]: Session 19 logged out. Waiting for processes to exit.
Mar 2 12:55:23.866157 systemd-logind[1549]: Removed session 19.
Mar 2 12:55:23.951396 systemd[1]: Started sshd@20-10.243.74.166:22-68.220.241.50:37804.service - OpenSSH per-connection server daemon (68.220.241.50:37804).
Mar 2 12:55:24.491234 sshd[5781]: Accepted publickey for core from 68.220.241.50 port 37804 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:24.493318 sshd-session[5781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:24.500356 systemd-logind[1549]: New session 20 of user core.
Mar 2 12:55:24.514817 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 2 12:55:25.338317 sshd[5784]: Connection closed by 68.220.241.50 port 37804
Mar 2 12:55:25.350614 sshd-session[5781]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:25.364863 systemd[1]: sshd@20-10.243.74.166:22-68.220.241.50:37804.service: Deactivated successfully.
Mar 2 12:55:25.368828 systemd[1]: session-20.scope: Deactivated successfully.
Mar 2 12:55:25.370756 systemd-logind[1549]: Session 20 logged out. Waiting for processes to exit.
Mar 2 12:55:25.374263 systemd-logind[1549]: Removed session 20.
Mar 2 12:55:25.443497 systemd[1]: Started sshd@21-10.243.74.166:22-68.220.241.50:37816.service - OpenSSH per-connection server daemon (68.220.241.50:37816).
Mar 2 12:55:25.991351 sshd[5794]: Accepted publickey for core from 68.220.241.50 port 37816 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:25.993552 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:26.000910 systemd-logind[1549]: New session 21 of user core.
Mar 2 12:55:26.009772 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 2 12:55:27.249352 sshd[5798]: Connection closed by 68.220.241.50 port 37816
Mar 2 12:55:27.250335 sshd-session[5794]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:27.260588 systemd-logind[1549]: Session 21 logged out. Waiting for processes to exit.
Mar 2 12:55:27.261667 systemd[1]: sshd@21-10.243.74.166:22-68.220.241.50:37816.service: Deactivated successfully.
Mar 2 12:55:27.268694 systemd[1]: session-21.scope: Deactivated successfully.
Mar 2 12:55:27.273899 systemd-logind[1549]: Removed session 21.
Mar 2 12:55:27.350772 systemd[1]: Started sshd@22-10.243.74.166:22-68.220.241.50:37828.service - OpenSSH per-connection server daemon (68.220.241.50:37828).
Mar 2 12:55:27.881237 sshd[5825]: Accepted publickey for core from 68.220.241.50 port 37828 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:27.883321 sshd-session[5825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:27.891340 systemd-logind[1549]: New session 22 of user core.
Mar 2 12:55:27.896760 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 2 12:55:28.773661 sshd[5828]: Connection closed by 68.220.241.50 port 37828
Mar 2 12:55:28.774739 sshd-session[5825]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:28.792627 systemd[1]: sshd@22-10.243.74.166:22-68.220.241.50:37828.service: Deactivated successfully.
Mar 2 12:55:28.796977 systemd[1]: session-22.scope: Deactivated successfully.
Mar 2 12:55:28.798381 systemd-logind[1549]: Session 22 logged out. Waiting for processes to exit.
Mar 2 12:55:28.802383 systemd-logind[1549]: Removed session 22.
Mar 2 12:55:28.894557 systemd[1]: Started sshd@23-10.243.74.166:22-68.220.241.50:37834.service - OpenSSH per-connection server daemon (68.220.241.50:37834).
Mar 2 12:55:29.423152 sshd[5838]: Accepted publickey for core from 68.220.241.50 port 37834 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:29.425444 sshd-session[5838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:29.433367 systemd-logind[1549]: New session 23 of user core.
Mar 2 12:55:29.441846 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 2 12:55:29.814026 sshd[5841]: Connection closed by 68.220.241.50 port 37834
Mar 2 12:55:29.815238 sshd-session[5838]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:29.821950 systemd[1]: sshd@23-10.243.74.166:22-68.220.241.50:37834.service: Deactivated successfully.
Mar 2 12:55:29.826284 systemd[1]: session-23.scope: Deactivated successfully.
Mar 2 12:55:29.827924 systemd-logind[1549]: Session 23 logged out. Waiting for processes to exit.
Mar 2 12:55:29.830296 systemd-logind[1549]: Removed session 23.
Mar 2 12:55:34.924884 systemd[1]: Started sshd@24-10.243.74.166:22-68.220.241.50:46424.service - OpenSSH per-connection server daemon (68.220.241.50:46424).
Mar 2 12:55:35.458540 sshd[5853]: Accepted publickey for core from 68.220.241.50 port 46424 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:35.460974 sshd-session[5853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:35.468601 systemd-logind[1549]: New session 24 of user core.
Mar 2 12:55:35.479735 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 2 12:55:35.942827 sshd[5856]: Connection closed by 68.220.241.50 port 46424
Mar 2 12:55:35.942671 sshd-session[5853]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:35.949885 systemd[1]: sshd@24-10.243.74.166:22-68.220.241.50:46424.service: Deactivated successfully.
Mar 2 12:55:35.953217 systemd[1]: session-24.scope: Deactivated successfully.
Mar 2 12:55:35.955664 systemd-logind[1549]: Session 24 logged out. Waiting for processes to exit.
Mar 2 12:55:35.959061 systemd-logind[1549]: Removed session 24.
Mar 2 12:55:41.049000 systemd[1]: Started sshd@25-10.243.74.166:22-68.220.241.50:46432.service - OpenSSH per-connection server daemon (68.220.241.50:46432).
Mar 2 12:55:41.668493 sshd[5901]: Accepted publickey for core from 68.220.241.50 port 46432 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:41.671395 sshd-session[5901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:41.681940 systemd-logind[1549]: New session 25 of user core.
Mar 2 12:55:41.689798 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 2 12:55:42.394231 sshd[5904]: Connection closed by 68.220.241.50 port 46432
Mar 2 12:55:42.398561 sshd-session[5901]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:42.408805 systemd[1]: sshd@25-10.243.74.166:22-68.220.241.50:46432.service: Deactivated successfully.
Mar 2 12:55:42.411787 systemd[1]: session-25.scope: Deactivated successfully.
Mar 2 12:55:42.414311 systemd-logind[1549]: Session 25 logged out. Waiting for processes to exit.
Mar 2 12:55:42.417506 systemd-logind[1549]: Removed session 25.
Mar 2 12:55:47.508739 systemd[1]: Started sshd@26-10.243.74.166:22-68.220.241.50:47068.service - OpenSSH per-connection server daemon (68.220.241.50:47068).
Mar 2 12:55:48.080913 sshd[5963]: Accepted publickey for core from 68.220.241.50 port 47068 ssh2: RSA SHA256:eJfPTcu5Pm24mvlygD7W7Kd1ohgQtGwIItOmwstcNsE
Mar 2 12:55:48.084361 sshd-session[5963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 2 12:55:48.097566 systemd-logind[1549]: New session 26 of user core.
Mar 2 12:55:48.102721 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 2 12:55:48.681514 sshd[5983]: Connection closed by 68.220.241.50 port 47068
Mar 2 12:55:48.682538 sshd-session[5963]: pam_unix(sshd:session): session closed for user core
Mar 2 12:55:48.688576 systemd[1]: sshd@26-10.243.74.166:22-68.220.241.50:47068.service: Deactivated successfully.
Mar 2 12:55:48.692281 systemd[1]: session-26.scope: Deactivated successfully.
Mar 2 12:55:48.693738 systemd-logind[1549]: Session 26 logged out. Waiting for processes to exit.
Mar 2 12:55:48.698514 systemd-logind[1549]: Removed session 26.
Mar 2 12:55:50.275199 systemd[1]: Started sshd@27-10.243.74.166:22-120.48.135.189:55546.service - OpenSSH per-connection server daemon (120.48.135.189:55546).
Mar 2 12:55:51.360696 sshd[6016]: Received disconnect from 120.48.135.189 port 55546:11: Bye Bye [preauth]
Mar 2 12:55:51.361381 sshd[6016]: Disconnected from authenticating user root 120.48.135.189 port 55546 [preauth]
Mar 2 12:55:51.365724 systemd[1]: sshd@27-10.243.74.166:22-120.48.135.189:55546.service: Deactivated successfully.