Dec 16 16:20:30.943722 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025
Dec 16 16:20:30.943769 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 16:20:30.943784 kernel: BIOS-provided physical RAM map:
Dec 16 16:20:30.943795 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 16 16:20:30.943810 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 16 16:20:30.943820 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 16 16:20:30.943832 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Dec 16 16:20:30.943843 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Dec 16 16:20:30.943854 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Dec 16 16:20:30.943865 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Dec 16 16:20:30.943876 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 16 16:20:30.943886 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 16 16:20:30.943897 kernel: NX (Execute Disable) protection: active
Dec 16 16:20:30.943912 kernel: APIC: Static calls initialized
Dec 16 16:20:30.943925 kernel: SMBIOS 2.8 present.
Dec 16 16:20:30.943937 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Dec 16 16:20:30.943949 kernel: DMI: Memory slots populated: 1/1
Dec 16 16:20:30.943960 kernel: Hypervisor detected: KVM
Dec 16 16:20:30.943972 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Dec 16 16:20:30.943988 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 16 16:20:30.944000 kernel: kvm-clock: using sched offset of 5822321145 cycles
Dec 16 16:20:30.944012 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 16 16:20:30.944024 kernel: tsc: Detected 2499.998 MHz processor
Dec 16 16:20:30.944036 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 16 16:20:30.944048 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 16 16:20:30.944060 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Dec 16 16:20:30.945003 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 16 16:20:30.945026 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 16 16:20:30.945046 kernel: Using GB pages for direct mapping
Dec 16 16:20:30.945059 kernel: ACPI: Early table checksum verification disabled
Dec 16 16:20:30.945070 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 16 16:20:30.945128 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945151 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945163 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945183 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Dec 16 16:20:30.945196 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945208 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945227 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945274 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 16:20:30.945293 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Dec 16 16:20:30.945332 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Dec 16 16:20:30.945345 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Dec 16 16:20:30.945357 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Dec 16 16:20:30.945374 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Dec 16 16:20:30.945386 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Dec 16 16:20:30.945399 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Dec 16 16:20:30.945411 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Dec 16 16:20:30.945423 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Dec 16 16:20:30.945435 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Dec 16 16:20:30.945448 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff]
Dec 16 16:20:30.945460 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff]
Dec 16 16:20:30.945477 kernel: Zone ranges:
Dec 16 16:20:30.945489 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 16 16:20:30.945502 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Dec 16 16:20:30.945514 kernel: Normal empty
Dec 16 16:20:30.945526 kernel: Device empty
Dec 16 16:20:30.945538 kernel: Movable zone start for each node
Dec 16 16:20:30.945550 kernel: Early memory node ranges
Dec 16 16:20:30.945562 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Dec 16 16:20:30.945574 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Dec 16 16:20:30.945590 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Dec 16 16:20:30.945603 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 16 16:20:30.945615 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 16 16:20:30.945627 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Dec 16 16:20:30.945640 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 16 16:20:30.945652 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 16 16:20:30.945667 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 16 16:20:30.945679 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 16 16:20:30.945691 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 16 16:20:30.945703 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 16 16:20:30.945720 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 16 16:20:30.945732 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 16 16:20:30.945744 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 16 16:20:30.945756 kernel: TSC deadline timer available
Dec 16 16:20:30.945769 kernel: CPU topo: Max. logical packages: 16
Dec 16 16:20:30.945781 kernel: CPU topo: Max. logical dies: 16
Dec 16 16:20:30.945793 kernel: CPU topo: Max. dies per package: 1
Dec 16 16:20:30.945805 kernel: CPU topo: Max. threads per core: 1
Dec 16 16:20:30.945817 kernel: CPU topo: Num. cores per package: 1
Dec 16 16:20:30.945833 kernel: CPU topo: Num. threads per package: 1
Dec 16 16:20:30.945846 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs
Dec 16 16:20:30.945858 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 16 16:20:30.945870 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Dec 16 16:20:30.945882 kernel: Booting paravirtualized kernel on KVM
Dec 16 16:20:30.945894 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 16 16:20:30.945907 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Dec 16 16:20:30.945919 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Dec 16 16:20:30.945931 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Dec 16 16:20:30.945948 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Dec 16 16:20:30.945960 kernel: kvm-guest: PV spinlocks enabled
Dec 16 16:20:30.945972 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Dec 16 16:20:30.945986 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 16:20:30.945999 kernel: random: crng init done
Dec 16 16:20:30.946011 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 16 16:20:30.946023 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 16 16:20:30.946035 kernel: Fallback order for Node 0: 0
Dec 16 16:20:30.946052 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154
Dec 16 16:20:30.946064 kernel: Policy zone: DMA32
Dec 16 16:20:30.946102 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 16:20:30.946115 kernel: software IO TLB: area num 16.
Dec 16 16:20:30.946134 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Dec 16 16:20:30.946148 kernel: Kernel/User page tables isolation: enabled
Dec 16 16:20:30.946160 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 16 16:20:30.946173 kernel: ftrace: allocated 157 pages with 5 groups
Dec 16 16:20:30.946185 kernel: Dynamic Preempt: voluntary
Dec 16 16:20:30.946203 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 16:20:30.946216 kernel: rcu: RCU event tracing is enabled.
Dec 16 16:20:30.946229 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Dec 16 16:20:30.946241 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 16:20:30.946254 kernel: Rude variant of Tasks RCU enabled.
Dec 16 16:20:30.946266 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 16:20:30.946278 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 16:20:30.946291 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Dec 16 16:20:30.946314 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Dec 16 16:20:30.946332 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Dec 16 16:20:30.946345 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Dec 16 16:20:30.946357 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Dec 16 16:20:30.946370 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 16:20:30.946393 kernel: Console: colour VGA+ 80x25
Dec 16 16:20:30.946410 kernel: printk: legacy console [tty0] enabled
Dec 16 16:20:30.946423 kernel: printk: legacy console [ttyS0] enabled
Dec 16 16:20:30.946435 kernel: ACPI: Core revision 20240827
Dec 16 16:20:30.946448 kernel: APIC: Switch to symmetric I/O mode setup
Dec 16 16:20:30.946461 kernel: x2apic enabled
Dec 16 16:20:30.946473 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 16 16:20:30.946487 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Dec 16 16:20:30.946504 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Dec 16 16:20:30.946517 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 16 16:20:30.946530 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Dec 16 16:20:30.946542 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Dec 16 16:20:30.946555 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 16 16:20:30.949001 kernel: Spectre V2 : Mitigation: Retpolines
Dec 16 16:20:30.949023 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 16 16:20:30.949036 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Dec 16 16:20:30.949049 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 16 16:20:30.949068 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 16 16:20:30.949097 kernel: MDS: Mitigation: Clear CPU buffers
Dec 16 16:20:30.949110 kernel: MMIO Stale Data: Unknown: No mitigations
Dec 16 16:20:30.949122 kernel: SRBDS: Unknown: Dependent on hypervisor status
Dec 16 16:20:30.949135 kernel: active return thunk: its_return_thunk
Dec 16 16:20:30.949148 kernel: ITS: Mitigation: Aligned branch/return thunks
Dec 16 16:20:30.949161 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 16 16:20:30.949181 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 16 16:20:30.949194 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 16 16:20:30.949207 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 16 16:20:30.949220 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 16 16:20:30.949233 kernel: Freeing SMP alternatives memory: 32K
Dec 16 16:20:30.949245 kernel: pid_max: default: 32768 minimum: 301
Dec 16 16:20:30.949258 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 16:20:30.949271 kernel: landlock: Up and running.
Dec 16 16:20:30.949283 kernel: SELinux: Initializing.
Dec 16 16:20:30.949308 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 16 16:20:30.949322 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 16 16:20:30.949335 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Dec 16 16:20:30.949354 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Dec 16 16:20:30.949367 kernel: signal: max sigframe size: 1776
Dec 16 16:20:30.949380 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 16:20:30.949394 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 16:20:30.949407 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Dec 16 16:20:30.949420 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Dec 16 16:20:30.949433 kernel: smp: Bringing up secondary CPUs ...
Dec 16 16:20:30.949446 kernel: smpboot: x86: Booting SMP configuration:
Dec 16 16:20:30.949459 kernel: .... node #0, CPUs: #1
Dec 16 16:20:30.949476 kernel: smp: Brought up 1 node, 2 CPUs
Dec 16 16:20:30.949489 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Dec 16 16:20:30.949502 kernel: Memory: 1887484K/2096616K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 203116K reserved, 0K cma-reserved)
Dec 16 16:20:30.949515 kernel: devtmpfs: initialized
Dec 16 16:20:30.949528 kernel: x86/mm: Memory block size: 128MB
Dec 16 16:20:30.949541 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 16:20:30.949554 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Dec 16 16:20:30.949567 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 16:20:30.949580 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 16:20:30.949597 kernel: audit: initializing netlink subsys (disabled)
Dec 16 16:20:30.949610 kernel: audit: type=2000 audit(1765902027.113:1): state=initialized audit_enabled=0 res=1
Dec 16 16:20:30.949623 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 16:20:30.949636 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 16 16:20:30.949648 kernel: cpuidle: using governor menu
Dec 16 16:20:30.949661 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 16:20:30.949674 kernel: dca service started, version 1.12.1
Dec 16 16:20:30.949687 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Dec 16 16:20:30.949700 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Dec 16 16:20:30.949717 kernel: PCI: Using configuration type 1 for base access
Dec 16 16:20:30.949730 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 16 16:20:30.949743 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 16:20:30.949759 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 16:20:30.949771 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 16:20:30.949784 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 16:20:30.949797 kernel: ACPI: Added _OSI(Module Device)
Dec 16 16:20:30.949810 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 16:20:30.949823 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 16:20:30.949840 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 16:20:30.949853 kernel: ACPI: Interpreter enabled
Dec 16 16:20:30.949865 kernel: ACPI: PM: (supports S0 S5)
Dec 16 16:20:30.949878 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 16 16:20:30.949894 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 16 16:20:30.949907 kernel: PCI: Using E820 reservations for host bridge windows
Dec 16 16:20:30.949919 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 16 16:20:30.949932 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 16:20:30.950646 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 16:20:30.950830 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 16 16:20:30.950996 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 16 16:20:30.951017 kernel: PCI host bridge to bus 0000:00
Dec 16 16:20:30.951220 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 16 16:20:30.951391 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 16 16:20:30.951541 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 16 16:20:30.951699 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Dec 16 16:20:30.951846 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 16 16:20:30.951994 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Dec 16 16:20:30.953969 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 16:20:30.954210 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 16 16:20:30.954435 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint
Dec 16 16:20:30.954615 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref]
Dec 16 16:20:30.954786 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff]
Dec 16 16:20:30.954950 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref]
Dec 16 16:20:30.955182 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 16 16:20:30.955386 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.955557 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff]
Dec 16 16:20:30.955723 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 16 16:20:30.955897 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Dec 16 16:20:30.956060 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 16:20:30.964202 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.964404 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff]
Dec 16 16:20:30.964572 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 16 16:20:30.964744 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 16 16:20:30.964910 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 16:20:30.965159 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.965351 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff]
Dec 16 16:20:30.965516 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 16 16:20:30.965680 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 16 16:20:30.965842 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 16:20:30.966016 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.966219 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff]
Dec 16 16:20:30.966407 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 16 16:20:30.966571 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 16 16:20:30.966735 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 16:20:30.966908 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.967089 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff]
Dec 16 16:20:30.967260 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 16 16:20:30.967437 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 16 16:20:30.967609 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 16:20:30.967788 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.967951 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff]
Dec 16 16:20:30.968132 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 16 16:20:30.968310 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 16 16:20:30.968477 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 16:20:30.968650 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.968822 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff]
Dec 16 16:20:30.968986 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 16 16:20:30.969165 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Dec 16 16:20:30.969344 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 16:20:30.969519 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 16:20:30.969684 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff]
Dec 16 16:20:30.969855 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 16 16:20:30.970017 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 16 16:20:30.972033 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 16:20:30.972247 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 16 16:20:30.972432 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df]
Dec 16 16:20:30.972606 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff]
Dec 16 16:20:30.972773 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Dec 16 16:20:30.972938 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref]
Dec 16 16:20:30.973137 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 16 16:20:30.973319 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f]
Dec 16 16:20:30.973487 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff]
Dec 16 16:20:30.973650 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfd004000-0xfd007fff 64bit pref]
Dec 16 16:20:30.973823 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 16 16:20:30.973988 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 16 16:20:30.974192 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 16 16:20:30.974380 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff]
Dec 16 16:20:30.974543 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff]
Dec 16 16:20:30.974715 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 16 16:20:30.974879 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Dec 16 16:20:30.975063 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Dec 16 16:20:30.975251 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit]
Dec 16 16:20:30.975444 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 16 16:20:30.975613 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 16 16:20:30.975780 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 16 16:20:30.975967 kernel: pci_bus 0000:02: extended config space not accessible
Dec 16 16:20:30.976228 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint
Dec 16 16:20:30.976423 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f]
Dec 16 16:20:30.976593 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 16 16:20:30.976808 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 16:20:30.976978 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit]
Dec 16 16:20:30.977161 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 16 16:20:30.977367 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 16:20:30.977539 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Dec 16 16:20:30.977704 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 16 16:20:30.977878 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 16 16:20:30.978042 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 16 16:20:30.978253 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 16 16:20:30.978433 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 16 16:20:30.978597 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 16 16:20:30.978618 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 16 16:20:30.978632 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 16 16:20:30.978653 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 16 16:20:30.978674 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 16 16:20:30.978687 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 16 16:20:30.978706 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 16 16:20:30.978719 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 16 16:20:30.978732 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 16 16:20:30.978745 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 16 16:20:30.978758 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 16 16:20:30.978771 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 16 16:20:30.978784 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 16 16:20:30.978802 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 16 16:20:30.978815 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 16 16:20:30.978828 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 16 16:20:30.978841 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 16 16:20:30.978854 kernel: iommu: Default domain type: Translated
Dec 16 16:20:30.978867 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 16 16:20:30.978880 kernel: PCI: Using ACPI for IRQ routing
Dec 16 16:20:30.978893 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 16 16:20:30.978906 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 16 16:20:30.978924 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Dec 16 16:20:30.979102 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 16 16:20:30.979267 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 16 16:20:30.979441 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 16 16:20:30.979462 kernel: vgaarb: loaded
Dec 16 16:20:30.979475 kernel: clocksource: Switched to clocksource kvm-clock
Dec 16 16:20:30.979488 kernel: VFS: Disk quotas dquot_6.6.0
Dec 16 16:20:30.979501 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 16 16:20:30.979522 kernel: pnp: PnP ACPI init
Dec 16 16:20:30.979701 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Dec 16 16:20:30.979723 kernel: pnp: PnP ACPI: found 5 devices
Dec 16 16:20:30.979736 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 16 16:20:30.979750 kernel: NET: Registered PF_INET protocol family
Dec 16 16:20:30.979763 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 16:20:30.979776 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Dec 16 16:20:30.979789 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 16 16:20:30.979809 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 16 16:20:30.979823 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Dec 16 16:20:30.979836 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Dec 16 16:20:30.979849 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 16 16:20:30.979862 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 16 16:20:30.979876 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 16 16:20:30.979889 kernel: NET: Registered PF_XDP protocol family
Dec 16 16:20:30.983103 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Dec 16 16:20:30.983279 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 16 16:20:30.983469 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 16 16:20:30.983634 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 16 16:20:30.983806 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 16 16:20:30.983971 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 16:20:30.984166 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 16:20:30.984346 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 16:20:30.984511 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 16:20:30.984674 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 16:20:30.984856 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 16:20:30.985020 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 16 16:20:30.985221 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 16 16:20:30.985401 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 16 16:20:30.985565 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 16 16:20:30.985728 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 16 16:20:30.985896 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 16 16:20:30.986125 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 16 16:20:30.986305 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 16 16:20:30.986475 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Dec 16 16:20:30.986660 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Dec 16 16:20:30.986837 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 16:20:30.987002 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 16 16:20:30.987974 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Dec 16 16:20:30.988177 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 16 16:20:30.988361 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 16:20:30.988526 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 16 16:20:30.988699 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Dec 16 16:20:30.988864 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 16 16:20:30.989036 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 16:20:30.989239 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 16 16:20:30.989419 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Dec 16 16:20:30.989583 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 16 16:20:30.989755 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 16:20:30.989918 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 16 16:20:30.990107 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Dec 16 16:20:30.990277 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 16 16:20:30.990456 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 16:20:30.990630 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 16 16:20:30.990794 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Dec 16 16:20:30.990958 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 16 16:20:30.991140 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 16:20:30.991319 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 16 16:20:30.991484 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Dec 16 16:20:30.991650 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Dec 16 16:20:30.991820 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 16:20:30.991985 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 16 16:20:30.992175 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Dec 16 16:20:30.992353 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 16 16:20:30.992517 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 16:20:30.992672 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Dec 16 16:20:30.992823 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Dec 16 16:20:30.992972 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 16 16:20:30.993152 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Dec 16 16:20:30.993315 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Dec 16 16:20:30.995176 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Dec 16 16:20:30.995390 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Dec 16 16:20:30.995555 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Dec 16 16:20:30.995731 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 16:20:30.996010 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Dec 16 16:20:30.996278 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Dec 16 16:20:30.996494 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Dec 16 16:20:30.996661 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 16:20:30.996827 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Dec 16 16:20:30.996983 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Dec 16 16:20:30.997154 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 16:20:30.997334 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Dec 16 16:20:30.997494 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Dec 16 16:20:30.997649 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 16:20:30.997828 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Dec 16 16:20:30.997984 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Dec 16 16:20:30.998753 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 16:20:30.998929 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Dec 16 16:20:30.999103 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Dec 16
16:20:30.999260 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Dec 16 16:20:30.999440 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff] Dec 16 16:20:30.999604 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Dec 16 16:20:30.999779 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Dec 16 16:20:30.999945 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff] Dec 16 16:20:31.000146 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Dec 16 16:20:31.000325 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Dec 16 16:20:31.000348 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Dec 16 16:20:31.000362 kernel: PCI: CLS 0 bytes, default 64 Dec 16 16:20:31.000384 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Dec 16 16:20:31.000398 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Dec 16 16:20:31.000412 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Dec 16 16:20:31.000426 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns Dec 16 16:20:31.000440 kernel: Initialise system trusted keyrings Dec 16 16:20:31.000454 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Dec 16 16:20:31.000468 kernel: Key type asymmetric registered Dec 16 16:20:31.000481 kernel: Asymmetric key parser 'x509' registered Dec 16 16:20:31.000499 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Dec 16 16:20:31.000517 kernel: io scheduler mq-deadline registered Dec 16 16:20:31.000531 kernel: io scheduler kyber registered Dec 16 16:20:31.000545 kernel: io scheduler bfq registered Dec 16 16:20:31.000711 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Dec 16 16:20:31.000880 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Dec 16 16:20:31.001046 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ 
PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.001253 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Dec 16 16:20:31.001442 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Dec 16 16:20:31.001607 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.001776 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Dec 16 16:20:31.001941 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Dec 16 16:20:31.002134 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.002313 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Dec 16 16:20:31.002489 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Dec 16 16:20:31.002655 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.002820 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Dec 16 16:20:31.002983 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Dec 16 16:20:31.003175 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.003356 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Dec 16 16:20:31.003530 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Dec 16 16:20:31.003697 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.003866 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Dec 16 16:20:31.004053 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Dec 16 16:20:31.004250 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- 
AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.004429 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Dec 16 16:20:31.004602 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Dec 16 16:20:31.004766 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 16:20:31.004787 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Dec 16 16:20:31.004802 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Dec 16 16:20:31.004816 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Dec 16 16:20:31.004830 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 16:20:31.004844 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Dec 16 16:20:31.004865 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Dec 16 16:20:31.004879 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Dec 16 16:20:31.004893 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Dec 16 16:20:31.005063 kernel: rtc_cmos 00:03: RTC can wake from S4 Dec 16 16:20:31.005103 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Dec 16 16:20:31.005259 kernel: rtc_cmos 00:03: registered as rtc0 Dec 16 16:20:31.005429 kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T16:20:30 UTC (1765902030) Dec 16 16:20:31.005585 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Dec 16 16:20:31.005612 kernel: intel_pstate: CPU model not supported Dec 16 16:20:31.005627 kernel: NET: Registered PF_INET6 protocol family Dec 16 16:20:31.005641 kernel: Segment Routing with IPv6 Dec 16 16:20:31.005654 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 16:20:31.005668 kernel: NET: Registered PF_PACKET protocol family Dec 16 16:20:31.005682 kernel: Key type dns_resolver registered Dec 16 16:20:31.005695 kernel: IPI shorthand broadcast: enabled Dec 16 16:20:31.005709 kernel: 
sched_clock: Marking stable (3487051719, 226919303)->(3838971612, -125000590) Dec 16 16:20:31.005723 kernel: registered taskstats version 1 Dec 16 16:20:31.005741 kernel: Loading compiled-in X.509 certificates Dec 16 16:20:31.005755 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d' Dec 16 16:20:31.005769 kernel: Demotion targets for Node 0: null Dec 16 16:20:31.005783 kernel: Key type .fscrypt registered Dec 16 16:20:31.005796 kernel: Key type fscrypt-provisioning registered Dec 16 16:20:31.005809 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 16:20:31.005823 kernel: ima: Allocated hash algorithm: sha1 Dec 16 16:20:31.005837 kernel: ima: No architecture policies found Dec 16 16:20:31.005850 kernel: clk: Disabling unused clocks Dec 16 16:20:31.005864 kernel: Warning: unable to open an initial console. Dec 16 16:20:31.005883 kernel: Freeing unused kernel image (initmem) memory: 46188K Dec 16 16:20:31.005897 kernel: Write protecting the kernel read-only data: 40960k Dec 16 16:20:31.005910 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Dec 16 16:20:31.005924 kernel: Run /init as init process Dec 16 16:20:31.005937 kernel: with arguments: Dec 16 16:20:31.005951 kernel: /init Dec 16 16:20:31.005964 kernel: with environment: Dec 16 16:20:31.005977 kernel: HOME=/ Dec 16 16:20:31.005995 kernel: TERM=linux Dec 16 16:20:31.006024 systemd[1]: Successfully made /usr/ read-only. Dec 16 16:20:31.006044 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 16:20:31.006060 systemd[1]: Detected virtualization kvm. 
Dec 16 16:20:31.006477 systemd[1]: Detected architecture x86-64. Dec 16 16:20:31.006500 systemd[1]: Running in initrd. Dec 16 16:20:31.006515 systemd[1]: No hostname configured, using default hostname. Dec 16 16:20:31.006530 systemd[1]: Hostname set to . Dec 16 16:20:31.006553 systemd[1]: Initializing machine ID from VM UUID. Dec 16 16:20:31.006568 systemd[1]: Queued start job for default target initrd.target. Dec 16 16:20:31.006583 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 16:20:31.006597 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 16:20:31.006613 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 16:20:31.006628 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 16:20:31.006643 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 16:20:31.006664 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 16:20:31.006680 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Dec 16 16:20:31.006695 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Dec 16 16:20:31.006710 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 16:20:31.006725 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 16:20:31.006740 systemd[1]: Reached target paths.target - Path Units. Dec 16 16:20:31.006754 systemd[1]: Reached target slices.target - Slice Units. Dec 16 16:20:31.006769 systemd[1]: Reached target swap.target - Swaps. Dec 16 16:20:31.006788 systemd[1]: Reached target timers.target - Timer Units. 
Dec 16 16:20:31.006804 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 16:20:31.006818 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 16:20:31.006833 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 16:20:31.006848 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 16:20:31.006862 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 16:20:31.006877 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 16:20:31.006892 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 16:20:31.006911 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 16:20:31.006925 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 16:20:31.006940 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 16:20:31.006955 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 16:20:31.006970 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 16:20:31.006985 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 16:20:31.006999 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 16:20:31.007014 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 16:20:31.007028 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 16:20:31.007115 systemd-journald[210]: Collecting audit messages is disabled. Dec 16 16:20:31.007151 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 16:20:31.007174 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 16:20:31.007189 systemd[1]: Finished systemd-fsck-usr.service. 
Dec 16 16:20:31.007204 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 16:20:31.007219 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 16:20:31.007234 systemd-journald[210]: Journal started Dec 16 16:20:31.007271 systemd-journald[210]: Runtime Journal (/run/log/journal/c471c646c4dc43a8bd7a06ba169fce5c) is 4.7M, max 37.8M, 33.1M free. Dec 16 16:20:30.943764 systemd-modules-load[212]: Inserted module 'overlay' Dec 16 16:20:31.080887 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 16:20:31.080924 kernel: Bridge firewalling registered Dec 16 16:20:31.012638 systemd-modules-load[212]: Inserted module 'br_netfilter' Dec 16 16:20:31.080117 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 16:20:31.081803 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 16:20:31.082956 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 16:20:31.087223 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 16:20:31.090226 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 16:20:31.100580 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 16:20:31.106242 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 16:20:31.111890 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 16:20:31.128154 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 16:20:31.131016 systemd-tmpfiles[229]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Dec 16 16:20:31.132505 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 16:20:31.135719 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 16:20:31.149349 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 16:20:31.152593 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 16:20:31.172024 dracut-cmdline[247]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022 Dec 16 16:20:31.213338 systemd-resolved[251]: Positive Trust Anchors: Dec 16 16:20:31.214439 systemd-resolved[251]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 16:20:31.214483 systemd-resolved[251]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 16:20:31.222652 systemd-resolved[251]: Defaulting to hostname 'linux'. Dec 16 16:20:31.225650 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 16:20:31.226782 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Dec 16 16:20:31.294241 kernel: SCSI subsystem initialized Dec 16 16:20:31.306137 kernel: Loading iSCSI transport class v2.0-870. Dec 16 16:20:31.320092 kernel: iscsi: registered transport (tcp) Dec 16 16:20:31.347417 kernel: iscsi: registered transport (qla4xxx) Dec 16 16:20:31.347473 kernel: QLogic iSCSI HBA Driver Dec 16 16:20:31.373676 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 16:20:31.402436 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 16:20:31.405875 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 16:20:31.468028 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 16:20:31.472274 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 16:20:31.542129 kernel: raid6: sse2x4 gen() 13448 MB/s Dec 16 16:20:31.560116 kernel: raid6: sse2x2 gen() 9467 MB/s Dec 16 16:20:31.578754 kernel: raid6: sse2x1 gen() 9606 MB/s Dec 16 16:20:31.578818 kernel: raid6: using algorithm sse2x4 gen() 13448 MB/s Dec 16 16:20:31.597796 kernel: raid6: .... xor() 7779 MB/s, rmw enabled Dec 16 16:20:31.597906 kernel: raid6: using ssse3x2 recovery algorithm Dec 16 16:20:31.623112 kernel: xor: automatically using best checksumming function avx Dec 16 16:20:31.818151 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 16:20:31.828708 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 16:20:31.832088 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 16:20:31.864755 systemd-udevd[460]: Using default interface naming scheme 'v255'. Dec 16 16:20:31.874620 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 16:20:31.878832 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Dec 16 16:20:31.905583 dracut-pre-trigger[470]: rd.md=0: removing MD RAID activation Dec 16 16:20:31.938913 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 16:20:31.942358 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 16:20:32.063070 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 16:20:32.069530 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 16:20:32.186149 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Dec 16 16:20:32.205106 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Dec 16 16:20:32.220102 kernel: cryptd: max_cpu_qlen set to 1000 Dec 16 16:20:32.233527 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 16:20:32.233603 kernel: GPT:17805311 != 125829119 Dec 16 16:20:32.233625 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 16:20:32.233643 kernel: GPT:17805311 != 125829119 Dec 16 16:20:32.233659 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 16:20:32.233676 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 16:20:32.247917 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Dec 16 16:20:32.253451 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 16:20:32.255973 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 16:20:32.257961 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 16:20:32.261939 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 16:20:32.266281 kernel: AES CTR mode by8 optimization enabled Dec 16 16:20:32.274401 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Dec 16 16:20:32.281177 kernel: ACPI: bus type USB registered Dec 16 16:20:32.281218 kernel: usbcore: registered new interface driver usbfs Dec 16 16:20:32.286111 kernel: usbcore: registered new interface driver hub Dec 16 16:20:32.287099 kernel: usbcore: registered new device driver usb Dec 16 16:20:32.310110 kernel: libata version 3.00 loaded. Dec 16 16:20:32.326114 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Dec 16 16:20:32.330128 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Dec 16 16:20:32.343122 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 16:20:32.353123 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Dec 16 16:20:32.353409 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Dec 16 16:20:32.353621 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Dec 16 16:20:32.356152 kernel: hub 1-0:1.0: USB hub found Dec 16 16:20:32.357119 kernel: hub 1-0:1.0: 4 ports detected Dec 16 16:20:32.365099 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 16 16:20:32.365372 kernel: hub 2-0:1.0: USB hub found Dec 16 16:20:32.365601 kernel: hub 2-0:1.0: 4 ports detected Dec 16 16:20:32.370752 kernel: ahci 0000:00:1f.2: version 3.0 Dec 16 16:20:32.371038 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Dec 16 16:20:32.371062 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Dec 16 16:20:32.372249 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Dec 16 16:20:32.372466 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Dec 16 16:20:32.399634 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. 
Dec 16 16:20:32.469890 kernel: scsi host0: ahci Dec 16 16:20:32.470194 kernel: scsi host1: ahci Dec 16 16:20:32.470417 kernel: scsi host2: ahci Dec 16 16:20:32.470615 kernel: scsi host3: ahci Dec 16 16:20:32.470804 kernel: scsi host4: ahci Dec 16 16:20:32.470997 kernel: scsi host5: ahci Dec 16 16:20:32.471419 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 lpm-pol 1 Dec 16 16:20:32.471441 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 lpm-pol 1 Dec 16 16:20:32.471468 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 lpm-pol 1 Dec 16 16:20:32.471487 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 lpm-pol 1 Dec 16 16:20:32.471504 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 lpm-pol 1 Dec 16 16:20:32.471522 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 lpm-pol 1 Dec 16 16:20:32.469404 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 16:20:32.492508 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 16 16:20:32.522510 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 16:20:32.533051 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 16 16:20:32.534047 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 16 16:20:32.537394 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 16:20:32.559365 disk-uuid[612]: Primary Header is updated. Dec 16 16:20:32.559365 disk-uuid[612]: Secondary Entries is updated. Dec 16 16:20:32.559365 disk-uuid[612]: Secondary Header is updated. 
Dec 16 16:20:32.564133 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 16:20:32.575134 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 16:20:32.603146 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 16:20:32.718112 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 16 16:20:32.719920 kernel: ata1: SATA link down (SStatus 0 SControl 300) Dec 16 16:20:32.721737 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 16 16:20:32.723544 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 16 16:20:32.724116 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 16 16:20:32.727107 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 16 16:20:32.753130 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 16:20:32.761430 kernel: usbcore: registered new interface driver usbhid Dec 16 16:20:32.761496 kernel: usbhid: USB HID core driver Dec 16 16:20:32.770097 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 16 16:20:32.774100 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Dec 16 16:20:32.789428 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 16:20:32.791584 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 16:20:32.792470 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 16:20:32.794181 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 16:20:32.797310 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 16:20:32.826299 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 16:20:33.577925 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 16:20:33.578877 disk-uuid[613]: The operation has completed successfully. 
Dec 16 16:20:33.638561 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 16:20:33.638755 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 16:20:33.687739 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 16 16:20:33.707913 sh[638]: Success Dec 16 16:20:33.733527 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 16:20:33.733617 kernel: device-mapper: uevent: version 1.0.3 Dec 16 16:20:33.735431 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 16:20:33.750120 kernel: device-mapper: verity: sha256 using shash "sha256-avx" Dec 16 16:20:33.803720 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 16 16:20:33.806985 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 16 16:20:33.825228 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Dec 16 16:20:33.839599 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (650) Dec 16 16:20:33.839648 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 16 16:20:33.842529 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 16 16:20:33.854061 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 16:20:33.854131 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 16:20:33.857282 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 16 16:20:33.858224 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 16:20:33.859470 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. 
Dec 16 16:20:33.860476 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 16:20:33.864355 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 16:20:33.897105 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (685) Dec 16 16:20:33.900445 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 16:20:33.900484 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 16 16:20:33.911174 kernel: BTRFS info (device vda6): turning on async discard Dec 16 16:20:33.911219 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 16:20:33.921761 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 16:20:33.920955 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 16:20:33.924690 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 16 16:20:34.003217 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 16:20:34.006810 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 16:20:34.061196 systemd-networkd[819]: lo: Link UP Dec 16 16:20:34.062472 systemd-networkd[819]: lo: Gained carrier Dec 16 16:20:34.065943 systemd-networkd[819]: Enumeration completed Dec 16 16:20:34.066144 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 16:20:34.066990 systemd-networkd[819]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 16:20:34.066996 systemd-networkd[819]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 16:20:34.069587 systemd[1]: Reached target network.target - Network. 
Dec 16 16:20:34.071234 systemd-networkd[819]: eth0: Link UP
Dec 16 16:20:34.072015 systemd-networkd[819]: eth0: Gained carrier
Dec 16 16:20:34.072030 systemd-networkd[819]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 16 16:20:34.129191 systemd-networkd[819]: eth0: DHCPv4 address 10.244.29.226/30, gateway 10.244.29.225 acquired from 10.244.29.225
Dec 16 16:20:34.144380 ignition[740]: Ignition 2.22.0
Dec 16 16:20:34.144402 ignition[740]: Stage: fetch-offline
Dec 16 16:20:34.144476 ignition[740]: no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:34.144493 ignition[740]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:34.144631 ignition[740]: parsed url from cmdline: ""
Dec 16 16:20:34.144639 ignition[740]: no config URL provided
Dec 16 16:20:34.144655 ignition[740]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 16:20:34.149713 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 16:20:34.144671 ignition[740]: no config at "/usr/lib/ignition/user.ign"
Dec 16 16:20:34.144685 ignition[740]: failed to fetch config: resource requires networking
Dec 16 16:20:34.153282 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 16 16:20:34.144917 ignition[740]: Ignition finished successfully
Dec 16 16:20:34.200670 ignition[830]: Ignition 2.22.0
Dec 16 16:20:34.200695 ignition[830]: Stage: fetch
Dec 16 16:20:34.200951 ignition[830]: no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:34.200971 ignition[830]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:34.201140 ignition[830]: parsed url from cmdline: ""
Dec 16 16:20:34.201148 ignition[830]: no config URL provided
Dec 16 16:20:34.201158 ignition[830]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 16:20:34.201174 ignition[830]: no config at "/usr/lib/ignition/user.ign"
Dec 16 16:20:34.201403 ignition[830]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Dec 16 16:20:34.201840 ignition[830]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Dec 16 16:20:34.201894 ignition[830]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Dec 16 16:20:34.218929 ignition[830]: GET result: OK
Dec 16 16:20:34.219125 ignition[830]: parsing config with SHA512: c117112fb60cc67fcff5bd8613735578b9bbf003f2faa43acee68ed59fdd4a4380615591442c86d8ddd6308d98e7cfb2d7451c3a77d9640a7901fc4b113d23f3
Dec 16 16:20:34.226004 unknown[830]: fetched base config from "system"
Dec 16 16:20:34.226028 unknown[830]: fetched base config from "system"
Dec 16 16:20:34.226038 unknown[830]: fetched user config from "openstack"
Dec 16 16:20:34.229806 ignition[830]: fetch: fetch complete
Dec 16 16:20:34.229825 ignition[830]: fetch: fetch passed
Dec 16 16:20:34.229903 ignition[830]: Ignition finished successfully
Dec 16 16:20:34.232781 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 16 16:20:34.235556 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 16 16:20:34.270954 ignition[837]: Ignition 2.22.0
Dec 16 16:20:34.270980 ignition[837]: Stage: kargs
Dec 16 16:20:34.271204 ignition[837]: no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:34.271233 ignition[837]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:34.272748 ignition[837]: kargs: kargs passed
Dec 16 16:20:34.274454 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 16 16:20:34.272820 ignition[837]: Ignition finished successfully
Dec 16 16:20:34.277952 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 16 16:20:34.319402 ignition[843]: Ignition 2.22.0
Dec 16 16:20:34.320196 ignition[843]: Stage: disks
Dec 16 16:20:34.320406 ignition[843]: no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:34.320425 ignition[843]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:34.324523 ignition[843]: disks: disks passed
Dec 16 16:20:34.325304 ignition[843]: Ignition finished successfully
Dec 16 16:20:34.327561 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 16 16:20:34.328946 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 16 16:20:34.329937 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 16 16:20:34.331600 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 16:20:34.333208 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 16:20:34.334588 systemd[1]: Reached target basic.target - Basic System.
Dec 16 16:20:34.337406 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 16 16:20:34.375437 systemd-fsck[851]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Dec 16 16:20:34.379484 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 16 16:20:34.382398 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 16 16:20:34.516112 kernel: EXT4-fs (vda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none.
Dec 16 16:20:34.517219 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 16 16:20:34.519309 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 16 16:20:34.522655 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 16:20:34.524452 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 16 16:20:34.526906 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Dec 16 16:20:34.528902 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Dec 16 16:20:34.532515 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 16 16:20:34.533890 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 16:20:34.541782 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 16 16:20:34.547277 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 16 16:20:34.556100 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (859)
Dec 16 16:20:34.562227 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 16:20:34.562289 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 16:20:34.569161 kernel: BTRFS info (device vda6): turning on async discard
Dec 16 16:20:34.569244 kernel: BTRFS info (device vda6): enabling free space tree
Dec 16 16:20:34.579598 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 16:20:34.625496 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:34.661616 initrd-setup-root[888]: cut: /sysroot/etc/passwd: No such file or directory
Dec 16 16:20:34.669560 initrd-setup-root[895]: cut: /sysroot/etc/group: No such file or directory
Dec 16 16:20:34.675040 initrd-setup-root[902]: cut: /sysroot/etc/shadow: No such file or directory
Dec 16 16:20:34.681226 initrd-setup-root[909]: cut: /sysroot/etc/gshadow: No such file or directory
Dec 16 16:20:34.790206 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 16 16:20:34.792491 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 16 16:20:34.794256 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 16 16:20:34.819115 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 16:20:34.836922 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 16 16:20:34.841429 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 16 16:20:34.860049 ignition[978]: INFO : Ignition 2.22.0
Dec 16 16:20:34.860049 ignition[978]: INFO : Stage: mount
Dec 16 16:20:34.862870 ignition[978]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:34.862870 ignition[978]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:34.862870 ignition[978]: INFO : mount: mount passed
Dec 16 16:20:34.862870 ignition[978]: INFO : Ignition finished successfully
Dec 16 16:20:34.865165 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 16 16:20:35.183386 systemd-networkd[819]: eth0: Gained IPv6LL
Dec 16 16:20:35.658098 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:36.692525 systemd-networkd[819]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:778:24:19ff:fef4:1de2/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:778:24:19ff:fef4:1de2/64 assigned by NDisc.
Dec 16 16:20:36.692542 systemd-networkd[819]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Dec 16 16:20:37.669099 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:41.678142 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:41.688466 coreos-metadata[861]: Dec 16 16:20:41.688 WARN failed to locate config-drive, using the metadata service API instead
Dec 16 16:20:41.711715 coreos-metadata[861]: Dec 16 16:20:41.711 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 16 16:20:41.724663 coreos-metadata[861]: Dec 16 16:20:41.724 INFO Fetch successful
Dec 16 16:20:41.725825 coreos-metadata[861]: Dec 16 16:20:41.725 INFO wrote hostname srv-bfhb9.gb1.brightbox.com to /sysroot/etc/hostname
Dec 16 16:20:41.728416 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Dec 16 16:20:41.728589 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Dec 16 16:20:41.732591 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 16 16:20:41.764407 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 16:20:41.794150 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (994)
Dec 16 16:20:41.798868 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 16:20:41.798908 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 16:20:41.804865 kernel: BTRFS info (device vda6): turning on async discard
Dec 16 16:20:41.804913 kernel: BTRFS info (device vda6): enabling free space tree
Dec 16 16:20:41.808095 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 16:20:41.847206 ignition[1012]: INFO : Ignition 2.22.0
Dec 16 16:20:41.847206 ignition[1012]: INFO : Stage: files
Dec 16 16:20:41.849272 ignition[1012]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:41.849272 ignition[1012]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:41.849272 ignition[1012]: DEBUG : files: compiled without relabeling support, skipping
Dec 16 16:20:41.852017 ignition[1012]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 16 16:20:41.852017 ignition[1012]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 16 16:20:41.859965 ignition[1012]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 16 16:20:41.861188 ignition[1012]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 16 16:20:41.861188 ignition[1012]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 16 16:20:41.861029 unknown[1012]: wrote ssh authorized keys file for user: core
Dec 16 16:20:41.864487 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Dec 16 16:20:41.864487 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Dec 16 16:20:42.066472 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 16 16:20:42.311440 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 16:20:42.313283 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 16:20:42.322411 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 16:20:42.322411 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 16:20:42.322411 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 16:20:42.322411 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 16:20:42.322411 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 16:20:42.322411 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Dec 16 16:20:42.611293 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 16 16:20:43.904067 ignition[1012]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Dec 16 16:20:43.904067 ignition[1012]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 16 16:20:43.907425 ignition[1012]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 16:20:43.908948 ignition[1012]: INFO : files: files passed
Dec 16 16:20:43.908948 ignition[1012]: INFO : Ignition finished successfully
Dec 16 16:20:43.911743 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 16 16:20:43.916332 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 16 16:20:43.920898 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 16 16:20:43.947179 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 16 16:20:43.947902 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 16 16:20:43.958586 initrd-setup-root-after-ignition[1046]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 16:20:43.960464 initrd-setup-root-after-ignition[1042]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 16:20:43.960464 initrd-setup-root-after-ignition[1042]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 16:20:43.962383 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 16:20:43.964156 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 16 16:20:43.966869 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 16 16:20:44.043448 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 16 16:20:44.043670 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 16 16:20:44.046028 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 16 16:20:44.047037 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 16 16:20:44.048721 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 16 16:20:44.050271 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 16 16:20:44.094531 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 16:20:44.097725 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 16 16:20:44.126112 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 16 16:20:44.128213 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 16:20:44.129947 systemd[1]: Stopped target timers.target - Timer Units.
Dec 16 16:20:44.131364 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 16 16:20:44.131558 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 16:20:44.133919 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 16 16:20:44.134899 systemd[1]: Stopped target basic.target - Basic System.
Dec 16 16:20:44.136289 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 16 16:20:44.137661 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 16:20:44.139360 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 16 16:20:44.140931 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 16:20:44.142709 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 16 16:20:44.144220 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 16:20:44.145900 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 16 16:20:44.147416 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 16 16:20:44.148994 systemd[1]: Stopped target swap.target - Swaps.
Dec 16 16:20:44.150201 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 16 16:20:44.150390 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 16:20:44.152328 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 16 16:20:44.153421 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 16:20:44.154981 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 16 16:20:44.155442 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 16:20:44.156666 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 16 16:20:44.156897 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 16 16:20:44.158722 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 16 16:20:44.158915 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 16:20:44.160963 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 16 16:20:44.161308 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 16 16:20:44.174316 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 16 16:20:44.177178 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 16 16:20:44.178848 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 16 16:20:44.180292 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 16:20:44.183302 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 16 16:20:44.183481 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 16:20:44.193014 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 16 16:20:44.193296 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 16 16:20:44.225585 ignition[1066]: INFO : Ignition 2.22.0
Dec 16 16:20:44.226000 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 16 16:20:44.227638 ignition[1066]: INFO : Stage: umount
Dec 16 16:20:44.228717 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 16:20:44.230430 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 16:20:44.232946 ignition[1066]: INFO : umount: umount passed
Dec 16 16:20:44.234575 ignition[1066]: INFO : Ignition finished successfully
Dec 16 16:20:44.235772 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 16 16:20:44.236764 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 16 16:20:44.238605 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 16 16:20:44.238692 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 16 16:20:44.239924 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 16 16:20:44.240009 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 16 16:20:44.241459 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 16 16:20:44.241536 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 16 16:20:44.242886 systemd[1]: Stopped target network.target - Network.
Dec 16 16:20:44.244236 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 16 16:20:44.244320 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 16:20:44.245756 systemd[1]: Stopped target paths.target - Path Units.
Dec 16 16:20:44.247221 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 16 16:20:44.251172 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 16:20:44.252135 systemd[1]: Stopped target slices.target - Slice Units.
Dec 16 16:20:44.253724 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 16 16:20:44.255549 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 16 16:20:44.255629 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 16:20:44.256926 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 16 16:20:44.256991 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 16:20:44.258299 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 16 16:20:44.258379 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 16 16:20:44.259636 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 16 16:20:44.259711 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 16 16:20:44.261249 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 16 16:20:44.264112 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 16 16:20:44.267370 systemd-networkd[819]: eth0: DHCPv6 lease lost
Dec 16 16:20:44.274539 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 16 16:20:44.274780 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 16 16:20:44.279696 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Dec 16 16:20:44.281034 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 16 16:20:44.281253 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 16 16:20:44.284544 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Dec 16 16:20:44.285421 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 16 16:20:44.286472 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 16 16:20:44.286551 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 16:20:44.292124 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 16 16:20:44.292857 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 16 16:20:44.292934 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 16:20:44.295502 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 16 16:20:44.295572 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 16 16:20:44.298257 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 16 16:20:44.298399 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 16 16:20:44.300494 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 16 16:20:44.300571 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 16:20:44.303246 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 16:20:44.305679 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 16 16:20:44.305775 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 16 16:20:44.319109 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 16 16:20:44.320792 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 16:20:44.322264 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 16 16:20:44.322336 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 16 16:20:44.323899 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 16 16:20:44.323961 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 16:20:44.325564 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 16 16:20:44.325637 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 16:20:44.327817 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 16 16:20:44.327892 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 16 16:20:44.331735 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 16 16:20:44.331816 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 16:20:44.335246 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 16 16:20:44.336278 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 16 16:20:44.336357 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 16:20:44.341243 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 16 16:20:44.341331 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 16:20:44.343268 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 16 16:20:44.343342 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 16:20:44.346595 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 16 16:20:44.346674 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 16:20:44.348095 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 16:20:44.348176 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 16:20:44.353509 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Dec 16 16:20:44.353595 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Dec 16 16:20:44.353665 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 16 16:20:44.353740 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Dec 16 16:20:44.354276 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 16 16:20:44.358292 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 16 16:20:44.364974 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 16 16:20:44.365166 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 16 16:20:44.371581 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 16 16:20:44.371750 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 16 16:20:44.373463 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 16 16:20:44.374498 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 16 16:20:44.374604 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 16 16:20:44.377327 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 16 16:20:44.401171 systemd[1]: Switching root.
Dec 16 16:20:44.438593 systemd-journald[210]: Journal stopped
Dec 16 16:20:46.201284 systemd-journald[210]: Received SIGTERM from PID 1 (systemd).
Dec 16 16:20:46.201441 kernel: SELinux: policy capability network_peer_controls=1
Dec 16 16:20:46.201486 kernel: SELinux: policy capability open_perms=1
Dec 16 16:20:46.201523 kernel: SELinux: policy capability extended_socket_class=1
Dec 16 16:20:46.201562 kernel: SELinux: policy capability always_check_network=0
Dec 16 16:20:46.201592 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 16 16:20:46.201621 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 16 16:20:46.201649 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 16 16:20:46.201670 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 16 16:20:46.201688 kernel: SELinux: policy capability userspace_initial_context=0
Dec 16 16:20:46.201713 kernel: audit: type=1403 audit(1765902044.741:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 16 16:20:46.201757 systemd[1]: Successfully loaded SELinux policy in 81.404ms.
Dec 16 16:20:46.201823 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 14.381ms.
Dec 16 16:20:46.201847 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 16:20:46.201869 systemd[1]: Detected virtualization kvm.
Dec 16 16:20:46.201889 systemd[1]: Detected architecture x86-64.
Dec 16 16:20:46.201919 systemd[1]: Detected first boot.
Dec 16 16:20:46.201948 systemd[1]: Hostname set to .
Dec 16 16:20:46.201986 systemd[1]: Initializing machine ID from VM UUID.
Dec 16 16:20:46.202014 zram_generator::config[1110]: No configuration found.
Dec 16 16:20:46.202067 kernel: Guest personality initialized and is inactive
Dec 16 16:20:46.204126 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Dec 16 16:20:46.204160 kernel: Initialized host personality
Dec 16 16:20:46.204181 kernel: NET: Registered PF_VSOCK protocol family
Dec 16 16:20:46.204207 systemd[1]: Populated /etc with preset unit settings.
Dec 16 16:20:46.204241 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Dec 16 16:20:46.204272 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 16 16:20:46.204307 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 16 16:20:46.204342 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 16 16:20:46.204370 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 16 16:20:46.204392 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 16 16:20:46.204413 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 16 16:20:46.204440 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 16 16:20:46.204468 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 16 16:20:46.204509 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 16 16:20:46.204546 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 16 16:20:46.204568 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 16 16:20:46.204588 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 16:20:46.204609 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 16:20:46.204636 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 16 16:20:46.204658 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 16 16:20:46.204703 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 16 16:20:46.204726 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 16:20:46.204747 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Dec 16 16:20:46.204769 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 16:20:46.204798 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 16:20:46.204825 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 16 16:20:46.204854 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 16 16:20:46.204883 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 16 16:20:46.204920 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 16:20:46.204948 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 16:20:46.204977 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 16:20:46.204998 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 16:20:46.205019 systemd[1]: Reached target swap.target - Swaps.
Dec 16 16:20:46.205055 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 16 16:20:46.205097 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 16 16:20:46.205120 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 16 16:20:46.205141 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 16:20:46.205176 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 16:20:46.205199 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 16:20:46.205219 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 16 16:20:46.205249 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 16 16:20:46.205271 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 16 16:20:46.205298 systemd[1]: Mounting media.mount - External Media Directory...
Dec 16 16:20:46.205319 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:46.205339 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 16 16:20:46.205365 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 16 16:20:46.205400 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 16 16:20:46.205428 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 16 16:20:46.205457 systemd[1]: Reached target machines.target - Containers.
Dec 16 16:20:46.205484 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 16 16:20:46.205506 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 16:20:46.205528 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 16:20:46.205554 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 16 16:20:46.205575 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 16:20:46.205607 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 16:20:46.205635 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 16:20:46.205666 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 16 16:20:46.205688 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 16:20:46.205716 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 16 16:20:46.205744 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 16 16:20:46.205765 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 16 16:20:46.205785 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 16 16:20:46.205811 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 16 16:20:46.205847 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 16:20:46.205875 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 16:20:46.205917 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 16:20:46.205945 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 16:20:46.205974 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 16 16:20:46.206008 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 16 16:20:46.206052 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 16:20:46.208113 systemd[1]: verity-setup.service: Deactivated successfully.
Dec 16 16:20:46.208142 systemd[1]: Stopped verity-setup.service.
Dec 16 16:20:46.208181 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:46.208204 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 16 16:20:46.208232 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 16 16:20:46.208260 systemd[1]: Mounted media.mount - External Media Directory.
Dec 16 16:20:46.208282 kernel: loop: module loaded
Dec 16 16:20:46.208309 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 16 16:20:46.208336 kernel: fuse: init (API version 7.41)
Dec 16 16:20:46.208363 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 16 16:20:46.208396 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 16 16:20:46.208433 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 16:20:46.208455 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 16 16:20:46.208482 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 16 16:20:46.208505 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 16 16:20:46.208535 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 16:20:46.208557 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 16:20:46.208577 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 16:20:46.208597 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 16:20:46.208633 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 16 16:20:46.208655 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 16 16:20:46.208685 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 16:20:46.208705 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 16:20:46.208732 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 16:20:46.208754 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 16 16:20:46.208774 kernel: ACPI: bus type drm_connector registered
Dec 16 16:20:46.208794 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 16 16:20:46.208827 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 16 16:20:46.208855 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 16 16:20:46.208889 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 16:20:46.208910 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 16 16:20:46.208931 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 16 16:20:46.209014 systemd-journald[1204]: Collecting audit messages is disabled.
Dec 16 16:20:46.209104 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 16:20:46.209130 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 16 16:20:46.209152 systemd-journald[1204]: Journal started
Dec 16 16:20:46.209199 systemd-journald[1204]: Runtime Journal (/run/log/journal/c471c646c4dc43a8bd7a06ba169fce5c) is 4.7M, max 37.8M, 33.1M free.
Dec 16 16:20:46.209264 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 16:20:45.684825 systemd[1]: Queued start job for default target multi-user.target.
Dec 16 16:20:45.699411 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Dec 16 16:20:45.700398 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 16 16:20:46.219112 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 16 16:20:46.225109 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 16:20:46.233102 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 16:20:46.240126 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 16 16:20:46.250105 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 16:20:46.254108 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 16:20:46.257911 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 16:20:46.259339 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 16:20:46.261763 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 16:20:46.263018 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 16 16:20:46.265416 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 16 16:20:46.266376 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 16 16:20:46.298197 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 16:20:46.312637 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 16 16:20:46.337674 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 16 16:20:46.339940 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 16 16:20:46.348400 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 16 16:20:46.361833 systemd-journald[1204]: Time spent on flushing to /var/log/journal/c471c646c4dc43a8bd7a06ba169fce5c is 59.486ms for 1169 entries.
Dec 16 16:20:46.361833 systemd-journald[1204]: System Journal (/var/log/journal/c471c646c4dc43a8bd7a06ba169fce5c) is 8M, max 584.8M, 576.8M free.
Dec 16 16:20:46.437584 systemd-journald[1204]: Received client request to flush runtime journal.
Dec 16 16:20:46.437661 kernel: loop0: detected capacity change from 0 to 110984
Dec 16 16:20:46.444143 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 16 16:20:46.380133 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 16:20:46.434908 systemd-tmpfiles[1230]: ACLs are not supported, ignoring.
Dec 16 16:20:46.434931 systemd-tmpfiles[1230]: ACLs are not supported, ignoring.
Dec 16 16:20:46.447898 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 16 16:20:46.465971 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 16 16:20:46.484776 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 16:20:46.488385 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 16:20:46.497115 kernel: loop1: detected capacity change from 0 to 128560
Dec 16 16:20:46.498328 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 16 16:20:46.550126 kernel: loop2: detected capacity change from 0 to 8
Dec 16 16:20:46.554362 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 16 16:20:46.562272 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 16:20:46.594347 kernel: loop3: detected capacity change from 0 to 224512
Dec 16 16:20:46.589461 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Dec 16 16:20:46.589482 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Dec 16 16:20:46.595269 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 16:20:46.644632 kernel: loop4: detected capacity change from 0 to 110984
Dec 16 16:20:46.676108 kernel: loop5: detected capacity change from 0 to 128560
Dec 16 16:20:46.692315 kernel: loop6: detected capacity change from 0 to 8
Dec 16 16:20:46.695102 kernel: loop7: detected capacity change from 0 to 224512
Dec 16 16:20:46.703603 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 16 16:20:46.709391 (sd-merge)[1275]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Dec 16 16:20:46.710222 (sd-merge)[1275]: Merged extensions into '/usr'.
Dec 16 16:20:46.721204 systemd[1]: Reload requested from client PID 1229 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 16 16:20:46.721244 systemd[1]: Reloading...
Dec 16 16:20:46.918833 zram_generator::config[1302]: No configuration found.
Dec 16 16:20:47.104155 ldconfig[1225]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 16 16:20:47.262242 systemd[1]: Reloading finished in 540 ms.
Dec 16 16:20:47.280352 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 16 16:20:47.281761 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 16 16:20:47.293296 systemd[1]: Starting ensure-sysext.service...
Dec 16 16:20:47.298288 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 16:20:47.340260 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 16 16:20:47.340851 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 16 16:20:47.343466 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 16 16:20:47.343868 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 16 16:20:47.347340 systemd-tmpfiles[1358]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 16 16:20:47.347845 systemd-tmpfiles[1358]: ACLs are not supported, ignoring.
Dec 16 16:20:47.348106 systemd-tmpfiles[1358]: ACLs are not supported, ignoring.
Dec 16 16:20:47.348112 systemd[1]: Reload requested from client PID 1357 ('systemctl') (unit ensure-sysext.service)...
Dec 16 16:20:47.348131 systemd[1]: Reloading...
Dec 16 16:20:47.360497 systemd-tmpfiles[1358]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 16:20:47.362122 systemd-tmpfiles[1358]: Skipping /boot
Dec 16 16:20:47.383550 systemd-tmpfiles[1358]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 16:20:47.383997 systemd-tmpfiles[1358]: Skipping /boot
Dec 16 16:20:47.430106 zram_generator::config[1385]: No configuration found.
Dec 16 16:20:47.709564 systemd[1]: Reloading finished in 360 ms.
Dec 16 16:20:47.733532 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 16 16:20:47.746863 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 16:20:47.757233 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 16 16:20:47.763278 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 16 16:20:47.771572 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 16 16:20:47.776396 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 16:20:47.782259 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 16:20:47.787492 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 16 16:20:47.793431 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:47.793717 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 16:20:47.797068 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 16:20:47.806631 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 16:20:47.813023 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 16:20:47.814246 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 16:20:47.814413 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 16:20:47.814578 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:47.824528 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 16 16:20:47.827403 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:47.827683 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 16:20:47.827914 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 16:20:47.828058 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 16:20:47.828215 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:47.833033 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:47.833407 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 16:20:47.835025 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 16:20:47.836322 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 16:20:47.836476 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 16:20:47.836665 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 16:20:47.850108 systemd[1]: Finished ensure-sysext.service.
Dec 16 16:20:47.868586 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Dec 16 16:20:47.871596 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 16:20:47.871916 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 16:20:47.873403 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 16:20:47.877330 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 16:20:47.878971 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 16:20:47.879320 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 16:20:47.886847 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 16 16:20:47.888954 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 16:20:47.889300 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 16:20:47.892050 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 16:20:47.892250 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 16:20:47.919529 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 16 16:20:47.925380 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 16 16:20:47.946741 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 16 16:20:47.948311 augenrules[1484]: No rules
Dec 16 16:20:47.948857 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 16 16:20:47.950589 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 16 16:20:47.950884 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 16 16:20:47.954738 systemd-udevd[1448]: Using default interface naming scheme 'v255'.
Dec 16 16:20:47.965140 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 16 16:20:47.978930 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 16 16:20:47.999915 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 16:20:48.007948 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 16:20:48.244398 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Dec 16 16:20:48.289011 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Dec 16 16:20:48.290470 systemd[1]: Reached target time-set.target - System Time Set.
Dec 16 16:20:48.312964 systemd-resolved[1446]: Positive Trust Anchors:
Dec 16 16:20:48.312988 systemd-resolved[1446]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 16:20:48.313045 systemd-resolved[1446]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 16:20:48.322257 systemd-networkd[1505]: lo: Link UP
Dec 16 16:20:48.322272 systemd-networkd[1505]: lo: Gained carrier
Dec 16 16:20:48.324312 systemd-networkd[1505]: Enumeration completed
Dec 16 16:20:48.324448 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 16:20:48.326297 systemd-resolved[1446]: Using system hostname 'srv-bfhb9.gb1.brightbox.com'.
Dec 16 16:20:48.329364 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 16 16:20:48.335294 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 16 16:20:48.336300 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 16:20:48.343396 systemd[1]: Reached target network.target - Network.
Dec 16 16:20:48.344155 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 16:20:48.345397 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 16:20:48.347276 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 16 16:20:48.348124 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 16 16:20:48.350212 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Dec 16 16:20:48.351232 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 16 16:20:48.352297 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 16 16:20:48.355191 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 16 16:20:48.356049 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 16 16:20:48.356135 systemd[1]: Reached target paths.target - Path Units.
Dec 16 16:20:48.356772 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 16:20:48.360355 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 16 16:20:48.364775 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 16 16:20:48.372390 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 16 16:20:48.373602 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 16 16:20:48.375028 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 16 16:20:48.383188 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 16 16:20:48.386028 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 16 16:20:48.390169 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 16 16:20:48.391932 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 16 16:20:48.394665 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 16:20:48.395452 systemd[1]: Reached target basic.target - Basic System.
Dec 16 16:20:48.396241 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 16 16:20:48.396300 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 16 16:20:48.397956 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 16 16:20:48.404273 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 16 16:20:48.408920 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 16 16:20:48.415279 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 16 16:20:48.421304 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 16 16:20:48.427783 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 16 16:20:48.428547 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 16 16:20:48.432864 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Dec 16 16:20:48.437877 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 16 16:20:48.446409 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 16 16:20:48.452121 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:48.453966 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 16 16:20:48.464652 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 16 16:20:48.471108 jq[1540]: false
Dec 16 16:20:48.473943 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 16 16:20:48.475993 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 16 16:20:48.477742 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 16 16:20:48.481363 systemd[1]: Starting update-engine.service - Update Engine...
Dec 16 16:20:48.488339 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 16 16:20:48.490753 oslogin_cache_refresh[1542]: Refreshing passwd entry cache
Dec 16 16:20:48.492665 google_oslogin_nss_cache[1542]: oslogin_cache_refresh[1542]: Refreshing passwd entry cache
Dec 16 16:20:48.500089 google_oslogin_nss_cache[1542]: oslogin_cache_refresh[1542]: Failure getting users, quitting
Dec 16 16:20:48.496945 oslogin_cache_refresh[1542]: Failure getting users, quitting
Dec 16 16:20:48.500148 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 16 16:20:48.501827 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 16 16:20:48.504154 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 16 16:20:48.515503 google_oslogin_nss_cache[1542]: oslogin_cache_refresh[1542]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Dec 16 16:20:48.515503 google_oslogin_nss_cache[1542]: oslogin_cache_refresh[1542]: Refreshing group entry cache
Dec 16 16:20:48.515503 google_oslogin_nss_cache[1542]: oslogin_cache_refresh[1542]: Failure getting groups, quitting
Dec 16 16:20:48.515503 google_oslogin_nss_cache[1542]: oslogin_cache_refresh[1542]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Dec 16 16:20:48.514592 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Dec 16 16:20:48.511144 oslogin_cache_refresh[1542]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 16:20:48.515016 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 16 16:20:48.511234 oslogin_cache_refresh[1542]: Refreshing group entry cache Dec 16 16:20:48.512166 oslogin_cache_refresh[1542]: Failure getting groups, quitting Dec 16 16:20:48.512183 oslogin_cache_refresh[1542]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 16:20:48.535365 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 16:20:48.546368 extend-filesystems[1541]: Found /dev/vda6 Dec 16 16:20:48.545169 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 16:20:48.566363 extend-filesystems[1541]: Found /dev/vda9 Dec 16 16:20:48.574537 update_engine[1550]: I20251216 16:20:48.571779 1550 main.cc:92] Flatcar Update Engine starting Dec 16 16:20:48.574914 jq[1551]: true Dec 16 16:20:48.585330 extend-filesystems[1541]: Checking size of /dev/vda9 Dec 16 16:20:48.593099 (ntainerd)[1574]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 16 16:20:48.606866 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 16:20:48.612277 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 16:20:48.614944 tar[1561]: linux-amd64/LICENSE Dec 16 16:20:48.618655 tar[1561]: linux-amd64/helm Dec 16 16:20:48.620850 jq[1576]: true Dec 16 16:20:48.644495 dbus-daemon[1538]: [system] SELinux support is enabled Dec 16 16:20:48.644832 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Dec 16 16:20:48.652200 extend-filesystems[1541]: Resized partition /dev/vda9 Dec 16 16:20:48.652489 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 16:20:48.652527 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 16:20:48.655376 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 16:20:48.655418 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 16:20:48.661983 extend-filesystems[1587]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 16:20:48.676284 systemd[1]: Started update-engine.service - Update Engine. Dec 16 16:20:48.681488 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Dec 16 16:20:48.681570 update_engine[1550]: I20251216 16:20:48.678977 1550 update_check_scheduler.cc:74] Next update check in 8m3s Dec 16 16:20:48.686005 systemd-networkd[1505]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 16:20:48.688762 systemd-networkd[1505]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 16:20:48.693227 systemd-networkd[1505]: eth0: Link UP Dec 16 16:20:48.693560 systemd-networkd[1505]: eth0: Gained carrier Dec 16 16:20:48.693584 systemd-networkd[1505]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 16:20:48.699422 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Dec 16 16:20:48.743227 systemd-networkd[1505]: eth0: DHCPv4 address 10.244.29.226/30, gateway 10.244.29.225 acquired from 10.244.29.225 Dec 16 16:20:48.743512 dbus-daemon[1538]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1505 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Dec 16 16:20:48.747479 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Dec 16 16:20:48.749411 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Dec 16 16:20:48.753406 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Dec 16 16:20:48.866772 bash[1607]: Updated "/home/core/.ssh/authorized_keys" Dec 16 16:20:48.875589 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 16:20:48.884688 systemd[1]: Starting sshkeys.service... Dec 16 16:20:48.913310 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 16:20:48.919191 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 16:20:48.982767 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 16:20:48.989838 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 16:20:49.013111 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 16:20:49.023865 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 16:20:49.030574 systemd-logind[1549]: New seat seat0. Dec 16 16:20:49.034644 systemd[1]: Started systemd-logind.service - User Login Management. 
Dec 16 16:20:49.054401 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Dec 16 16:20:49.072159 containerd[1574]: time="2025-12-16T16:20:49Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 16:20:49.067456 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 16:20:49.074361 containerd[1574]: time="2025-12-16T16:20:49.073591682Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 16 16:20:49.075734 extend-filesystems[1587]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 16:20:49.075734 extend-filesystems[1587]: old_desc_blocks = 1, new_desc_blocks = 8 Dec 16 16:20:49.075734 extend-filesystems[1587]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Dec 16 16:20:49.092005 extend-filesystems[1541]: Resized filesystem in /dev/vda9 Dec 16 16:20:49.077134 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 16:20:49.077459 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Dec 16 16:20:49.121980 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Dec 16 16:20:49.129165 kernel: ACPI: button: Power Button [PWRF] Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142257514Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="18.775µs" Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142300536Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142343790Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142642764Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142674440Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142723767Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142831768Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 16:20:49.143098 containerd[1574]: time="2025-12-16T16:20:49.142851754Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 16:20:49.146167 containerd[1574]: time="2025-12-16T16:20:49.145970945Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" 
id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 16:20:49.146167 containerd[1574]: time="2025-12-16T16:20:49.146019786Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 16:20:49.146167 containerd[1574]: time="2025-12-16T16:20:49.146044140Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 16:20:49.146167 containerd[1574]: time="2025-12-16T16:20:49.146060317Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 16:20:49.146512 containerd[1574]: time="2025-12-16T16:20:49.146483549Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 16:20:49.150111 containerd[1574]: time="2025-12-16T16:20:49.146913340Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 16:20:49.150111 containerd[1574]: time="2025-12-16T16:20:49.146974331Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 16:20:49.150111 containerd[1574]: time="2025-12-16T16:20:49.147008317Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 16:20:49.150111 containerd[1574]: time="2025-12-16T16:20:49.149594479Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 16:20:49.150111 containerd[1574]: time="2025-12-16T16:20:49.149916522Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 16:20:49.150111 containerd[1574]: time="2025-12-16T16:20:49.150036243Z" level=info msg="metadata content store 
policy set" policy=shared Dec 16 16:20:49.152242 locksmithd[1588]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161668205Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161776497Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161841090Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161867153Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161887092Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161904197Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161924638Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161943494Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161960379Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.161976084Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 
containerd[1574]: time="2025-12-16T16:20:49.162008084Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.162029393Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.162227515Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 16:20:49.163126 containerd[1574]: time="2025-12-16T16:20:49.162260422Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162281924Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162300435Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162318692Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162335638Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162352457Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162369249Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162422215Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162445067Z" level=info msg="loading plugin" 
id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162462547Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162551124Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162576676Z" level=info msg="Start snapshots syncer" Dec 16 16:20:49.163603 containerd[1574]: time="2025-12-16T16:20:49.162627562Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 16:20:49.163965 containerd[1574]: time="2025-12-16T16:20:49.162996613Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController
\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 16:20:49.166739 containerd[1574]: time="2025-12-16T16:20:49.165817422Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 16:20:49.167482 containerd[1574]: time="2025-12-16T16:20:49.166937346Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 16:20:49.170660 containerd[1574]: time="2025-12-16T16:20:49.170619784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 16:20:49.177558 containerd[1574]: time="2025-12-16T16:20:49.177354762Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 16:20:49.177558 containerd[1574]: time="2025-12-16T16:20:49.177430111Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 16:20:49.177558 containerd[1574]: time="2025-12-16T16:20:49.177471963Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 16:20:49.177558 containerd[1574]: time="2025-12-16T16:20:49.177497282Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 16:20:49.177558 containerd[1574]: time="2025-12-16T16:20:49.177515034Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks 
type=io.containerd.grpc.v1 Dec 16 16:20:49.177558 containerd[1574]: time="2025-12-16T16:20:49.177564703Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 16:20:49.177824 containerd[1574]: time="2025-12-16T16:20:49.177606636Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 16:20:49.177824 containerd[1574]: time="2025-12-16T16:20:49.177649758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 16:20:49.177824 containerd[1574]: time="2025-12-16T16:20:49.177669786Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 16:20:49.177824 containerd[1574]: time="2025-12-16T16:20:49.177748650Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 16:20:49.177824 containerd[1574]: time="2025-12-16T16:20:49.177788410Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 16:20:49.178021 containerd[1574]: time="2025-12-16T16:20:49.177823773Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 16:20:49.178021 containerd[1574]: time="2025-12-16T16:20:49.177844386Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 16:20:49.178021 containerd[1574]: time="2025-12-16T16:20:49.177859763Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 16:20:49.178021 containerd[1574]: time="2025-12-16T16:20:49.177920869Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 16:20:49.178021 containerd[1574]: 
time="2025-12-16T16:20:49.177955675Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 16:20:49.178246 containerd[1574]: time="2025-12-16T16:20:49.178039732Z" level=info msg="runtime interface created" Dec 16 16:20:49.178246 containerd[1574]: time="2025-12-16T16:20:49.178072063Z" level=info msg="created NRI interface" Dec 16 16:20:49.178246 containerd[1574]: time="2025-12-16T16:20:49.178107389Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 16:20:49.178246 containerd[1574]: time="2025-12-16T16:20:49.178163931Z" level=info msg="Connect containerd service" Dec 16 16:20:49.179450 containerd[1574]: time="2025-12-16T16:20:49.178206721Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 16:20:49.190801 containerd[1574]: time="2025-12-16T16:20:49.187234945Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 16:20:49.196567 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Dec 16 16:20:49.198118 dbus-daemon[1538]: [system] Successfully activated service 'org.freedesktop.hostname1' Dec 16 16:20:49.219715 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 16 16:20:49.220307 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 16 16:20:49.223308 dbus-daemon[1538]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1594 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Dec 16 16:20:49.231530 systemd[1]: Starting polkit.service - Authorization Manager... 
Dec 16 16:20:49.544214 containerd[1574]: time="2025-12-16T16:20:49.544152023Z" level=info msg="Start subscribing containerd event" Dec 16 16:20:49.544438 containerd[1574]: time="2025-12-16T16:20:49.544250159Z" level=info msg="Start recovering state" Dec 16 16:20:49.544438 containerd[1574]: time="2025-12-16T16:20:49.544420252Z" level=info msg="Start event monitor" Dec 16 16:20:49.544594 containerd[1574]: time="2025-12-16T16:20:49.544442637Z" level=info msg="Start cni network conf syncer for default" Dec 16 16:20:49.544594 containerd[1574]: time="2025-12-16T16:20:49.544460071Z" level=info msg="Start streaming server" Dec 16 16:20:49.544594 containerd[1574]: time="2025-12-16T16:20:49.544484834Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 16:20:49.544594 containerd[1574]: time="2025-12-16T16:20:49.544501411Z" level=info msg="runtime interface starting up..." Dec 16 16:20:49.544594 containerd[1574]: time="2025-12-16T16:20:49.544515110Z" level=info msg="starting plugins..." Dec 16 16:20:49.544594 containerd[1574]: time="2025-12-16T16:20:49.544557853Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 16:20:49.546294 containerd[1574]: time="2025-12-16T16:20:49.545107740Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 16:20:49.546294 containerd[1574]: time="2025-12-16T16:20:49.545195472Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 16:20:49.546294 containerd[1574]: time="2025-12-16T16:20:49.545348317Z" level=info msg="containerd successfully booted in 0.494198s" Dec 16 16:20:49.545518 systemd[1]: Started containerd.service - containerd container runtime. 
Dec 16 16:20:49.575255 polkitd[1630]: Started polkitd version 126 Dec 16 16:20:49.596061 polkitd[1630]: Loading rules from directory /etc/polkit-1/rules.d Dec 16 16:20:49.600543 polkitd[1630]: Loading rules from directory /run/polkit-1/rules.d Dec 16 16:20:49.600630 polkitd[1630]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Dec 16 16:20:49.600985 polkitd[1630]: Loading rules from directory /usr/local/share/polkit-1/rules.d Dec 16 16:20:49.601032 polkitd[1630]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Dec 16 16:20:49.601118 polkitd[1630]: Loading rules from directory /usr/share/polkit-1/rules.d Dec 16 16:20:49.606828 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 16:20:49.609833 polkitd[1630]: Finished loading, compiling and executing 2 rules Dec 16 16:20:49.611298 systemd[1]: Started polkit.service - Authorization Manager. Dec 16 16:20:49.613640 dbus-daemon[1538]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Dec 16 16:20:49.614312 polkitd[1630]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Dec 16 16:20:49.676052 systemd-hostnamed[1594]: Hostname set to (static) Dec 16 16:20:49.711272 systemd-networkd[1505]: eth0: Gained IPv6LL Dec 16 16:20:49.712442 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Dec 16 16:20:49.716564 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 16:20:49.721118 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 16:20:49.729700 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 16:20:49.736674 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 16:20:49.854668 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Dec 16 16:20:49.914743 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 16:20:49.990062 sshd_keygen[1582]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 16:20:50.077325 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 16:20:50.089522 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 16:20:50.096504 systemd[1]: Started sshd@0-10.244.29.226:22-139.178.68.195:43048.service - OpenSSH per-connection server daemon (139.178.68.195:43048). Dec 16 16:20:50.132912 tar[1561]: linux-amd64/README.md Dec 16 16:20:50.164500 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 16:20:50.164864 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 16:20:50.169993 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 16:20:50.172403 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 16:20:50.183025 systemd-logind[1549]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 16 16:20:50.227171 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 16:20:50.425414 systemd-logind[1549]: Watching system buttons on /dev/input/event3 (Power Button) Dec 16 16:20:50.433375 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 16:20:50.481031 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 16 16:20:50.483360 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 16:20:50.681140 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 16:20:51.073849 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 16:20:51.089002 (kubelet)[1710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 16:20:51.123206 sshd[1687]: Accepted publickey for core from 139.178.68.195 port 43048 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:20:51.125928 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:20:51.140520 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 16:20:51.143462 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 16:20:51.163226 systemd-logind[1549]: New session 1 of user core. Dec 16 16:20:51.183064 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 16:20:51.190527 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 16:20:51.211394 (systemd)[1713]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 16:20:51.218536 systemd-logind[1549]: New session c1 of user core. Dec 16 16:20:51.220254 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Dec 16 16:20:51.224930 systemd-networkd[1505]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:778:24:19ff:fef4:1de2/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:778:24:19ff:fef4:1de2/64 assigned by NDisc. Dec 16 16:20:51.224943 systemd-networkd[1505]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Dec 16 16:20:51.419146 systemd[1713]: Queued start job for default target default.target. Dec 16 16:20:51.428453 systemd[1713]: Created slice app.slice - User Application Slice. Dec 16 16:20:51.428754 systemd[1713]: Reached target paths.target - Paths. Dec 16 16:20:51.429122 systemd[1713]: Reached target timers.target - Timers. 
Dec 16 16:20:51.433195 systemd[1713]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 16:20:51.455055 systemd[1713]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 16:20:51.455679 systemd[1713]: Reached target sockets.target - Sockets. Dec 16 16:20:51.455859 systemd[1713]: Reached target basic.target - Basic System. Dec 16 16:20:51.456115 systemd[1713]: Reached target default.target - Main User Target. Dec 16 16:20:51.456186 systemd[1713]: Startup finished in 220ms. Dec 16 16:20:51.456542 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 16:20:51.468473 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 16:20:51.668295 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 16:20:51.668465 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 16:20:51.777217 kubelet[1710]: E1216 16:20:51.777036 1710 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 16:20:51.780357 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 16:20:51.780980 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 16:20:51.782199 systemd[1]: kubelet.service: Consumed 1.064s CPU time, 263.1M memory peak. Dec 16 16:20:52.108541 systemd[1]: Started sshd@1-10.244.29.226:22-139.178.68.195:42834.service - OpenSSH per-connection server daemon (139.178.68.195:42834). Dec 16 16:20:52.656129 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. 
Dec 16 16:20:53.019296 sshd[1733]: Accepted publickey for core from 139.178.68.195 port 42834 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:20:53.021190 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:20:53.028931 systemd-logind[1549]: New session 2 of user core. Dec 16 16:20:53.039421 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 16:20:53.645823 sshd[1736]: Connection closed by 139.178.68.195 port 42834 Dec 16 16:20:53.646993 sshd-session[1733]: pam_unix(sshd:session): session closed for user core Dec 16 16:20:53.653377 systemd[1]: sshd@1-10.244.29.226:22-139.178.68.195:42834.service: Deactivated successfully. Dec 16 16:20:53.656344 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 16:20:53.658036 systemd-logind[1549]: Session 2 logged out. Waiting for processes to exit. Dec 16 16:20:53.660467 systemd-logind[1549]: Removed session 2. Dec 16 16:20:53.685279 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 16:20:53.687542 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 16:20:53.808371 systemd[1]: Started sshd@2-10.244.29.226:22-139.178.68.195:42850.service - OpenSSH per-connection server daemon (139.178.68.195:42850). Dec 16 16:20:54.732169 sshd[1744]: Accepted publickey for core from 139.178.68.195 port 42850 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:20:54.734178 sshd-session[1744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:20:54.742738 systemd-logind[1549]: New session 3 of user core. Dec 16 16:20:54.754574 systemd[1]: Started session-3.scope - Session 3 of User core. 
Dec 16 16:20:55.357106 sshd[1747]: Connection closed by 139.178.68.195 port 42850
Dec 16 16:20:55.356307 sshd-session[1744]: pam_unix(sshd:session): session closed for user core
Dec 16 16:20:55.362539 systemd[1]: sshd@2-10.244.29.226:22-139.178.68.195:42850.service: Deactivated successfully.
Dec 16 16:20:55.365270 systemd[1]: session-3.scope: Deactivated successfully.
Dec 16 16:20:55.366813 systemd-logind[1549]: Session 3 logged out. Waiting for processes to exit.
Dec 16 16:20:55.369203 systemd-logind[1549]: Removed session 3.
Dec 16 16:20:55.538229 login[1701]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Dec 16 16:20:55.541446 login[1703]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Dec 16 16:20:55.548337 systemd-logind[1549]: New session 4 of user core.
Dec 16 16:20:55.560406 systemd[1]: Started session-4.scope - Session 4 of User core.
Dec 16 16:20:55.565437 systemd-logind[1549]: New session 5 of user core.
Dec 16 16:20:55.572440 systemd[1]: Started session-5.scope - Session 5 of User core.
Dec 16 16:20:57.712196 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:57.715099 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 16:20:57.729782 coreos-metadata[1537]: Dec 16 16:20:57.729 WARN failed to locate config-drive, using the metadata service API instead
Dec 16 16:20:57.732160 coreos-metadata[1612]: Dec 16 16:20:57.729 WARN failed to locate config-drive, using the metadata service API instead
Dec 16 16:20:57.753508 coreos-metadata[1537]: Dec 16 16:20:57.753 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Dec 16 16:20:57.754351 coreos-metadata[1612]: Dec 16 16:20:57.754 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Dec 16 16:20:57.761290 coreos-metadata[1537]: Dec 16 16:20:57.761 INFO Fetch failed with 404: resource not found
Dec 16 16:20:57.761553 coreos-metadata[1537]: Dec 16 16:20:57.761 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 16 16:20:57.762531 coreos-metadata[1537]: Dec 16 16:20:57.762 INFO Fetch successful
Dec 16 16:20:57.762875 coreos-metadata[1537]: Dec 16 16:20:57.762 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Dec 16 16:20:57.775526 coreos-metadata[1537]: Dec 16 16:20:57.775 INFO Fetch successful
Dec 16 16:20:57.775748 coreos-metadata[1537]: Dec 16 16:20:57.775 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Dec 16 16:20:57.778332 coreos-metadata[1612]: Dec 16 16:20:57.778 INFO Fetch successful
Dec 16 16:20:57.778485 coreos-metadata[1612]: Dec 16 16:20:57.778 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Dec 16 16:20:57.793402 coreos-metadata[1537]: Dec 16 16:20:57.793 INFO Fetch successful
Dec 16 16:20:57.793655 coreos-metadata[1537]: Dec 16 16:20:57.793 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Dec 16 16:20:57.808145 coreos-metadata[1537]: Dec 16 16:20:57.808 INFO Fetch successful
Dec 16 16:20:57.808369 coreos-metadata[1537]: Dec 16 16:20:57.808 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Dec 16 16:20:57.810650 coreos-metadata[1612]: Dec 16 16:20:57.810 INFO Fetch successful
Dec 16 16:20:57.812700 unknown[1612]: wrote ssh authorized keys file for user: core
Dec 16 16:20:57.824576 coreos-metadata[1537]: Dec 16 16:20:57.824 INFO Fetch successful
Dec 16 16:20:57.837415 update-ssh-keys[1782]: Updated "/home/core/.ssh/authorized_keys"
Dec 16 16:20:57.839408 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Dec 16 16:20:57.844659 systemd[1]: Finished sshkeys.service.
Dec 16 16:20:57.857049 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 16 16:20:57.857911 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 16 16:20:57.859291 systemd[1]: Reached target multi-user.target - Multi-User System.
Dec 16 16:20:57.860183 systemd[1]: Startup finished in 3.566s (kernel) + 14.065s (initrd) + 13.198s (userspace) = 30.831s.
Dec 16 16:21:01.877206 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Dec 16 16:21:01.879680 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:02.100734 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:02.112787 (kubelet)[1799]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 16 16:21:02.168974 kubelet[1799]: E1216 16:21:02.168819 1799 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 16 16:21:02.173592 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 16 16:21:02.173881 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 16 16:21:02.174733 systemd[1]: kubelet.service: Consumed 238ms CPU time, 110.5M memory peak.
Dec 16 16:21:05.516476 systemd[1]: Started sshd@3-10.244.29.226:22-139.178.68.195:50904.service - OpenSSH per-connection server daemon (139.178.68.195:50904).
Dec 16 16:21:06.436356 sshd[1807]: Accepted publickey for core from 139.178.68.195 port 50904 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:21:06.438120 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:21:06.445331 systemd-logind[1549]: New session 6 of user core.
Dec 16 16:21:06.454394 systemd[1]: Started session-6.scope - Session 6 of User core.
Dec 16 16:21:07.061181 sshd[1810]: Connection closed by 139.178.68.195 port 50904
Dec 16 16:21:07.062037 sshd-session[1807]: pam_unix(sshd:session): session closed for user core
Dec 16 16:21:07.067390 systemd[1]: sshd@3-10.244.29.226:22-139.178.68.195:50904.service: Deactivated successfully.
Dec 16 16:21:07.070156 systemd[1]: session-6.scope: Deactivated successfully.
Dec 16 16:21:07.073055 systemd-logind[1549]: Session 6 logged out. Waiting for processes to exit.
Dec 16 16:21:07.074978 systemd-logind[1549]: Removed session 6.
Dec 16 16:21:07.248782 systemd[1]: Started sshd@4-10.244.29.226:22-139.178.68.195:50916.service - OpenSSH per-connection server daemon (139.178.68.195:50916).
Dec 16 16:21:08.246461 sshd[1816]: Accepted publickey for core from 139.178.68.195 port 50916 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:21:08.248243 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:21:08.255130 systemd-logind[1549]: New session 7 of user core.
Dec 16 16:21:08.262348 systemd[1]: Started session-7.scope - Session 7 of User core.
Dec 16 16:21:08.923000 sshd[1819]: Connection closed by 139.178.68.195 port 50916
Dec 16 16:21:08.922856 sshd-session[1816]: pam_unix(sshd:session): session closed for user core
Dec 16 16:21:08.927795 systemd[1]: sshd@4-10.244.29.226:22-139.178.68.195:50916.service: Deactivated successfully.
Dec 16 16:21:08.929970 systemd[1]: session-7.scope: Deactivated successfully.
Dec 16 16:21:08.931188 systemd-logind[1549]: Session 7 logged out. Waiting for processes to exit.
Dec 16 16:21:08.933029 systemd-logind[1549]: Removed session 7.
Dec 16 16:21:09.071286 systemd[1]: Started sshd@5-10.244.29.226:22-139.178.68.195:50928.service - OpenSSH per-connection server daemon (139.178.68.195:50928).
Dec 16 16:21:09.988815 sshd[1825]: Accepted publickey for core from 139.178.68.195 port 50928 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:21:09.990495 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:21:09.998811 systemd-logind[1549]: New session 8 of user core.
Dec 16 16:21:10.008332 systemd[1]: Started session-8.scope - Session 8 of User core.
Dec 16 16:21:10.621183 sshd[1828]: Connection closed by 139.178.68.195 port 50928
Dec 16 16:21:10.619825 sshd-session[1825]: pam_unix(sshd:session): session closed for user core
Dec 16 16:21:10.624445 systemd[1]: sshd@5-10.244.29.226:22-139.178.68.195:50928.service: Deactivated successfully.
Dec 16 16:21:10.627184 systemd[1]: session-8.scope: Deactivated successfully.
Dec 16 16:21:10.629048 systemd-logind[1549]: Session 8 logged out. Waiting for processes to exit.
Dec 16 16:21:10.631429 systemd-logind[1549]: Removed session 8.
Dec 16 16:21:10.780696 systemd[1]: Started sshd@6-10.244.29.226:22-139.178.68.195:34328.service - OpenSSH per-connection server daemon (139.178.68.195:34328).
Dec 16 16:21:11.696049 sshd[1834]: Accepted publickey for core from 139.178.68.195 port 34328 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:21:11.697776 sshd-session[1834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:21:11.707450 systemd-logind[1549]: New session 9 of user core.
Dec 16 16:21:11.714353 systemd[1]: Started session-9.scope - Session 9 of User core.
Dec 16 16:21:12.192835 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Dec 16 16:21:12.193305 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 16 16:21:12.194825 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Dec 16 16:21:12.198756 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:12.213349 sudo[1838]: pam_unix(sudo:session): session closed for user root
Dec 16 16:21:12.359742 sshd[1837]: Connection closed by 139.178.68.195 port 34328
Dec 16 16:21:12.360636 sshd-session[1834]: pam_unix(sshd:session): session closed for user core
Dec 16 16:21:12.367382 systemd[1]: sshd@6-10.244.29.226:22-139.178.68.195:34328.service: Deactivated successfully.
Dec 16 16:21:12.370756 systemd[1]: session-9.scope: Deactivated successfully.
Dec 16 16:21:12.378831 systemd-logind[1549]: Session 9 logged out. Waiting for processes to exit.
Dec 16 16:21:12.379667 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:12.384618 systemd-logind[1549]: Removed session 9.
Dec 16 16:21:12.391526 (kubelet)[1848]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 16 16:21:12.493061 kubelet[1848]: E1216 16:21:12.492805 1848 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 16 16:21:12.495599 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 16 16:21:12.495975 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 16 16:21:12.496993 systemd[1]: kubelet.service: Consumed 212ms CPU time, 108.5M memory peak.
Dec 16 16:21:12.512752 systemd[1]: Started sshd@7-10.244.29.226:22-139.178.68.195:34334.service - OpenSSH per-connection server daemon (139.178.68.195:34334).
Dec 16 16:21:13.427408 sshd[1859]: Accepted publickey for core from 139.178.68.195 port 34334 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:21:13.429118 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:21:13.435956 systemd-logind[1549]: New session 10 of user core.
Dec 16 16:21:13.449549 systemd[1]: Started session-10.scope - Session 10 of User core.
Dec 16 16:21:13.911056 sudo[1864]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Dec 16 16:21:13.912060 sudo[1864]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 16 16:21:13.918896 sudo[1864]: pam_unix(sudo:session): session closed for user root
Dec 16 16:21:13.927473 sudo[1863]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Dec 16 16:21:13.927912 sudo[1863]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 16 16:21:13.941172 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 16 16:21:13.990501 augenrules[1886]: No rules
Dec 16 16:21:13.991337 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 16 16:21:13.991877 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 16 16:21:13.993672 sudo[1863]: pam_unix(sudo:session): session closed for user root
Dec 16 16:21:14.139203 sshd[1862]: Connection closed by 139.178.68.195 port 34334
Dec 16 16:21:14.140097 sshd-session[1859]: pam_unix(sshd:session): session closed for user core
Dec 16 16:21:14.146883 systemd[1]: sshd@7-10.244.29.226:22-139.178.68.195:34334.service: Deactivated successfully.
Dec 16 16:21:14.149643 systemd[1]: session-10.scope: Deactivated successfully.
Dec 16 16:21:14.150946 systemd-logind[1549]: Session 10 logged out. Waiting for processes to exit.
Dec 16 16:21:14.153211 systemd-logind[1549]: Removed session 10.
Dec 16 16:21:14.297095 systemd[1]: Started sshd@8-10.244.29.226:22-139.178.68.195:34342.service - OpenSSH per-connection server daemon (139.178.68.195:34342).
Dec 16 16:21:15.218110 sshd[1895]: Accepted publickey for core from 139.178.68.195 port 34342 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:21:15.220225 sshd-session[1895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:21:15.229325 systemd-logind[1549]: New session 11 of user core.
Dec 16 16:21:15.240463 systemd[1]: Started session-11.scope - Session 11 of User core.
Dec 16 16:21:15.701459 sudo[1899]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Dec 16 16:21:15.701889 sudo[1899]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 16 16:21:16.234430 systemd[1]: Starting docker.service - Docker Application Container Engine...
Dec 16 16:21:16.256691 (dockerd)[1917]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Dec 16 16:21:16.637781 dockerd[1917]: time="2025-12-16T16:21:16.637671070Z" level=info msg="Starting up"
Dec 16 16:21:16.639585 dockerd[1917]: time="2025-12-16T16:21:16.639497970Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Dec 16 16:21:16.655915 dockerd[1917]: time="2025-12-16T16:21:16.655796887Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Dec 16 16:21:16.695866 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3432545321-merged.mount: Deactivated successfully.
Dec 16 16:21:16.706768 systemd[1]: var-lib-docker-metacopy\x2dcheck609637690-merged.mount: Deactivated successfully.
Dec 16 16:21:16.732011 dockerd[1917]: time="2025-12-16T16:21:16.731695539Z" level=info msg="Loading containers: start."
Dec 16 16:21:16.750490 kernel: Initializing XFRM netlink socket
Dec 16 16:21:17.038720 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection.
Dec 16 16:21:17.101012 systemd-networkd[1505]: docker0: Link UP
Dec 16 16:21:17.107971 dockerd[1917]: time="2025-12-16T16:21:17.107746926Z" level=info msg="Loading containers: done."
Dec 16 16:21:17.128850 dockerd[1917]: time="2025-12-16T16:21:17.128304019Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Dec 16 16:21:17.128850 dockerd[1917]: time="2025-12-16T16:21:17.128441344Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Dec 16 16:21:17.128850 dockerd[1917]: time="2025-12-16T16:21:17.128589920Z" level=info msg="Initializing buildkit"
Dec 16 16:21:17.156762 dockerd[1917]: time="2025-12-16T16:21:17.156658720Z" level=info msg="Completed buildkit initialization"
Dec 16 16:21:17.168391 dockerd[1917]: time="2025-12-16T16:21:17.168180776Z" level=info msg="Daemon has completed initialization"
Dec 16 16:21:17.168391 dockerd[1917]: time="2025-12-16T16:21:17.168276903Z" level=info msg="API listen on /run/docker.sock"
Dec 16 16:21:17.168588 systemd[1]: Started docker.service - Docker Application Container Engine.
Dec 16 16:21:17.352281 systemd-timesyncd[1464]: Contacted time server [2a00:2381:19c6::100]:123 (2.flatcar.pool.ntp.org).
Dec 16 16:21:17.352377 systemd-timesyncd[1464]: Initial clock synchronization to Tue 2025-12-16 16:21:17.703092 UTC.
Dec 16 16:21:17.688140 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3196335358-merged.mount: Deactivated successfully.
Dec 16 16:21:18.452670 containerd[1574]: time="2025-12-16T16:21:18.452582194Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\""
Dec 16 16:21:19.188097 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3880977002.mount: Deactivated successfully.
Dec 16 16:21:21.078972 containerd[1574]: time="2025-12-16T16:21:21.078867391Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:21.080989 containerd[1574]: time="2025-12-16T16:21:21.080683515Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=29072191"
Dec 16 16:21:21.083088 containerd[1574]: time="2025-12-16T16:21:21.083044340Z" level=info msg="ImageCreate event name:\"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:21.086659 containerd[1574]: time="2025-12-16T16:21:21.086622930Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:21.088420 containerd[1574]: time="2025-12-16T16:21:21.088380515Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"29068782\" in 2.635705234s"
Dec 16 16:21:21.088578 containerd[1574]: time="2025-12-16T16:21:21.088548712Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:77f8b0de97da9ee43e174b170c363c893ab69a20b03878e1bf6b54b10d44ef6f\""
Dec 16 16:21:21.089945 containerd[1574]: time="2025-12-16T16:21:21.089897627Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\""
Dec 16 16:21:21.271444 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 16 16:21:22.628325 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Dec 16 16:21:22.633758 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:22.862326 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:22.875987 (kubelet)[2204]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 16 16:21:22.951024 kubelet[2204]: E1216 16:21:22.950774 2204 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 16 16:21:22.954345 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 16 16:21:22.954579 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 16 16:21:22.955357 systemd[1]: kubelet.service: Consumed 229ms CPU time, 110.9M memory peak.
Dec 16 16:21:23.682331 containerd[1574]: time="2025-12-16T16:21:23.682275660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:23.684327 containerd[1574]: time="2025-12-16T16:21:23.682897037Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=24992018"
Dec 16 16:21:23.685282 containerd[1574]: time="2025-12-16T16:21:23.685248417Z" level=info msg="ImageCreate event name:\"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:23.689403 containerd[1574]: time="2025-12-16T16:21:23.689370994Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:23.692233 containerd[1574]: time="2025-12-16T16:21:23.692198792Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"26649046\" in 2.602256697s"
Dec 16 16:21:23.692367 containerd[1574]: time="2025-12-16T16:21:23.692340101Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:34e0beef266f1ca24c0093506853b1cc0ed91e873aeef655f39721813f10f924\""
Dec 16 16:21:23.693061 containerd[1574]: time="2025-12-16T16:21:23.693023378Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\""
Dec 16 16:21:25.738479 containerd[1574]: time="2025-12-16T16:21:25.738371123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:25.740079 containerd[1574]: time="2025-12-16T16:21:25.739842270Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=19404256"
Dec 16 16:21:25.741209 containerd[1574]: time="2025-12-16T16:21:25.741156533Z" level=info msg="ImageCreate event name:\"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:25.748039 containerd[1574]: time="2025-12-16T16:21:25.747990701Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:25.750952 containerd[1574]: time="2025-12-16T16:21:25.750867823Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"21061302\" in 2.05757163s"
Dec 16 16:21:25.750952 containerd[1574]: time="2025-12-16T16:21:25.750914083Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fd6f6aae834c2ec73b534bc30902f1602089a8f4d1bbd8c521fe2b39968efe4a\""
Dec 16 16:21:25.751584 containerd[1574]: time="2025-12-16T16:21:25.751521265Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\""
Dec 16 16:21:27.863848 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount534243322.mount: Deactivated successfully.
Dec 16 16:21:28.671774 containerd[1574]: time="2025-12-16T16:21:28.671717417Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:28.673171 containerd[1574]: time="2025-12-16T16:21:28.673141440Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=31161431"
Dec 16 16:21:28.673362 containerd[1574]: time="2025-12-16T16:21:28.673328698Z" level=info msg="ImageCreate event name:\"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:28.675741 containerd[1574]: time="2025-12-16T16:21:28.675699160Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:28.676711 containerd[1574]: time="2025-12-16T16:21:28.676661241Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"31160442\" in 2.924947107s"
Dec 16 16:21:28.676800 containerd[1574]: time="2025-12-16T16:21:28.676714320Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:db4bcdca85a39c02add2db5eed4fc6ab21eb20616fbf8cd2cf824e59e384a956\""
Dec 16 16:21:28.677420 containerd[1574]: time="2025-12-16T16:21:28.677386078Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Dec 16 16:21:29.434998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3354241733.mount: Deactivated successfully.
Dec 16 16:21:30.791781 containerd[1574]: time="2025-12-16T16:21:30.791711405Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:30.793223 containerd[1574]: time="2025-12-16T16:21:30.793193878Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249"
Dec 16 16:21:30.794276 containerd[1574]: time="2025-12-16T16:21:30.794230191Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:30.797895 containerd[1574]: time="2025-12-16T16:21:30.797858303Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:30.800179 containerd[1574]: time="2025-12-16T16:21:30.800134769Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.122708319s"
Dec 16 16:21:30.800392 containerd[1574]: time="2025-12-16T16:21:30.800319956Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Dec 16 16:21:30.801134 containerd[1574]: time="2025-12-16T16:21:30.801074376Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Dec 16 16:21:31.498038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1859149212.mount: Deactivated successfully.
Dec 16 16:21:31.504108 containerd[1574]: time="2025-12-16T16:21:31.503590220Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 16:21:31.504741 containerd[1574]: time="2025-12-16T16:21:31.504711352Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Dec 16 16:21:31.505604 containerd[1574]: time="2025-12-16T16:21:31.505559886Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 16:21:31.508257 containerd[1574]: time="2025-12-16T16:21:31.508219913Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 16:21:31.509170 containerd[1574]: time="2025-12-16T16:21:31.509130624Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 707.997779ms"
Dec 16 16:21:31.509248 containerd[1574]: time="2025-12-16T16:21:31.509174097Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Dec 16 16:21:31.510145 containerd[1574]: time="2025-12-16T16:21:31.509772085Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Dec 16 16:21:32.257945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2319404709.mount: Deactivated successfully.
Dec 16 16:21:33.127330 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Dec 16 16:21:33.131599 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:33.417465 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:33.430877 (kubelet)[2340]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 16 16:21:33.562722 kubelet[2340]: E1216 16:21:33.562640 2340 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 16 16:21:33.566605 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 16 16:21:33.566857 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 16 16:21:33.568217 systemd[1]: kubelet.service: Consumed 249ms CPU time, 108.5M memory peak.
Dec 16 16:21:34.029736 update_engine[1550]: I20251216 16:21:34.029443 1550 update_attempter.cc:509] Updating boot flags...
Dec 16 16:21:36.058101 containerd[1574]: time="2025-12-16T16:21:36.057965619Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:36.059832 containerd[1574]: time="2025-12-16T16:21:36.059761698Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682064"
Dec 16 16:21:36.061153 containerd[1574]: time="2025-12-16T16:21:36.060811738Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:36.064362 containerd[1574]: time="2025-12-16T16:21:36.064328360Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 16:21:36.066372 containerd[1574]: time="2025-12-16T16:21:36.065855347Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 4.556032916s"
Dec 16 16:21:36.066372 containerd[1574]: time="2025-12-16T16:21:36.065914206Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Dec 16 16:21:40.972827 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:40.973655 systemd[1]: kubelet.service: Consumed 249ms CPU time, 108.5M memory peak.
Dec 16 16:21:40.976835 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:41.017308 systemd[1]: Reload requested from client PID 2394 ('systemctl') (unit session-11.scope)...
Dec 16 16:21:41.017358 systemd[1]: Reloading...
Dec 16 16:21:41.177123 zram_generator::config[2435]: No configuration found.
Dec 16 16:21:41.529875 systemd[1]: Reloading finished in 511 ms.
Dec 16 16:21:41.607860 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Dec 16 16:21:41.607999 systemd[1]: kubelet.service: Failed with result 'signal'.
Dec 16 16:21:41.608508 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:41.608576 systemd[1]: kubelet.service: Consumed 147ms CPU time, 98.2M memory peak.
Dec 16 16:21:41.610951 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:41.793234 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:41.806718 (kubelet)[2506]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 16 16:21:41.870286 kubelet[2506]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 16:21:41.870286 kubelet[2506]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 16 16:21:41.870286 kubelet[2506]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 16:21:41.873923 kubelet[2506]: I1216 16:21:41.870388 2506 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 16:21:42.390122 kubelet[2506]: I1216 16:21:42.389644 2506 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Dec 16 16:21:42.390122 kubelet[2506]: I1216 16:21:42.389691 2506 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 16:21:42.390122 kubelet[2506]: I1216 16:21:42.390048 2506 server.go:954] "Client rotation is on, will bootstrap in background"
Dec 16 16:21:42.431051 kubelet[2506]: E1216 16:21:42.430983 2506 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.244.29.226:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:42.433399 kubelet[2506]: I1216 16:21:42.433179 2506 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 16:21:42.467608 kubelet[2506]: I1216 16:21:42.467562 2506 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 16:21:42.480263 kubelet[2506]: I1216 16:21:42.480218 2506 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 16 16:21:42.483033 kubelet[2506]: I1216 16:21:42.482585 2506 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 16:21:42.483033 kubelet[2506]: I1216 16:21:42.482641 2506 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-bfhb9.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 16:21:42.485340 kubelet[2506]: I1216 16:21:42.485313 2506 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 16:21:42.485453 kubelet[2506]: I1216 16:21:42.485435 2506 container_manager_linux.go:304] "Creating device plugin manager"
Dec 16 16:21:42.486980 kubelet[2506]: I1216 16:21:42.486841 2506 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 16:21:42.491356 kubelet[2506]: I1216 16:21:42.491308 2506 kubelet.go:446] "Attempting to sync node with API server"
Dec 16 16:21:42.491566 kubelet[2506]: I1216 16:21:42.491510 2506 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 16:21:42.493571 kubelet[2506]: I1216 16:21:42.493522 2506 kubelet.go:352] "Adding apiserver pod source"
Dec 16 16:21:42.493666 kubelet[2506]: I1216 16:21:42.493576 2506 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 16:21:42.501925 kubelet[2506]: W1216 16:21:42.500890 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.29.226:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-bfhb9.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:42.501925 kubelet[2506]: E1216 16:21:42.500994 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.29.226:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-bfhb9.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:42.501925 kubelet[2506]: W1216 16:21:42.501487 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.29.226:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:42.501925 kubelet[2506]: E1216 16:21:42.501534 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.29.226:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:42.503585 kubelet[2506]: I1216 16:21:42.503558 2506 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 16 16:21:42.507150 kubelet[2506]: I1216 16:21:42.507126 2506 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 16 16:21:42.507346 kubelet[2506]: W1216 16:21:42.507326 2506 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Dec 16 16:21:42.511683 kubelet[2506]: I1216 16:21:42.511659 2506 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 16 16:21:42.511853 kubelet[2506]: I1216 16:21:42.511822 2506 server.go:1287] "Started kubelet"
Dec 16 16:21:42.512274 kubelet[2506]: I1216 16:21:42.512226 2506 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 16:21:42.518005 kubelet[2506]: I1216 16:21:42.517655 2506 server.go:479] "Adding debug handlers to kubelet server"
Dec 16 16:21:42.520017 kubelet[2506]: I1216 16:21:42.519480 2506 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 16:21:42.520017 kubelet[2506]: I1216 16:21:42.519917 2506 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 16:21:42.523518 kubelet[2506]: I1216 16:21:42.523400 2506 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 16:21:42.524340 kubelet[2506]: E1216 16:21:42.520952 2506 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.29.226:6443/api/v1/namespaces/default/events\": dial tcp 10.244.29.226:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-bfhb9.gb1.brightbox.com.1881be9b2854895c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-bfhb9.gb1.brightbox.com,UID:srv-bfhb9.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-bfhb9.gb1.brightbox.com,},FirstTimestamp:2025-12-16 16:21:42.511782236 +0000 UTC m=+0.698648320,LastTimestamp:2025-12-16 16:21:42.511782236 +0000 UTC m=+0.698648320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-bfhb9.gb1.brightbox.com,}"
Dec 16 16:21:42.525186 kubelet[2506]: I1216 16:21:42.525139 2506 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 16 16:21:42.535329 kubelet[2506]: I1216 16:21:42.534686 2506 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 16 16:21:42.535329 kubelet[2506]: E1216 16:21:42.534979 2506 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-bfhb9.gb1.brightbox.com\" not found"
Dec 16 16:21:42.536334 kubelet[2506]: E1216 16:21:42.536284 2506 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-bfhb9.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.226:6443: connect: connection refused" interval="200ms"
Dec 16 16:21:42.536822 kubelet[2506]: I1216 16:21:42.536799 2506 factory.go:221] Registration of the systemd container factory successfully
Dec 16 16:21:42.537236 kubelet[2506]: I1216 16:21:42.537122 2506 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 16 16:21:42.539475 kubelet[2506]: I1216 16:21:42.539456 2506 reconciler.go:26] "Reconciler: start to sync state"
Dec 16 16:21:42.539644 kubelet[2506]: I1216 16:21:42.539623 2506 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 16 16:21:42.540215 kubelet[2506]: W1216 16:21:42.540160 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.29.226:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:42.540371 kubelet[2506]: E1216 16:21:42.540345 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.29.226:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:42.543949 kubelet[2506]: I1216 16:21:42.543800 2506 factory.go:221] Registration of the containerd container factory successfully
Dec 16 16:21:42.557227 kubelet[2506]: I1216 16:21:42.557160 2506 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 16 16:21:42.559181 kubelet[2506]: I1216 16:21:42.559146 2506 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 16 16:21:42.559267 kubelet[2506]: I1216 16:21:42.559189 2506 status_manager.go:227] "Starting to sync pod status with apiserver"
Dec 16 16:21:42.559267 kubelet[2506]: I1216 16:21:42.559224 2506 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 16 16:21:42.559267 kubelet[2506]: I1216 16:21:42.559236 2506 kubelet.go:2382] "Starting kubelet main sync loop"
Dec 16 16:21:42.559399 kubelet[2506]: E1216 16:21:42.559307 2506 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 16:21:42.571541 kubelet[2506]: W1216 16:21:42.569226 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.29.226:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:42.571541 kubelet[2506]: E1216 16:21:42.569310 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.29.226:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:42.589199 kubelet[2506]: I1216 16:21:42.589148 2506 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 16 16:21:42.589199 kubelet[2506]: I1216 16:21:42.589181 2506 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 16 16:21:42.589378 kubelet[2506]: I1216 16:21:42.589213 2506 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 16:21:42.591688 kubelet[2506]: I1216 16:21:42.591136 2506 policy_none.go:49] "None policy: Start"
Dec 16 16:21:42.591688 kubelet[2506]: I1216 16:21:42.591197 2506 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 16 16:21:42.591688 kubelet[2506]: I1216 16:21:42.591227 2506 state_mem.go:35] "Initializing new in-memory state store"
Dec 16 16:21:42.602225 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 16 16:21:42.620212 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Dec 16 16:21:42.626057 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Dec 16 16:21:42.635373 kubelet[2506]: E1216 16:21:42.635295 2506 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-bfhb9.gb1.brightbox.com\" not found"
Dec 16 16:21:42.648219 kubelet[2506]: I1216 16:21:42.648108 2506 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 16 16:21:42.652102 kubelet[2506]: I1216 16:21:42.650435 2506 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 16:21:42.652102 kubelet[2506]: I1216 16:21:42.651519 2506 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 16:21:42.652102 kubelet[2506]: I1216 16:21:42.651970 2506 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 16:21:42.653404 kubelet[2506]: E1216 16:21:42.653356 2506 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 16 16:21:42.653500 kubelet[2506]: E1216 16:21:42.653450 2506 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-bfhb9.gb1.brightbox.com\" not found"
Dec 16 16:21:42.676858 systemd[1]: Created slice kubepods-burstable-pod589d03a1769cb0a53a5550fb0617ff99.slice - libcontainer container kubepods-burstable-pod589d03a1769cb0a53a5550fb0617ff99.slice.
Dec 16 16:21:42.690613 kubelet[2506]: E1216 16:21:42.690485 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.695325 systemd[1]: Created slice kubepods-burstable-pod7a3c75e949f853f163415f313e57b649.slice - libcontainer container kubepods-burstable-pod7a3c75e949f853f163415f313e57b649.slice.
Dec 16 16:21:42.700118 kubelet[2506]: E1216 16:21:42.699623 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.703008 systemd[1]: Created slice kubepods-burstable-pod554aed2157c57755815399401198e254.slice - libcontainer container kubepods-burstable-pod554aed2157c57755815399401198e254.slice.
Dec 16 16:21:42.706273 kubelet[2506]: E1216 16:21:42.706220 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.738044 kubelet[2506]: E1216 16:21:42.737963 2506 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-bfhb9.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.226:6443: connect: connection refused" interval="400ms"
Dec 16 16:21:42.754966 kubelet[2506]: I1216 16:21:42.754922 2506 kubelet_node_status.go:75] "Attempting to register node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.755372 kubelet[2506]: E1216 16:21:42.755339 2506 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.226:6443/api/v1/nodes\": dial tcp 10.244.29.226:6443: connect: connection refused" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.842521 kubelet[2506]: I1216 16:21:42.842392 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-flexvolume-dir\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.842878 kubelet[2506]: I1216 16:21:42.842486 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-k8s-certs\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.842878 kubelet[2506]: I1216 16:21:42.842800 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.842878 kubelet[2506]: I1216 16:21:42.842838 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/554aed2157c57755815399401198e254-kubeconfig\") pod \"kube-scheduler-srv-bfhb9.gb1.brightbox.com\" (UID: \"554aed2157c57755815399401198e254\") " pod="kube-system/kube-scheduler-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.843292 kubelet[2506]: I1216 16:21:42.843146 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/589d03a1769cb0a53a5550fb0617ff99-k8s-certs\") pod \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" (UID: \"589d03a1769cb0a53a5550fb0617ff99\") " pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.843292 kubelet[2506]: I1216 16:21:42.843249 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/589d03a1769cb0a53a5550fb0617ff99-usr-share-ca-certificates\") pod \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" (UID: \"589d03a1769cb0a53a5550fb0617ff99\") " pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.843507 kubelet[2506]: I1216 16:21:42.843451 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-ca-certs\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.843680 kubelet[2506]: I1216 16:21:42.843616 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/589d03a1769cb0a53a5550fb0617ff99-ca-certs\") pod \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" (UID: \"589d03a1769cb0a53a5550fb0617ff99\") " pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.843809 kubelet[2506]: I1216 16:21:42.843785 2506 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-kubeconfig\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.958185 kubelet[2506]: I1216 16:21:42.958028 2506 kubelet_node_status.go:75] "Attempting to register node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.958674 kubelet[2506]: E1216 16:21:42.958582 2506 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.226:6443/api/v1/nodes\": dial tcp 10.244.29.226:6443: connect: connection refused" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:42.993222 containerd[1574]: time="2025-12-16T16:21:42.993152176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-bfhb9.gb1.brightbox.com,Uid:589d03a1769cb0a53a5550fb0617ff99,Namespace:kube-system,Attempt:0,}"
Dec 16 16:21:43.007654 containerd[1574]: time="2025-12-16T16:21:43.007476264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-bfhb9.gb1.brightbox.com,Uid:7a3c75e949f853f163415f313e57b649,Namespace:kube-system,Attempt:0,}"
Dec 16 16:21:43.008446 containerd[1574]: time="2025-12-16T16:21:43.008412646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-bfhb9.gb1.brightbox.com,Uid:554aed2157c57755815399401198e254,Namespace:kube-system,Attempt:0,}"
Dec 16 16:21:43.159717 kubelet[2506]: E1216 16:21:43.159656 2506 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-bfhb9.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.226:6443: connect: connection refused" interval="800ms"
Dec 16 16:21:43.167144 containerd[1574]: time="2025-12-16T16:21:43.165704725Z" level=info msg="connecting to shim dc6254254fff97d776d231e733fc52ff9be825dce1808b2ca0fb7d179f81d50e" address="unix:///run/containerd/s/f90f13c63a612ea177e9861e39efcd78aadd346fed84c2ff8812650db31ec2c1" namespace=k8s.io protocol=ttrpc version=3
Dec 16 16:21:43.178134 containerd[1574]: time="2025-12-16T16:21:43.177701625Z" level=info msg="connecting to shim 0d5b452d1c003784f0b5cefecc7d38d47f88370e7c8e057efdd8de2a80f92c70" address="unix:///run/containerd/s/7b732617a352094a17b68992de8fe0dea652fe0ac444277c2de4c852521fe201" namespace=k8s.io protocol=ttrpc version=3
Dec 16 16:21:43.185436 containerd[1574]: time="2025-12-16T16:21:43.185394455Z" level=info msg="connecting to shim 0b7eeedf9fcb9a96db65c4be668dea20325f57f612f3adb22051f37246019f9d" address="unix:///run/containerd/s/f998b716fc2e9aaab5643e156ec99a232173476b654c1b789f2ae57903547913" namespace=k8s.io protocol=ttrpc version=3
Dec 16 16:21:43.289358 systemd[1]: Started cri-containerd-0d5b452d1c003784f0b5cefecc7d38d47f88370e7c8e057efdd8de2a80f92c70.scope - libcontainer container 0d5b452d1c003784f0b5cefecc7d38d47f88370e7c8e057efdd8de2a80f92c70.
Dec 16 16:21:43.292100 systemd[1]: Started cri-containerd-dc6254254fff97d776d231e733fc52ff9be825dce1808b2ca0fb7d179f81d50e.scope - libcontainer container dc6254254fff97d776d231e733fc52ff9be825dce1808b2ca0fb7d179f81d50e.
Dec 16 16:21:43.302255 systemd[1]: Started cri-containerd-0b7eeedf9fcb9a96db65c4be668dea20325f57f612f3adb22051f37246019f9d.scope - libcontainer container 0b7eeedf9fcb9a96db65c4be668dea20325f57f612f3adb22051f37246019f9d.
Dec 16 16:21:43.362103 kubelet[2506]: I1216 16:21:43.362041 2506 kubelet_node_status.go:75] "Attempting to register node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:43.363331 kubelet[2506]: E1216 16:21:43.363230 2506 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.29.226:6443/api/v1/nodes\": dial tcp 10.244.29.226:6443: connect: connection refused" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:43.437906 containerd[1574]: time="2025-12-16T16:21:43.437692951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-bfhb9.gb1.brightbox.com,Uid:589d03a1769cb0a53a5550fb0617ff99,Namespace:kube-system,Attempt:0,} returns sandbox id \"dc6254254fff97d776d231e733fc52ff9be825dce1808b2ca0fb7d179f81d50e\""
Dec 16 16:21:43.440123 containerd[1574]: time="2025-12-16T16:21:43.439870714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-bfhb9.gb1.brightbox.com,Uid:554aed2157c57755815399401198e254,Namespace:kube-system,Attempt:0,} returns sandbox id \"0d5b452d1c003784f0b5cefecc7d38d47f88370e7c8e057efdd8de2a80f92c70\""
Dec 16 16:21:43.446045 containerd[1574]: time="2025-12-16T16:21:43.445994323Z" level=info msg="CreateContainer within sandbox \"dc6254254fff97d776d231e733fc52ff9be825dce1808b2ca0fb7d179f81d50e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Dec 16 16:21:43.447508 containerd[1574]: time="2025-12-16T16:21:43.447460991Z" level=info msg="CreateContainer within sandbox \"0d5b452d1c003784f0b5cefecc7d38d47f88370e7c8e057efdd8de2a80f92c70\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Dec 16 16:21:43.462892 containerd[1574]: time="2025-12-16T16:21:43.461889487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-bfhb9.gb1.brightbox.com,Uid:7a3c75e949f853f163415f313e57b649,Namespace:kube-system,Attempt:0,} returns sandbox id \"0b7eeedf9fcb9a96db65c4be668dea20325f57f612f3adb22051f37246019f9d\""
Dec 16 16:21:43.467113 containerd[1574]: time="2025-12-16T16:21:43.466304317Z" level=info msg="CreateContainer within sandbox \"0b7eeedf9fcb9a96db65c4be668dea20325f57f612f3adb22051f37246019f9d\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Dec 16 16:21:43.471012 kubelet[2506]: W1216 16:21:43.470952 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.244.29.226:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:43.471128 kubelet[2506]: E1216 16:21:43.471030 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.244.29.226:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:43.473754 containerd[1574]: time="2025-12-16T16:21:43.473719584Z" level=info msg="Container 70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5: CDI devices from CRI Config.CDIDevices: []"
Dec 16 16:21:43.473946 containerd[1574]: time="2025-12-16T16:21:43.473915600Z" level=info msg="Container b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3: CDI devices from CRI Config.CDIDevices: []"
Dec 16 16:21:43.479888 containerd[1574]: time="2025-12-16T16:21:43.479845077Z" level=info msg="Container 226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea: CDI devices from CRI Config.CDIDevices: []"
Dec 16 16:21:43.486468 containerd[1574]: time="2025-12-16T16:21:43.486181536Z" level=info msg="CreateContainer within sandbox \"dc6254254fff97d776d231e733fc52ff9be825dce1808b2ca0fb7d179f81d50e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5\""
Dec 16 16:21:43.486468 containerd[1574]: time="2025-12-16T16:21:43.486409433Z" level=info msg="CreateContainer within sandbox \"0d5b452d1c003784f0b5cefecc7d38d47f88370e7c8e057efdd8de2a80f92c70\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3\""
Dec 16 16:21:43.488193 containerd[1574]: time="2025-12-16T16:21:43.488152743Z" level=info msg="StartContainer for \"b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3\""
Dec 16 16:21:43.488596 containerd[1574]: time="2025-12-16T16:21:43.488517715Z" level=info msg="StartContainer for \"70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5\""
Dec 16 16:21:43.491465 containerd[1574]: time="2025-12-16T16:21:43.491416149Z" level=info msg="connecting to shim 70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5" address="unix:///run/containerd/s/f90f13c63a612ea177e9861e39efcd78aadd346fed84c2ff8812650db31ec2c1" protocol=ttrpc version=3
Dec 16 16:21:43.493722 containerd[1574]: time="2025-12-16T16:21:43.492610530Z" level=info msg="connecting to shim b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3" address="unix:///run/containerd/s/7b732617a352094a17b68992de8fe0dea652fe0ac444277c2de4c852521fe201" protocol=ttrpc version=3
Dec 16 16:21:43.496378 containerd[1574]: time="2025-12-16T16:21:43.496346438Z" level=info msg="CreateContainer within sandbox \"0b7eeedf9fcb9a96db65c4be668dea20325f57f612f3adb22051f37246019f9d\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea\""
Dec 16 16:21:43.497000 containerd[1574]: time="2025-12-16T16:21:43.496943652Z" level=info msg="StartContainer for \"226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea\""
Dec 16 16:21:43.499098 containerd[1574]: time="2025-12-16T16:21:43.499020866Z" level=info msg="connecting to shim 226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea" address="unix:///run/containerd/s/f998b716fc2e9aaab5643e156ec99a232173476b654c1b789f2ae57903547913" protocol=ttrpc version=3
Dec 16 16:21:43.519803 kubelet[2506]: W1216 16:21:43.519617 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.244.29.226:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:43.519957 kubelet[2506]: E1216 16:21:43.519811 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.244.29.226:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:43.528267 systemd[1]: Started cri-containerd-b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3.scope - libcontainer container b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3.
Dec 16 16:21:43.552427 systemd[1]: Started cri-containerd-226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea.scope - libcontainer container 226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea.
Dec 16 16:21:43.566347 systemd[1]: Started cri-containerd-70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5.scope - libcontainer container 70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5.
Dec 16 16:21:43.636756 kubelet[2506]: W1216 16:21:43.636665 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.244.29.226:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:43.636756 kubelet[2506]: E1216 16:21:43.636766 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.244.29.226:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:43.685967 containerd[1574]: time="2025-12-16T16:21:43.685781686Z" level=info msg="StartContainer for \"70201c6cf177a08e2a54708c788fa30c3c83ee344a16931892f735c477d265a5\" returns successfully"
Dec 16 16:21:43.718925 containerd[1574]: time="2025-12-16T16:21:43.718498105Z" level=info msg="StartContainer for \"226af3156022c6140ca308cd5ff4a462fa98bbcfef23d564c0957aeaddff0dea\" returns successfully"
Dec 16 16:21:43.719689 containerd[1574]: time="2025-12-16T16:21:43.719654471Z" level=info msg="StartContainer for \"b849491990bc8f18f8cf0185db7d67e7d2b5b30c03c172523b43fcc020fba5a3\" returns successfully"
Dec 16 16:21:43.947588 kubelet[2506]: W1216 16:21:43.947459 2506 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.244.29.226:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-bfhb9.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.244.29.226:6443: connect: connection refused
Dec 16 16:21:43.947588 kubelet[2506]: E1216 16:21:43.947590 2506 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.244.29.226:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-bfhb9.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.29.226:6443: connect: connection refused" logger="UnhandledError"
Dec 16 16:21:43.962247 kubelet[2506]: E1216 16:21:43.962160 2506 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.29.226:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-bfhb9.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.29.226:6443: connect: connection refused" interval="1.6s"
Dec 16 16:21:44.167366 kubelet[2506]: I1216 16:21:44.167332 2506 kubelet_node_status.go:75] "Attempting to register node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:44.611544 kubelet[2506]: E1216 16:21:44.611505 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:44.621507 kubelet[2506]: E1216 16:21:44.621473 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:44.624826 kubelet[2506]: E1216 16:21:44.624800 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:45.634106 kubelet[2506]: E1216 16:21:45.632131 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:45.634106 kubelet[2506]: E1216 16:21:45.632677 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:45.637121 kubelet[2506]: E1216 16:21:45.634903 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:46.631987 kubelet[2506]: E1216 16:21:46.631664 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:46.631987 kubelet[2506]: E1216 16:21:46.631806 2506 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:46.937401 kubelet[2506]: E1216 16:21:46.936923 2506 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-bfhb9.gb1.brightbox.com\" not found" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.084722 kubelet[2506]: I1216 16:21:47.084669 2506 kubelet_node_status.go:78] "Successfully registered node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.097006 kubelet[2506]: E1216 16:21:47.096737 2506 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{srv-bfhb9.gb1.brightbox.com.1881be9b2854895c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-bfhb9.gb1.brightbox.com,UID:srv-bfhb9.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-bfhb9.gb1.brightbox.com,},FirstTimestamp:2025-12-16 16:21:42.511782236 +0000 UTC m=+0.698648320,LastTimestamp:2025-12-16 16:21:42.511782236 +0000 UTC m=+0.698648320,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-bfhb9.gb1.brightbox.com,}"
Dec 16 16:21:47.135937 kubelet[2506]: I1216 16:21:47.135876 2506 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.146276 kubelet[2506]: E1216 16:21:47.145868 2506 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.146276 kubelet[2506]: I1216 16:21:47.145925 2506 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.148897 kubelet[2506]: E1216 16:21:47.148861 2506 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.148897 kubelet[2506]: I1216 16:21:47.148895 2506 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.150659 kubelet[2506]: E1216 16:21:47.150629 2506 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-bfhb9.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.502549 kubelet[2506]: I1216 16:21:47.502476 2506 apiserver.go:52] "Watching apiserver"
Dec 16 16:21:47.540253 kubelet[2506]: I1216 16:21:47.540185 2506 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 16 16:21:47.632058 kubelet[2506]: I1216 16:21:47.632013 2506 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:47.634992 kubelet[2506]: E1216 16:21:47.634952 2506 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:48.841275 systemd[1]: Reload requested from client PID 2776 ('systemctl') (unit session-11.scope)...
Dec 16 16:21:48.841301 systemd[1]: Reloading...
Dec 16 16:21:48.991109 zram_generator::config[2823]: No configuration found.
Dec 16 16:21:49.430893 systemd[1]: Reloading finished in 588 ms.
Dec 16 16:21:49.481909 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:49.496821 systemd[1]: kubelet.service: Deactivated successfully.
Dec 16 16:21:49.497206 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:49.497282 systemd[1]: kubelet.service: Consumed 1.245s CPU time, 128.1M memory peak.
Dec 16 16:21:49.500864 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 16:21:49.721922 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 16:21:49.738045 (kubelet)[2885]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 16 16:21:49.880963 kubelet[2885]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 16:21:49.880963 kubelet[2885]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 16 16:21:49.880963 kubelet[2885]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 16:21:49.883827 kubelet[2885]: I1216 16:21:49.881988 2885 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 16:21:49.896403 kubelet[2885]: I1216 16:21:49.895760 2885 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Dec 16 16:21:49.896403 kubelet[2885]: I1216 16:21:49.895793 2885 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 16:21:49.896403 kubelet[2885]: I1216 16:21:49.896192 2885 server.go:954] "Client rotation is on, will bootstrap in background"
Dec 16 16:21:49.899909 kubelet[2885]: I1216 16:21:49.899223 2885 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 16 16:21:49.909394 kubelet[2885]: I1216 16:21:49.909335 2885 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 16:21:49.920306 kubelet[2885]: I1216 16:21:49.920269 2885 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 16:21:49.932380 kubelet[2885]: I1216 16:21:49.932344 2885 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 16 16:21:49.934746 kubelet[2885]: I1216 16:21:49.933527 2885 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 16:21:49.935065 kubelet[2885]: I1216 16:21:49.934738 2885 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-bfhb9.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 16:21:49.935065 kubelet[2885]: I1216 16:21:49.934974 2885 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 16:21:49.935065 kubelet[2885]: I1216 16:21:49.934993 2885 container_manager_linux.go:304] "Creating device plugin manager"
Dec 16 16:21:49.936185 kubelet[2885]: I1216 16:21:49.936159 2885 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 16:21:49.936439 kubelet[2885]: I1216 16:21:49.936418 2885 kubelet.go:446] "Attempting to sync node with API server"
Dec 16 16:21:49.936517 kubelet[2885]: I1216 16:21:49.936454 2885 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 16:21:49.936517 kubelet[2885]: I1216 16:21:49.936488 2885 kubelet.go:352] "Adding apiserver pod source"
Dec 16 16:21:49.936517 kubelet[2885]: I1216 16:21:49.936503 2885 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 16:21:49.944331 kubelet[2885]: I1216 16:21:49.943799 2885 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 16 16:21:49.944448 kubelet[2885]: I1216 16:21:49.944394 2885 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 16 16:21:49.945535 kubelet[2885]: I1216 16:21:49.944953 2885 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 16 16:21:49.945535 kubelet[2885]: I1216 16:21:49.944999 2885 server.go:1287] "Started kubelet"
Dec 16 16:21:49.951593 kubelet[2885]: I1216 16:21:49.951543 2885 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 16:21:49.955450 kubelet[2885]: I1216 16:21:49.955412 2885 server.go:479] "Adding debug handlers to kubelet server"
Dec 16 16:21:49.959811 kubelet[2885]: I1216 16:21:49.959249 2885 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 16:21:49.959811 kubelet[2885]: I1216 16:21:49.959530 2885 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 16:21:49.963551 kubelet[2885]: I1216 16:21:49.963442 2885 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 16:21:49.976557 kubelet[2885]: I1216 16:21:49.976440 2885 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 16 16:21:49.988605 kubelet[2885]: I1216 16:21:49.988547 2885 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 16 16:21:49.989736 kubelet[2885]: E1216 16:21:49.989708 2885 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-bfhb9.gb1.brightbox.com\" not found"
Dec 16 16:21:50.000573 kubelet[2885]: I1216 16:21:50.000538 2885 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 16 16:21:50.002279 kubelet[2885]: I1216 16:21:50.002256 2885 reconciler.go:26] "Reconciler: start to sync state"
Dec 16 16:21:50.004572 kubelet[2885]: I1216 16:21:50.003721 2885 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 16 16:21:50.010118 kubelet[2885]: E1216 16:21:50.009496 2885 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 16 16:21:50.010118 kubelet[2885]: I1216 16:21:50.009866 2885 factory.go:221] Registration of the containerd container factory successfully
Dec 16 16:21:50.010118 kubelet[2885]: I1216 16:21:50.009885 2885 factory.go:221] Registration of the systemd container factory successfully
Dec 16 16:21:50.016126 kubelet[2885]: I1216 16:21:50.015767 2885 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 16 16:21:50.019659 kubelet[2885]: I1216 16:21:50.019634 2885 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 16 16:21:50.019828 kubelet[2885]: I1216 16:21:50.019808 2885 status_manager.go:227] "Starting to sync pod status with apiserver"
Dec 16 16:21:50.019950 kubelet[2885]: I1216 16:21:50.019931 2885 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 16 16:21:50.020065 kubelet[2885]: I1216 16:21:50.020045 2885 kubelet.go:2382] "Starting kubelet main sync loop"
Dec 16 16:21:50.020279 kubelet[2885]: E1216 16:21:50.020249 2885 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114552 2885 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114584 2885 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114612 2885 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114852 2885 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114876 2885 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114910 2885 policy_none.go:49] "None policy: Start"
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114929 2885 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 16 16:21:50.115104 kubelet[2885]: I1216 16:21:50.114951 2885 state_mem.go:35] "Initializing new in-memory state store"
Dec 16 16:21:50.115578 kubelet[2885]: I1216 16:21:50.115221 2885 state_mem.go:75] "Updated machine memory state"
Dec 16 16:21:50.120500 kubelet[2885]: E1216 16:21:50.120419 2885 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 16 16:21:50.134757 kubelet[2885]: I1216 16:21:50.134722 2885 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 16 16:21:50.135948 kubelet[2885]: I1216 16:21:50.135390 2885 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 16:21:50.135948 kubelet[2885]: I1216 16:21:50.135415 2885 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 16:21:50.135948 kubelet[2885]: I1216 16:21:50.135821 2885 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 16:21:50.145762 kubelet[2885]: E1216 16:21:50.144463 2885 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 16 16:21:50.265799 kubelet[2885]: I1216 16:21:50.265025 2885 kubelet_node_status.go:75] "Attempting to register node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.281072 kubelet[2885]: I1216 16:21:50.279915 2885 kubelet_node_status.go:124] "Node was previously registered" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.281072 kubelet[2885]: I1216 16:21:50.280028 2885 kubelet_node_status.go:78] "Successfully registered node" node="srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.323536 kubelet[2885]: I1216 16:21:50.323485 2885 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.329788 kubelet[2885]: I1216 16:21:50.328498 2885 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.330108 kubelet[2885]: I1216 16:21:50.329604 2885 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.336972 kubelet[2885]: W1216 16:21:50.336248 2885 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Dec 16 16:21:50.338715 kubelet[2885]: W1216 16:21:50.338684 2885 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Dec 16 16:21:50.338811 kubelet[2885]: W1216 16:21:50.338786 2885 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Dec 16 16:21:50.405509 kubelet[2885]: I1216 16:21:50.404934 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-flexvolume-dir\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405509 kubelet[2885]: I1216 16:21:50.404996 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-kubeconfig\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405509 kubelet[2885]: I1216 16:21:50.405027 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/589d03a1769cb0a53a5550fb0617ff99-ca-certs\") pod \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" (UID: \"589d03a1769cb0a53a5550fb0617ff99\") " pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405509 kubelet[2885]: I1216 16:21:50.405055 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/589d03a1769cb0a53a5550fb0617ff99-usr-share-ca-certificates\") pod \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" (UID: \"589d03a1769cb0a53a5550fb0617ff99\") " pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405509 kubelet[2885]: I1216 16:21:50.405107 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-ca-certs\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405868 kubelet[2885]: I1216 16:21:50.405135 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-k8s-certs\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405868 kubelet[2885]: I1216 16:21:50.405171 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a3c75e949f853f163415f313e57b649-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-bfhb9.gb1.brightbox.com\" (UID: \"7a3c75e949f853f163415f313e57b649\") " pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405868 kubelet[2885]: I1216 16:21:50.405203 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/554aed2157c57755815399401198e254-kubeconfig\") pod \"kube-scheduler-srv-bfhb9.gb1.brightbox.com\" (UID: \"554aed2157c57755815399401198e254\") " pod="kube-system/kube-scheduler-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.405868 kubelet[2885]: I1216 16:21:50.405240 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/589d03a1769cb0a53a5550fb0617ff99-k8s-certs\") pod \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" (UID: \"589d03a1769cb0a53a5550fb0617ff99\") " pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:50.938378 kubelet[2885]: I1216 16:21:50.937118 2885 apiserver.go:52] "Watching apiserver"
Dec 16 16:21:51.001798 kubelet[2885]: I1216 16:21:51.001727 2885 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 16 16:21:51.065928 kubelet[2885]: I1216 16:21:51.064601 2885 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:51.076411 kubelet[2885]: W1216 16:21:51.076344 2885 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Dec 16 16:21:51.076725 kubelet[2885]: E1216 16:21:51.076469 2885 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-bfhb9.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com"
Dec 16 16:21:51.093827 kubelet[2885]: I1216 16:21:51.093629 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-bfhb9.gb1.brightbox.com" podStartSLOduration=1.093602298 podStartE2EDuration="1.093602298s" podCreationTimestamp="2025-12-16 16:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:21:51.093547098 +0000 UTC m=+1.300494330" watchObservedRunningTime="2025-12-16 16:21:51.093602298 +0000 UTC m=+1.300549511"
Dec 16 16:21:51.111657 kubelet[2885]: I1216 16:21:51.111232 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-bfhb9.gb1.brightbox.com" podStartSLOduration=1.1112109430000001 podStartE2EDuration="1.111210943s" podCreationTimestamp="2025-12-16 16:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:21:51.109970592 +0000 UTC m=+1.316917827" watchObservedRunningTime="2025-12-16 16:21:51.111210943 +0000 UTC m=+1.318158149"
Dec 16 16:21:51.144148 kubelet[2885]: I1216 16:21:51.143559 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-bfhb9.gb1.brightbox.com" podStartSLOduration=1.143534457 podStartE2EDuration="1.143534457s" podCreationTimestamp="2025-12-16 16:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:21:51.123921246 +0000 UTC m=+1.330868467" watchObservedRunningTime="2025-12-16 16:21:51.143534457 +0000 UTC m=+1.350481674"
Dec 16 16:21:53.674160 kubelet[2885]: I1216 16:21:53.674021 2885 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Dec 16 16:21:53.674824 containerd[1574]: time="2025-12-16T16:21:53.674529144Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Dec 16 16:21:53.677068 kubelet[2885]: I1216 16:21:53.675639 2885 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Dec 16 16:21:54.407702 systemd[1]: Created slice kubepods-besteffort-podbc08ec37_60a8_47e4_b816_be2f1ed76945.slice - libcontainer container kubepods-besteffort-podbc08ec37_60a8_47e4_b816_be2f1ed76945.slice.
Dec 16 16:21:54.431495 kubelet[2885]: I1216 16:21:54.431393 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsp9t\" (UniqueName: \"kubernetes.io/projected/bc08ec37-60a8-47e4-b816-be2f1ed76945-kube-api-access-wsp9t\") pod \"kube-proxy-nhfpn\" (UID: \"bc08ec37-60a8-47e4-b816-be2f1ed76945\") " pod="kube-system/kube-proxy-nhfpn"
Dec 16 16:21:54.431981 kubelet[2885]: I1216 16:21:54.431743 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/bc08ec37-60a8-47e4-b816-be2f1ed76945-kube-proxy\") pod \"kube-proxy-nhfpn\" (UID: \"bc08ec37-60a8-47e4-b816-be2f1ed76945\") " pod="kube-system/kube-proxy-nhfpn"
Dec 16 16:21:54.432264 kubelet[2885]: I1216 16:21:54.431952 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/bc08ec37-60a8-47e4-b816-be2f1ed76945-xtables-lock\") pod \"kube-proxy-nhfpn\" (UID: \"bc08ec37-60a8-47e4-b816-be2f1ed76945\") " pod="kube-system/kube-proxy-nhfpn"
Dec 16 16:21:54.432264 kubelet[2885]: I1216 16:21:54.432209 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc08ec37-60a8-47e4-b816-be2f1ed76945-lib-modules\") pod \"kube-proxy-nhfpn\" (UID: \"bc08ec37-60a8-47e4-b816-be2f1ed76945\") " pod="kube-system/kube-proxy-nhfpn"
Dec 16 16:21:54.723165 containerd[1574]: time="2025-12-16T16:21:54.722732333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nhfpn,Uid:bc08ec37-60a8-47e4-b816-be2f1ed76945,Namespace:kube-system,Attempt:0,}"
Dec 16 16:21:54.770280 containerd[1574]: time="2025-12-16T16:21:54.770197009Z" level=info msg="connecting to shim 49933001dda7352e49e9a88edbb66623eabd08deb7453f34df208a00426f2f5f" address="unix:///run/containerd/s/51846b910d385f36e6c39dbe3538e10203a818b30f7198a08b64ff3684d26858" namespace=k8s.io protocol=ttrpc version=3
Dec 16 16:21:54.840636 systemd[1]: Started cri-containerd-49933001dda7352e49e9a88edbb66623eabd08deb7453f34df208a00426f2f5f.scope - libcontainer container 49933001dda7352e49e9a88edbb66623eabd08deb7453f34df208a00426f2f5f.
Dec 16 16:21:54.857245 kubelet[2885]: I1216 16:21:54.857178 2885 status_manager.go:890] "Failed to get status for pod" podUID="7228d96c-af2d-4a2b-94ab-7da92e4ca121" pod="tigera-operator/tigera-operator-7dcd859c48-jvvxp" err="pods \"tigera-operator-7dcd859c48-jvvxp\" is forbidden: User \"system:node:srv-bfhb9.gb1.brightbox.com\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object"
Dec 16 16:21:54.860795 systemd[1]: Created slice kubepods-besteffort-pod7228d96c_af2d_4a2b_94ab_7da92e4ca121.slice - libcontainer container kubepods-besteffort-pod7228d96c_af2d_4a2b_94ab_7da92e4ca121.slice.
Dec 16 16:21:54.931356 containerd[1574]: time="2025-12-16T16:21:54.931189028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nhfpn,Uid:bc08ec37-60a8-47e4-b816-be2f1ed76945,Namespace:kube-system,Attempt:0,} returns sandbox id \"49933001dda7352e49e9a88edbb66623eabd08deb7453f34df208a00426f2f5f\""
Dec 16 16:21:54.934762 kubelet[2885]: I1216 16:21:54.934618 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7pph\" (UniqueName: \"kubernetes.io/projected/7228d96c-af2d-4a2b-94ab-7da92e4ca121-kube-api-access-t7pph\") pod \"tigera-operator-7dcd859c48-jvvxp\" (UID: \"7228d96c-af2d-4a2b-94ab-7da92e4ca121\") " pod="tigera-operator/tigera-operator-7dcd859c48-jvvxp"
Dec 16 16:21:54.934762 kubelet[2885]: I1216 16:21:54.934680 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7228d96c-af2d-4a2b-94ab-7da92e4ca121-var-lib-calico\") pod \"tigera-operator-7dcd859c48-jvvxp\" (UID: \"7228d96c-af2d-4a2b-94ab-7da92e4ca121\") " pod="tigera-operator/tigera-operator-7dcd859c48-jvvxp"
Dec 16 16:21:54.937168 containerd[1574]: time="2025-12-16T16:21:54.937121133Z" level=info msg="CreateContainer within sandbox \"49933001dda7352e49e9a88edbb66623eabd08deb7453f34df208a00426f2f5f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Dec 16 16:21:54.972934 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2176617027.mount: Deactivated successfully.
Dec 16 16:21:54.978172 containerd[1574]: time="2025-12-16T16:21:54.974161242Z" level=info msg="Container 81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:21:54.992550 containerd[1574]: time="2025-12-16T16:21:54.992458427Z" level=info msg="CreateContainer within sandbox \"49933001dda7352e49e9a88edbb66623eabd08deb7453f34df208a00426f2f5f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b\"" Dec 16 16:21:54.994058 containerd[1574]: time="2025-12-16T16:21:54.993934101Z" level=info msg="StartContainer for \"81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b\"" Dec 16 16:21:55.001276 containerd[1574]: time="2025-12-16T16:21:55.001222067Z" level=info msg="connecting to shim 81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b" address="unix:///run/containerd/s/51846b910d385f36e6c39dbe3538e10203a818b30f7198a08b64ff3684d26858" protocol=ttrpc version=3 Dec 16 16:21:55.036743 systemd[1]: Started cri-containerd-81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b.scope - libcontainer container 81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b. 
Dec 16 16:21:55.156599 containerd[1574]: time="2025-12-16T16:21:55.156548842Z" level=info msg="StartContainer for \"81f4958f5a56ecb8a7727dfb698c2bc1f875b1c3e5acb2af17e9ea65a53bc73b\" returns successfully" Dec 16 16:21:55.168680 containerd[1574]: time="2025-12-16T16:21:55.168565350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-jvvxp,Uid:7228d96c-af2d-4a2b-94ab-7da92e4ca121,Namespace:tigera-operator,Attempt:0,}" Dec 16 16:21:55.195716 containerd[1574]: time="2025-12-16T16:21:55.195070568Z" level=info msg="connecting to shim 98063dd8e1d9fa521fb2a3f490ce7aaf58954b00b409ef8e8cec508d00cc1e43" address="unix:///run/containerd/s/83a1759eed0385b12a929809c9060f5f1c040de1ef509dfcb0bd7ac5492b7d59" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:21:55.237294 systemd[1]: Started cri-containerd-98063dd8e1d9fa521fb2a3f490ce7aaf58954b00b409ef8e8cec508d00cc1e43.scope - libcontainer container 98063dd8e1d9fa521fb2a3f490ce7aaf58954b00b409ef8e8cec508d00cc1e43. Dec 16 16:21:55.351157 containerd[1574]: time="2025-12-16T16:21:55.351072615Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-jvvxp,Uid:7228d96c-af2d-4a2b-94ab-7da92e4ca121,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"98063dd8e1d9fa521fb2a3f490ce7aaf58954b00b409ef8e8cec508d00cc1e43\"" Dec 16 16:21:55.354588 containerd[1574]: time="2025-12-16T16:21:55.354492831Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 16:21:56.113030 kubelet[2885]: I1216 16:21:56.112944 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-nhfpn" podStartSLOduration=2.112923352 podStartE2EDuration="2.112923352s" podCreationTimestamp="2025-12-16 16:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:21:56.101936906 +0000 UTC m=+6.308884125" watchObservedRunningTime="2025-12-16 
16:21:56.112923352 +0000 UTC m=+6.319870574" Dec 16 16:21:58.175239 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3181088864.mount: Deactivated successfully. Dec 16 16:21:59.286861 containerd[1574]: time="2025-12-16T16:21:59.285779164Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:21:59.286861 containerd[1574]: time="2025-12-16T16:21:59.286815298Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691" Dec 16 16:21:59.287837 containerd[1574]: time="2025-12-16T16:21:59.287800053Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:21:59.290703 containerd[1574]: time="2025-12-16T16:21:59.290642347Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:21:59.293385 containerd[1574]: time="2025-12-16T16:21:59.293299842Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 3.938557999s" Dec 16 16:21:59.293385 containerd[1574]: time="2025-12-16T16:21:59.293341856Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\"" Dec 16 16:21:59.298702 containerd[1574]: time="2025-12-16T16:21:59.298662156Z" level=info msg="CreateContainer within sandbox \"98063dd8e1d9fa521fb2a3f490ce7aaf58954b00b409ef8e8cec508d00cc1e43\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 16:21:59.307402 containerd[1574]: time="2025-12-16T16:21:59.306739342Z" level=info msg="Container 3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:21:59.314188 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4040717434.mount: Deactivated successfully. Dec 16 16:21:59.316744 containerd[1574]: time="2025-12-16T16:21:59.316601855Z" level=info msg="CreateContainer within sandbox \"98063dd8e1d9fa521fb2a3f490ce7aaf58954b00b409ef8e8cec508d00cc1e43\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d\"" Dec 16 16:21:59.317690 containerd[1574]: time="2025-12-16T16:21:59.317654522Z" level=info msg="StartContainer for \"3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d\"" Dec 16 16:21:59.319328 containerd[1574]: time="2025-12-16T16:21:59.319215161Z" level=info msg="connecting to shim 3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d" address="unix:///run/containerd/s/83a1759eed0385b12a929809c9060f5f1c040de1ef509dfcb0bd7ac5492b7d59" protocol=ttrpc version=3 Dec 16 16:21:59.353312 systemd[1]: Started cri-containerd-3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d.scope - libcontainer container 3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d. 
Dec 16 16:21:59.407889 containerd[1574]: time="2025-12-16T16:21:59.407832658Z" level=info msg="StartContainer for \"3c7834f468b152a4e0b34417fb8b5d225bac15f28e2083787fced195049e6e4d\" returns successfully" Dec 16 16:22:00.141268 kubelet[2885]: I1216 16:22:00.141103 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-jvvxp" podStartSLOduration=2.199091675 podStartE2EDuration="6.140954184s" podCreationTimestamp="2025-12-16 16:21:54 +0000 UTC" firstStartedPulling="2025-12-16 16:21:55.353838409 +0000 UTC m=+5.560785614" lastFinishedPulling="2025-12-16 16:21:59.295700918 +0000 UTC m=+9.502648123" observedRunningTime="2025-12-16 16:22:00.115937815 +0000 UTC m=+10.322885037" watchObservedRunningTime="2025-12-16 16:22:00.140954184 +0000 UTC m=+10.347901396" Dec 16 16:22:06.759326 sudo[1899]: pam_unix(sudo:session): session closed for user root Dec 16 16:22:06.908135 sshd[1898]: Connection closed by 139.178.68.195 port 34342 Dec 16 16:22:06.907778 sshd-session[1895]: pam_unix(sshd:session): session closed for user core Dec 16 16:22:06.921132 systemd[1]: sshd@8-10.244.29.226:22-139.178.68.195:34342.service: Deactivated successfully. Dec 16 16:22:06.925680 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 16:22:06.926621 systemd[1]: session-11.scope: Consumed 7.304s CPU time, 159.6M memory peak. Dec 16 16:22:06.930866 systemd-logind[1549]: Session 11 logged out. Waiting for processes to exit. Dec 16 16:22:06.934645 systemd-logind[1549]: Removed session 11. Dec 16 16:22:12.116776 systemd[1]: Created slice kubepods-besteffort-podbd7c91d9_5b59_4110_92e2_ba57456a5c2f.slice - libcontainer container kubepods-besteffort-podbd7c91d9_5b59_4110_92e2_ba57456a5c2f.slice. 
Dec 16 16:22:12.241466 kubelet[2885]: I1216 16:22:12.241271 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bd7c91d9-5b59-4110-92e2-ba57456a5c2f-typha-certs\") pod \"calico-typha-669867fb68-9z7rt\" (UID: \"bd7c91d9-5b59-4110-92e2-ba57456a5c2f\") " pod="calico-system/calico-typha-669867fb68-9z7rt" Dec 16 16:22:12.241466 kubelet[2885]: I1216 16:22:12.241347 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd7c91d9-5b59-4110-92e2-ba57456a5c2f-tigera-ca-bundle\") pod \"calico-typha-669867fb68-9z7rt\" (UID: \"bd7c91d9-5b59-4110-92e2-ba57456a5c2f\") " pod="calico-system/calico-typha-669867fb68-9z7rt" Dec 16 16:22:12.241466 kubelet[2885]: I1216 16:22:12.241380 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jmnd\" (UniqueName: \"kubernetes.io/projected/bd7c91d9-5b59-4110-92e2-ba57456a5c2f-kube-api-access-8jmnd\") pod \"calico-typha-669867fb68-9z7rt\" (UID: \"bd7c91d9-5b59-4110-92e2-ba57456a5c2f\") " pod="calico-system/calico-typha-669867fb68-9z7rt" Dec 16 16:22:12.349271 systemd[1]: Created slice kubepods-besteffort-pod10d00c49_0c56_422a_afe0_c3efa0c1269e.slice - libcontainer container kubepods-besteffort-pod10d00c49_0c56_422a_afe0_c3efa0c1269e.slice. 
Dec 16 16:22:12.429552 containerd[1574]: time="2025-12-16T16:22:12.429490384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-669867fb68-9z7rt,Uid:bd7c91d9-5b59-4110-92e2-ba57456a5c2f,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:12.442809 kubelet[2885]: I1216 16:22:12.442187 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-lib-modules\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.442809 kubelet[2885]: I1216 16:22:12.442252 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-var-lib-calico\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.442809 kubelet[2885]: I1216 16:22:12.442287 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-policysync\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.442809 kubelet[2885]: I1216 16:22:12.442317 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-flexvol-driver-host\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.442809 kubelet[2885]: I1216 16:22:12.442351 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-xtables-lock\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443130 kubelet[2885]: I1216 16:22:12.442390 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/10d00c49-0c56-422a-afe0-c3efa0c1269e-node-certs\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443130 kubelet[2885]: I1216 16:22:12.442418 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-cni-log-dir\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443130 kubelet[2885]: I1216 16:22:12.442446 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-cni-bin-dir\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443130 kubelet[2885]: I1216 16:22:12.442471 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-cni-net-dir\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443130 kubelet[2885]: I1216 16:22:12.442496 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d00c49-0c56-422a-afe0-c3efa0c1269e-tigera-ca-bundle\") pod 
\"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443346 kubelet[2885]: I1216 16:22:12.442521 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/10d00c49-0c56-422a-afe0-c3efa0c1269e-var-run-calico\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.443346 kubelet[2885]: I1216 16:22:12.442548 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6n4j\" (UniqueName: \"kubernetes.io/projected/10d00c49-0c56-422a-afe0-c3efa0c1269e-kube-api-access-r6n4j\") pod \"calico-node-vm24s\" (UID: \"10d00c49-0c56-422a-afe0-c3efa0c1269e\") " pod="calico-system/calico-node-vm24s" Dec 16 16:22:12.499110 containerd[1574]: time="2025-12-16T16:22:12.498765536Z" level=info msg="connecting to shim d21d96274bc526d4916751b237b8d3a14e2b4582f32783ffcac23543479f9eb7" address="unix:///run/containerd/s/cec9e95b09d9a783e8fe7cb7f5a21fc40b8e555395b3c239f7e40b4b7ec6324a" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:12.544329 systemd[1]: Started cri-containerd-d21d96274bc526d4916751b237b8d3a14e2b4582f32783ffcac23543479f9eb7.scope - libcontainer container d21d96274bc526d4916751b237b8d3a14e2b4582f32783ffcac23543479f9eb7. 
Dec 16 16:22:12.554320 kubelet[2885]: E1216 16:22:12.554186 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.554320 kubelet[2885]: W1216 16:22:12.554218 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.555509 kubelet[2885]: E1216 16:22:12.554935 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.555509 kubelet[2885]: W1216 16:22:12.554978 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.555509 kubelet[2885]: E1216 16:22:12.555216 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.556240 kubelet[2885]: E1216 16:22:12.556181 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.557134 kubelet[2885]: E1216 16:22:12.557106 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.557134 kubelet[2885]: W1216 16:22:12.557130 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.558737 kubelet[2885]: E1216 16:22:12.557154 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.558737 kubelet[2885]: E1216 16:22:12.558581 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.558737 kubelet[2885]: W1216 16:22:12.558647 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.558737 kubelet[2885]: E1216 16:22:12.558664 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.559165 kubelet[2885]: E1216 16:22:12.559118 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.559165 kubelet[2885]: W1216 16:22:12.559139 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.559451 kubelet[2885]: E1216 16:22:12.559320 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.559648 kubelet[2885]: E1216 16:22:12.559560 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.559871 kubelet[2885]: W1216 16:22:12.559697 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.559871 kubelet[2885]: E1216 16:22:12.559839 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.560545 kubelet[2885]: E1216 16:22:12.560514 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.561048 kubelet[2885]: W1216 16:22:12.560733 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.561048 kubelet[2885]: E1216 16:22:12.560789 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.561674 kubelet[2885]: E1216 16:22:12.561632 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.562240 kubelet[2885]: W1216 16:22:12.561994 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.562240 kubelet[2885]: E1216 16:22:12.562166 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.563455 kubelet[2885]: E1216 16:22:12.563424 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.563965 kubelet[2885]: W1216 16:22:12.563941 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.564416 kubelet[2885]: E1216 16:22:12.564385 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.565097 kubelet[2885]: E1216 16:22:12.564787 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.565097 kubelet[2885]: W1216 16:22:12.564813 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.565878 kubelet[2885]: E1216 16:22:12.565857 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.565986 kubelet[2885]: W1216 16:22:12.565965 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.566160 kubelet[2885]: E1216 16:22:12.566128 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.567884 kubelet[2885]: E1216 16:22:12.567598 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.568382 kubelet[2885]: E1216 16:22:12.568303 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.568382 kubelet[2885]: W1216 16:22:12.568323 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.568382 kubelet[2885]: E1216 16:22:12.568341 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.569806 kubelet[2885]: E1216 16:22:12.569731 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.569806 kubelet[2885]: W1216 16:22:12.569752 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.569806 kubelet[2885]: E1216 16:22:12.569769 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.578102 kubelet[2885]: E1216 16:22:12.573053 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.578102 kubelet[2885]: W1216 16:22:12.573123 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.578102 kubelet[2885]: E1216 16:22:12.573154 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.588947 kubelet[2885]: E1216 16:22:12.588876 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:12.606289 kubelet[2885]: E1216 16:22:12.605639 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.606289 kubelet[2885]: W1216 16:22:12.606281 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.606684 kubelet[2885]: E1216 16:22:12.606329 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.653119 kubelet[2885]: E1216 16:22:12.652771 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.653119 kubelet[2885]: W1216 16:22:12.652802 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.653119 kubelet[2885]: E1216 16:22:12.652956 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.654109 kubelet[2885]: E1216 16:22:12.653498 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.654109 kubelet[2885]: W1216 16:22:12.653556 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.654109 kubelet[2885]: E1216 16:22:12.653577 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.654109 kubelet[2885]: E1216 16:22:12.653970 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.654109 kubelet[2885]: W1216 16:22:12.653983 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.654109 kubelet[2885]: E1216 16:22:12.653998 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.655026 kubelet[2885]: E1216 16:22:12.654993 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.655134 kubelet[2885]: W1216 16:22:12.655015 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.655134 kubelet[2885]: E1216 16:22:12.655113 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.655876 kubelet[2885]: E1216 16:22:12.655767 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.655876 kubelet[2885]: W1216 16:22:12.655823 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.655876 kubelet[2885]: E1216 16:22:12.655874 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.657432 kubelet[2885]: E1216 16:22:12.656271 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.657432 kubelet[2885]: W1216 16:22:12.656322 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.657432 kubelet[2885]: E1216 16:22:12.656340 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Dec 16 16:22:12.657432 kubelet[2885]: E1216 16:22:12.656753 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 16:22:12.657432 kubelet[2885]: W1216 16:22:12.656766 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 16:22:12.657432 kubelet[2885]: E1216 16:22:12.656818 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 16:22:12.676987 containerd[1574]: time="2025-12-16T16:22:12.676924854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vm24s,Uid:10d00c49-0c56-422a-afe0-c3efa0c1269e,Namespace:calico-system,Attempt:0,}"
Dec 16 16:22:12.728702 containerd[1574]: time="2025-12-16T16:22:12.724886967Z" level=info msg="connecting to shim 22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f" address="unix:///run/containerd/s/7fc8168771acdadc103dc1dde30da3e76b2c95d04b87ad11eab00fc2993d55e1" namespace=k8s.io protocol=ttrpc version=3
Dec 16 16:22:12.770645 kubelet[2885]: I1216 16:22:12.770599 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79791809-4f39-420a-be3e-00a912b46628-registration-dir\") pod \"csi-node-driver-44s4j\" (UID: \"79791809-4f39-420a-be3e-00a912b46628\") " pod="calico-system/csi-node-driver-44s4j"
Dec 16 16:22:12.772719 kubelet[2885]: I1216 16:22:12.771809 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79791809-4f39-420a-be3e-00a912b46628-kubelet-dir\") pod \"csi-node-driver-44s4j\" (UID: \"79791809-4f39-420a-be3e-00a912b46628\") " pod="calico-system/csi-node-driver-44s4j"
Dec 16 16:22:12.772719 kubelet[2885]: I1216 16:22:12.772508 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79791809-4f39-420a-be3e-00a912b46628-socket-dir\") pod \"csi-node-driver-44s4j\" (UID: \"79791809-4f39-420a-be3e-00a912b46628\") " pod="calico-system/csi-node-driver-44s4j"
Dec 16 16:22:12.775414 kubelet[2885]: I1216 16:22:12.775325 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/79791809-4f39-420a-be3e-00a912b46628-varrun\") pod \"csi-node-driver-44s4j\" (UID: \"79791809-4f39-420a-be3e-00a912b46628\") " pod="calico-system/csi-node-driver-44s4j"
Dec 16 16:22:12.776357 kubelet[2885]: I1216 16:22:12.776335 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kfls\" (UniqueName: \"kubernetes.io/projected/79791809-4f39-420a-be3e-00a912b46628-kube-api-access-8kfls\") pod \"csi-node-driver-44s4j\" (UID: \"79791809-4f39-420a-be3e-00a912b46628\") " pod="calico-system/csi-node-driver-44s4j"
Dec 16 16:22:12.803338 systemd[1]: Started cri-containerd-22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f.scope - libcontainer container 22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f.
Dec 16 16:22:12.826653 containerd[1574]: time="2025-12-16T16:22:12.826529939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-669867fb68-9z7rt,Uid:bd7c91d9-5b59-4110-92e2-ba57456a5c2f,Namespace:calico-system,Attempt:0,} returns sandbox id \"d21d96274bc526d4916751b237b8d3a14e2b4582f32783ffcac23543479f9eb7\""
Dec 16 16:22:12.830629 containerd[1574]: time="2025-12-16T16:22:12.830201561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 16 16:22:12.906758 kubelet[2885]: E1216 16:22:12.906727 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:12.907472 kubelet[2885]: E1216 16:22:12.907070 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.907472 kubelet[2885]: W1216 16:22:12.907114 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.907472 kubelet[2885]: E1216 16:22:12.907145 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:12.922725 containerd[1574]: time="2025-12-16T16:22:12.922679202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vm24s,Uid:10d00c49-0c56-422a-afe0-c3efa0c1269e,Namespace:calico-system,Attempt:0,} returns sandbox id \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\"" Dec 16 16:22:12.924490 kubelet[2885]: E1216 16:22:12.924460 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:12.924490 kubelet[2885]: W1216 16:22:12.924488 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:12.924615 kubelet[2885]: E1216 16:22:12.924512 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:14.024561 kubelet[2885]: E1216 16:22:14.024161 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:14.504697 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount919048238.mount: Deactivated successfully. Dec 16 16:22:16.021349 kubelet[2885]: E1216 16:22:16.021301 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:16.157311 containerd[1574]: time="2025-12-16T16:22:16.157255519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:16.174471 containerd[1574]: time="2025-12-16T16:22:16.174338200Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Dec 16 16:22:16.182583 containerd[1574]: time="2025-12-16T16:22:16.182502898Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:16.197904 containerd[1574]: time="2025-12-16T16:22:16.197537379Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:16.208459 containerd[1574]: time="2025-12-16T16:22:16.207827262Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.377556716s" Dec 16 16:22:16.208459 containerd[1574]: time="2025-12-16T16:22:16.207887395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Dec 16 16:22:16.209714 containerd[1574]: time="2025-12-16T16:22:16.209664666Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 16:22:16.234100 containerd[1574]: time="2025-12-16T16:22:16.234016606Z" level=info msg="CreateContainer within sandbox \"d21d96274bc526d4916751b237b8d3a14e2b4582f32783ffcac23543479f9eb7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 16:22:16.292397 containerd[1574]: time="2025-12-16T16:22:16.292267370Z" level=info msg="Container 77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:22:16.299044 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3747794136.mount: Deactivated successfully. 
Dec 16 16:22:16.329390 containerd[1574]: time="2025-12-16T16:22:16.329326165Z" level=info msg="CreateContainer within sandbox \"d21d96274bc526d4916751b237b8d3a14e2b4582f32783ffcac23543479f9eb7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79\"" Dec 16 16:22:16.330443 containerd[1574]: time="2025-12-16T16:22:16.330168991Z" level=info msg="StartContainer for \"77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79\"" Dec 16 16:22:16.331664 containerd[1574]: time="2025-12-16T16:22:16.331609319Z" level=info msg="connecting to shim 77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79" address="unix:///run/containerd/s/cec9e95b09d9a783e8fe7cb7f5a21fc40b8e555395b3c239f7e40b4b7ec6324a" protocol=ttrpc version=3 Dec 16 16:22:16.370316 systemd[1]: Started cri-containerd-77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79.scope - libcontainer container 77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79. 
Dec 16 16:22:16.471687 containerd[1574]: time="2025-12-16T16:22:16.471639975Z" level=info msg="StartContainer for \"77384c7cb2269a89a24e0dc9afb01990d629cae904934bb832c237b3f4e82c79\" returns successfully" Dec 16 16:22:17.203302 kubelet[2885]: E1216 16:22:17.203044 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.203302 kubelet[2885]: W1216 16:22:17.203127 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.205850 kubelet[2885]: I1216 16:22:17.204556 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-669867fb68-9z7rt" podStartSLOduration=1.8171889430000001 podStartE2EDuration="5.196766319s" podCreationTimestamp="2025-12-16 16:22:12 +0000 UTC" firstStartedPulling="2025-12-16 16:22:12.829587434 +0000 UTC m=+23.036534639" lastFinishedPulling="2025-12-16 16:22:16.209164798 +0000 UTC m=+26.416112015" observedRunningTime="2025-12-16 16:22:17.193627587 +0000 UTC m=+27.400574808" watchObservedRunningTime="2025-12-16 16:22:17.196766319 +0000 UTC m=+27.403713526" Dec 16 16:22:17.215606 kubelet[2885]: E1216 16:22:17.215529 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.217727 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.219232 kubelet[2885]: W1216 16:22:17.217761 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.217787 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.218062 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.219232 kubelet[2885]: W1216 16:22:17.218094 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.218112 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.218434 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.219232 kubelet[2885]: W1216 16:22:17.218448 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.218463 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.219232 kubelet[2885]: E1216 16:22:17.218919 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.219944 kubelet[2885]: W1216 16:22:17.218942 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.219944 kubelet[2885]: E1216 16:22:17.218966 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.219944 kubelet[2885]: E1216 16:22:17.219320 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.219944 kubelet[2885]: W1216 16:22:17.219334 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.219944 kubelet[2885]: E1216 16:22:17.219349 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.219944 kubelet[2885]: E1216 16:22:17.219618 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.219944 kubelet[2885]: W1216 16:22:17.219632 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.219944 kubelet[2885]: E1216 16:22:17.219646 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.219944 kubelet[2885]: E1216 16:22:17.219947 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.225247 kubelet[2885]: W1216 16:22:17.219961 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.225247 kubelet[2885]: E1216 16:22:17.219975 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.225247 kubelet[2885]: E1216 16:22:17.220404 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.225247 kubelet[2885]: W1216 16:22:17.220418 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.225247 kubelet[2885]: E1216 16:22:17.220433 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.225247 kubelet[2885]: E1216 16:22:17.220720 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.225247 kubelet[2885]: W1216 16:22:17.220735 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.225247 kubelet[2885]: E1216 16:22:17.220756 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.225247 kubelet[2885]: E1216 16:22:17.221008 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.225247 kubelet[2885]: W1216 16:22:17.221020 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.221035 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.221338 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.226541 kubelet[2885]: W1216 16:22:17.221360 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.221375 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.222479 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.226541 kubelet[2885]: W1216 16:22:17.222620 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.222638 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.222935 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.226541 kubelet[2885]: W1216 16:22:17.222950 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.226541 kubelet[2885]: E1216 16:22:17.222964 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.225210 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.228589 kubelet[2885]: W1216 16:22:17.225226 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.225242 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.226563 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.228589 kubelet[2885]: W1216 16:22:17.226580 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.226596 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.227447 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.228589 kubelet[2885]: W1216 16:22:17.227461 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.227483 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.228589 kubelet[2885]: E1216 16:22:17.228167 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.229001 kubelet[2885]: W1216 16:22:17.228192 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.229001 kubelet[2885]: E1216 16:22:17.228218 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.230473 kubelet[2885]: E1216 16:22:17.230143 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.230730 kubelet[2885]: W1216 16:22:17.230165 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.230730 kubelet[2885]: E1216 16:22:17.230678 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.231884 kubelet[2885]: E1216 16:22:17.231860 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.232207 kubelet[2885]: W1216 16:22:17.231932 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.232207 kubelet[2885]: E1216 16:22:17.231963 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.232648 kubelet[2885]: E1216 16:22:17.232261 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.232648 kubelet[2885]: W1216 16:22:17.232275 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.232648 kubelet[2885]: E1216 16:22:17.232290 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.233199 kubelet[2885]: E1216 16:22:17.233121 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.233199 kubelet[2885]: W1216 16:22:17.233140 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.233685 kubelet[2885]: E1216 16:22:17.233402 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.233685 kubelet[2885]: E1216 16:22:17.233496 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.233685 kubelet[2885]: W1216 16:22:17.233509 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.233685 kubelet[2885]: E1216 16:22:17.233599 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.234794 kubelet[2885]: E1216 16:22:17.234216 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.234794 kubelet[2885]: W1216 16:22:17.234234 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.234921 kubelet[2885]: E1216 16:22:17.234825 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.235523 kubelet[2885]: E1216 16:22:17.235498 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.235523 kubelet[2885]: W1216 16:22:17.235523 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.235791 kubelet[2885]: E1216 16:22:17.235674 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.236403 kubelet[2885]: E1216 16:22:17.236361 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.236403 kubelet[2885]: W1216 16:22:17.236383 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.236549 kubelet[2885]: E1216 16:22:17.236520 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.237705 kubelet[2885]: E1216 16:22:17.237613 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.237705 kubelet[2885]: W1216 16:22:17.237677 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.237705 kubelet[2885]: E1216 16:22:17.237697 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.238183 kubelet[2885]: E1216 16:22:17.238160 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.238183 kubelet[2885]: W1216 16:22:17.238179 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.238667 kubelet[2885]: E1216 16:22:17.238339 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.239959 kubelet[2885]: E1216 16:22:17.239915 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.239959 kubelet[2885]: W1216 16:22:17.239937 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.239959 kubelet[2885]: E1216 16:22:17.239954 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.241923 kubelet[2885]: E1216 16:22:17.241852 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.241923 kubelet[2885]: W1216 16:22:17.241910 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.245882 kubelet[2885]: E1216 16:22:17.241928 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.245882 kubelet[2885]: E1216 16:22:17.243344 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.245882 kubelet[2885]: W1216 16:22:17.243359 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.245882 kubelet[2885]: E1216 16:22:17.243388 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:17.245882 kubelet[2885]: E1216 16:22:17.245119 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.245882 kubelet[2885]: W1216 16:22:17.245135 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.245882 kubelet[2885]: E1216 16:22:17.245151 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:17.248592 kubelet[2885]: E1216 16:22:17.248563 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:17.249128 kubelet[2885]: W1216 16:22:17.248708 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:17.249128 kubelet[2885]: E1216 16:22:17.248748 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.024553 kubelet[2885]: E1216 16:22:18.024441 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:18.195980 containerd[1574]: time="2025-12-16T16:22:18.194915915Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:18.195980 containerd[1574]: time="2025-12-16T16:22:18.195921706Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 16 16:22:18.197072 containerd[1574]: time="2025-12-16T16:22:18.197032509Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:18.199680 containerd[1574]: time="2025-12-16T16:22:18.199641323Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:18.200773 containerd[1574]: time="2025-12-16T16:22:18.200729858Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.990905247s" Dec 16 16:22:18.200863 containerd[1574]: time="2025-12-16T16:22:18.200772865Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 16:22:18.205263 containerd[1574]: time="2025-12-16T16:22:18.205222839Z" level=info msg="CreateContainer within sandbox \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 16:22:18.220097 containerd[1574]: time="2025-12-16T16:22:18.219340964Z" level=info msg="Container 7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:22:18.231665 kubelet[2885]: E1216 16:22:18.231605 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.231665 kubelet[2885]: W1216 16:22:18.231632 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.231665 kubelet[2885]: E1216 16:22:18.231657 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.232694 kubelet[2885]: E1216 16:22:18.232531 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.232694 kubelet[2885]: W1216 16:22:18.232563 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.232694 kubelet[2885]: E1216 16:22:18.232583 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.233415 kubelet[2885]: E1216 16:22:18.233395 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.233747 kubelet[2885]: W1216 16:22:18.233675 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.233747 kubelet[2885]: E1216 16:22:18.233702 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.234805 kubelet[2885]: E1216 16:22:18.234702 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.234805 kubelet[2885]: W1216 16:22:18.234721 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.234805 kubelet[2885]: E1216 16:22:18.234737 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.235831 kubelet[2885]: E1216 16:22:18.235795 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.236056 kubelet[2885]: W1216 16:22:18.235813 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.236056 kubelet[2885]: E1216 16:22:18.235989 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.237494 kubelet[2885]: E1216 16:22:18.237397 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.237494 kubelet[2885]: W1216 16:22:18.237469 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.237849 kubelet[2885]: E1216 16:22:18.237688 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.238282 kubelet[2885]: E1216 16:22:18.238238 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.238495 kubelet[2885]: W1216 16:22:18.238256 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.238495 kubelet[2885]: E1216 16:22:18.238427 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.239052 kubelet[2885]: E1216 16:22:18.239004 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.239052 kubelet[2885]: W1216 16:22:18.239022 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.239569 kubelet[2885]: E1216 16:22:18.239267 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.239925 kubelet[2885]: E1216 16:22:18.239867 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.239925 kubelet[2885]: W1216 16:22:18.239885 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.240301 kubelet[2885]: E1216 16:22:18.240021 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.240704 kubelet[2885]: E1216 16:22:18.240666 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.240887 kubelet[2885]: W1216 16:22:18.240804 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.240887 kubelet[2885]: E1216 16:22:18.240824 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.241481 kubelet[2885]: E1216 16:22:18.241398 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.241481 kubelet[2885]: W1216 16:22:18.241416 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.241481 kubelet[2885]: E1216 16:22:18.241431 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.242219 kubelet[2885]: E1216 16:22:18.242156 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.242219 kubelet[2885]: W1216 16:22:18.242174 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.242444 kubelet[2885]: E1216 16:22:18.242318 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.243033 kubelet[2885]: E1216 16:22:18.242963 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.243033 kubelet[2885]: W1216 16:22:18.242983 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.243033 kubelet[2885]: E1216 16:22:18.242998 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.243406 containerd[1574]: time="2025-12-16T16:22:18.243276358Z" level=info msg="CreateContainer within sandbox \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3\"" Dec 16 16:22:18.244254 kubelet[2885]: E1216 16:22:18.244015 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.244254 kubelet[2885]: W1216 16:22:18.244177 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.244254 kubelet[2885]: E1216 16:22:18.244193 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.244872 containerd[1574]: time="2025-12-16T16:22:18.244630971Z" level=info msg="StartContainer for \"7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3\"" Dec 16 16:22:18.245763 kubelet[2885]: E1216 16:22:18.245686 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.245763 kubelet[2885]: W1216 16:22:18.245704 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.245763 kubelet[2885]: E1216 16:22:18.245721 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.246785 kubelet[2885]: E1216 16:22:18.246722 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.246785 kubelet[2885]: W1216 16:22:18.246741 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.247153 kubelet[2885]: E1216 16:22:18.246757 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.248633 kubelet[2885]: E1216 16:22:18.248583 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.248971 kubelet[2885]: W1216 16:22:18.248747 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.248971 kubelet[2885]: E1216 16:22:18.248913 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.249711 kubelet[2885]: E1216 16:22:18.249639 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.249711 kubelet[2885]: W1216 16:22:18.249658 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.249834 kubelet[2885]: E1216 16:22:18.249710 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.250660 kubelet[2885]: E1216 16:22:18.250608 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.250914 kubelet[2885]: W1216 16:22:18.250842 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.250982 kubelet[2885]: E1216 16:22:18.250916 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.251659 kubelet[2885]: E1216 16:22:18.251619 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.251931 kubelet[2885]: W1216 16:22:18.251637 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.252293 kubelet[2885]: E1216 16:22:18.251916 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.252719 kubelet[2885]: E1216 16:22:18.252661 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.252719 kubelet[2885]: W1216 16:22:18.252681 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.252979 kubelet[2885]: E1216 16:22:18.252838 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.253058 kubelet[2885]: E1216 16:22:18.253042 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.253370 kubelet[2885]: W1216 16:22:18.253058 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.253370 kubelet[2885]: E1216 16:22:18.253157 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.253521 kubelet[2885]: E1216 16:22:18.253393 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.253521 kubelet[2885]: W1216 16:22:18.253407 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.253991 kubelet[2885]: E1216 16:22:18.253648 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.253991 kubelet[2885]: E1216 16:22:18.253684 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.253991 kubelet[2885]: W1216 16:22:18.253699 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.253991 kubelet[2885]: E1216 16:22:18.253724 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.256473 kubelet[2885]: E1216 16:22:18.254040 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.256473 kubelet[2885]: W1216 16:22:18.254054 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.256473 kubelet[2885]: E1216 16:22:18.254118 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.256473 kubelet[2885]: E1216 16:22:18.254449 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.256473 kubelet[2885]: W1216 16:22:18.254464 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.256473 kubelet[2885]: E1216 16:22:18.254486 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.258609 kubelet[2885]: E1216 16:22:18.258493 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.258609 kubelet[2885]: W1216 16:22:18.258513 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.258609 kubelet[2885]: E1216 16:22:18.258561 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.258941 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.260545 kubelet[2885]: W1216 16:22:18.258962 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.258985 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.259295 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.260545 kubelet[2885]: W1216 16:22:18.259320 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.259335 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.259766 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.260545 kubelet[2885]: W1216 16:22:18.259779 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.259794 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.260545 kubelet[2885]: E1216 16:22:18.260012 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.260995 kubelet[2885]: W1216 16:22:18.260024 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.260995 kubelet[2885]: E1216 16:22:18.260038 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.260995 kubelet[2885]: E1216 16:22:18.260599 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.260995 kubelet[2885]: W1216 16:22:18.260612 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.260995 kubelet[2885]: E1216 16:22:18.260628 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 16:22:18.262934 kubelet[2885]: E1216 16:22:18.262910 2885 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 16:22:18.262934 kubelet[2885]: W1216 16:22:18.262928 2885 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 16:22:18.263051 kubelet[2885]: E1216 16:22:18.262944 2885 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 16:22:18.266132 containerd[1574]: time="2025-12-16T16:22:18.254250119Z" level=info msg="connecting to shim 7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3" address="unix:///run/containerd/s/7fc8168771acdadc103dc1dde30da3e76b2c95d04b87ad11eab00fc2993d55e1" protocol=ttrpc version=3 Dec 16 16:22:18.309394 systemd[1]: Started cri-containerd-7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3.scope - libcontainer container 7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3. Dec 16 16:22:18.413741 containerd[1574]: time="2025-12-16T16:22:18.413686623Z" level=info msg="StartContainer for \"7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3\" returns successfully" Dec 16 16:22:18.420122 systemd[1]: cri-containerd-7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3.scope: Deactivated successfully. 
Dec 16 16:22:18.459351 containerd[1574]: time="2025-12-16T16:22:18.459268453Z" level=info msg="received container exit event container_id:\"7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3\" id:\"7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3\" pid:3599 exited_at:{seconds:1765902138 nanos:430935629}" Dec 16 16:22:18.539021 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7ebf070df7bd2d36cc0da6a7c7c6966e8817b1372270bad8ff6813317c5c85c3-rootfs.mount: Deactivated successfully. Dec 16 16:22:19.191911 containerd[1574]: time="2025-12-16T16:22:19.191855510Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 16:22:20.021853 kubelet[2885]: E1216 16:22:20.021393 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:22.022226 kubelet[2885]: E1216 16:22:22.021792 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:24.021493 kubelet[2885]: E1216 16:22:24.021425 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:25.894989 containerd[1574]: time="2025-12-16T16:22:25.894921591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:25.897274 containerd[1574]: time="2025-12-16T16:22:25.897235813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859" Dec 16 16:22:25.898175 containerd[1574]: time="2025-12-16T16:22:25.898068146Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:25.902461 containerd[1574]: time="2025-12-16T16:22:25.902045758Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:25.903215 containerd[1574]: time="2025-12-16T16:22:25.903183029Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 6.711277627s" Dec 16 16:22:25.903342 containerd[1574]: time="2025-12-16T16:22:25.903317027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\"" Dec 16 16:22:25.914020 containerd[1574]: time="2025-12-16T16:22:25.913982510Z" level=info msg="CreateContainer within sandbox \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 16:22:25.931814 containerd[1574]: time="2025-12-16T16:22:25.931336478Z" level=info msg="Container dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:22:25.941119 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount4055913203.mount: Deactivated successfully. Dec 16 16:22:25.946691 containerd[1574]: time="2025-12-16T16:22:25.946529422Z" level=info msg="CreateContainer within sandbox \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0\"" Dec 16 16:22:25.948120 containerd[1574]: time="2025-12-16T16:22:25.947573443Z" level=info msg="StartContainer for \"dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0\"" Dec 16 16:22:25.950020 containerd[1574]: time="2025-12-16T16:22:25.949980694Z" level=info msg="connecting to shim dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0" address="unix:///run/containerd/s/7fc8168771acdadc103dc1dde30da3e76b2c95d04b87ad11eab00fc2993d55e1" protocol=ttrpc version=3 Dec 16 16:22:25.996302 systemd[1]: Started cri-containerd-dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0.scope - libcontainer container dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0. Dec 16 16:22:26.023168 kubelet[2885]: E1216 16:22:26.023068 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:26.121027 containerd[1574]: time="2025-12-16T16:22:26.120977784Z" level=info msg="StartContainer for \"dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0\" returns successfully" Dec 16 16:22:27.280921 systemd[1]: cri-containerd-dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0.scope: Deactivated successfully. 
Dec 16 16:22:27.281935 systemd[1]: cri-containerd-dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0.scope: Consumed 758ms CPU time, 168.7M memory peak, 3.6M read from disk, 171.3M written to disk. Dec 16 16:22:27.287141 containerd[1574]: time="2025-12-16T16:22:27.287097979Z" level=info msg="received container exit event container_id:\"dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0\" id:\"dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0\" pid:3659 exited_at:{seconds:1765902147 nanos:284148721}" Dec 16 16:22:27.334519 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dc2916b3a8da5688aacfbcc7d2c74f512ef1a72f6d3be74b5b4199fe11d416d0-rootfs.mount: Deactivated successfully. Dec 16 16:22:27.364099 kubelet[2885]: I1216 16:22:27.362893 2885 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 16 16:22:27.428518 kubelet[2885]: W1216 16:22:27.428386 2885 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-bfhb9.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object Dec 16 16:22:27.429439 systemd[1]: Created slice kubepods-burstable-pod65e40112_221a_4b95_a8b2_4f1ac0bbad0f.slice - libcontainer container kubepods-burstable-pod65e40112_221a_4b95_a8b2_4f1ac0bbad0f.slice. 
Dec 16 16:22:27.432312 kubelet[2885]: E1216 16:22:27.430991 2885 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:srv-bfhb9.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object" logger="UnhandledError" Dec 16 16:22:27.432312 kubelet[2885]: W1216 16:22:27.431487 2885 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:srv-bfhb9.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object Dec 16 16:22:27.432312 kubelet[2885]: E1216 16:22:27.431540 2885 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:srv-bfhb9.gb1.brightbox.com\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object" logger="UnhandledError" Dec 16 16:22:27.445891 systemd[1]: Created slice kubepods-besteffort-pod27b8eb94_af6b_44b6_a0c4_8cd0e3a973f8.slice - libcontainer container kubepods-besteffort-pod27b8eb94_af6b_44b6_a0c4_8cd0e3a973f8.slice. Dec 16 16:22:27.465673 systemd[1]: Created slice kubepods-burstable-pod720712db_dca2_4214_827a_7cbbbdd0f811.slice - libcontainer container kubepods-burstable-pod720712db_dca2_4214_827a_7cbbbdd0f811.slice. 
Dec 16 16:22:27.468502 kubelet[2885]: W1216 16:22:27.468465 2885 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:srv-bfhb9.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object Dec 16 16:22:27.468761 kubelet[2885]: E1216 16:22:27.468516 2885 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:srv-bfhb9.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object" logger="UnhandledError" Dec 16 16:22:27.468761 kubelet[2885]: W1216 16:22:27.468594 2885 reflector.go:569] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:srv-bfhb9.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object Dec 16 16:22:27.468761 kubelet[2885]: E1216 16:22:27.468619 2885 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:srv-bfhb9.gb1.brightbox.com\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-bfhb9.gb1.brightbox.com' and this object" logger="UnhandledError" Dec 16 16:22:27.480546 systemd[1]: Created slice kubepods-besteffort-poddf9fadf4_abe7_46bf_a41a_4f54249b3de9.slice - libcontainer container 
kubepods-besteffort-poddf9fadf4_abe7_46bf_a41a_4f54249b3de9.slice. Dec 16 16:22:27.495052 systemd[1]: Created slice kubepods-besteffort-pod00e405ab_5218_434e_9f1a_e07a6a584faa.slice - libcontainer container kubepods-besteffort-pod00e405ab_5218_434e_9f1a_e07a6a584faa.slice. Dec 16 16:22:27.507316 systemd[1]: Created slice kubepods-besteffort-pod475547d4_6717_4e55_a04e_817534e2d535.slice - libcontainer container kubepods-besteffort-pod475547d4_6717_4e55_a04e_817534e2d535.slice. Dec 16 16:22:27.517948 systemd[1]: Created slice kubepods-besteffort-podc1794fb1_0f3d_47eb_a0fe_5d4e2aa97585.slice - libcontainer container kubepods-besteffort-podc1794fb1_0f3d_47eb_a0fe_5d4e2aa97585.slice. Dec 16 16:22:27.542395 kubelet[2885]: I1216 16:22:27.542220 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df9fadf4-abe7-46bf-a41a-4f54249b3de9-tigera-ca-bundle\") pod \"calico-kube-controllers-79db8d8ff-l77mg\" (UID: \"df9fadf4-abe7-46bf-a41a-4f54249b3de9\") " pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" Dec 16 16:22:27.543150 kubelet[2885]: I1216 16:22:27.542992 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dn65\" (UniqueName: \"kubernetes.io/projected/475547d4-6717-4e55-a04e-817534e2d535-kube-api-access-2dn65\") pod \"calico-apiserver-79b6bb8fb6-vmcnq\" (UID: \"475547d4-6717-4e55-a04e-817534e2d535\") " pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" Dec 16 16:22:27.544540 kubelet[2885]: I1216 16:22:27.543342 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-ca-bundle\") pod \"whisker-7796b4f945-xfw6t\" (UID: \"00e405ab-5218-434e-9f1a-e07a6a584faa\") " pod="calico-system/whisker-7796b4f945-xfw6t" Dec 16 16:22:27.544540 
kubelet[2885]: I1216 16:22:27.543383 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585-goldmane-ca-bundle\") pod \"goldmane-666569f655-57n7d\" (UID: \"c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585\") " pod="calico-system/goldmane-666569f655-57n7d" Dec 16 16:22:27.544540 kubelet[2885]: I1216 16:22:27.543413 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjk8\" (UniqueName: \"kubernetes.io/projected/720712db-dca2-4214-827a-7cbbbdd0f811-kube-api-access-ctjk8\") pod \"coredns-668d6bf9bc-jnqzj\" (UID: \"720712db-dca2-4214-827a-7cbbbdd0f811\") " pod="kube-system/coredns-668d6bf9bc-jnqzj" Dec 16 16:22:27.544540 kubelet[2885]: I1216 16:22:27.543454 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/475547d4-6717-4e55-a04e-817534e2d535-calico-apiserver-certs\") pod \"calico-apiserver-79b6bb8fb6-vmcnq\" (UID: \"475547d4-6717-4e55-a04e-817534e2d535\") " pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" Dec 16 16:22:27.544540 kubelet[2885]: I1216 16:22:27.543493 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k88rq\" (UniqueName: \"kubernetes.io/projected/65e40112-221a-4b95-a8b2-4f1ac0bbad0f-kube-api-access-k88rq\") pod \"coredns-668d6bf9bc-p5zp4\" (UID: \"65e40112-221a-4b95-a8b2-4f1ac0bbad0f\") " pod="kube-system/coredns-668d6bf9bc-p5zp4" Dec 16 16:22:27.544814 kubelet[2885]: I1216 16:22:27.543530 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-backend-key-pair\") pod \"whisker-7796b4f945-xfw6t\" (UID: 
\"00e405ab-5218-434e-9f1a-e07a6a584faa\") " pod="calico-system/whisker-7796b4f945-xfw6t" Dec 16 16:22:27.544814 kubelet[2885]: I1216 16:22:27.543560 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ptd\" (UniqueName: \"kubernetes.io/projected/00e405ab-5218-434e-9f1a-e07a6a584faa-kube-api-access-j4ptd\") pod \"whisker-7796b4f945-xfw6t\" (UID: \"00e405ab-5218-434e-9f1a-e07a6a584faa\") " pod="calico-system/whisker-7796b4f945-xfw6t" Dec 16 16:22:27.544814 kubelet[2885]: I1216 16:22:27.543592 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn5l2\" (UniqueName: \"kubernetes.io/projected/27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8-kube-api-access-sn5l2\") pod \"calico-apiserver-79b6bb8fb6-gt22s\" (UID: \"27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8\") " pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" Dec 16 16:22:27.544814 kubelet[2885]: I1216 16:22:27.543631 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/720712db-dca2-4214-827a-7cbbbdd0f811-config-volume\") pod \"coredns-668d6bf9bc-jnqzj\" (UID: \"720712db-dca2-4214-827a-7cbbbdd0f811\") " pod="kube-system/coredns-668d6bf9bc-jnqzj" Dec 16 16:22:27.544814 kubelet[2885]: I1216 16:22:27.543658 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj22d\" (UniqueName: \"kubernetes.io/projected/df9fadf4-abe7-46bf-a41a-4f54249b3de9-kube-api-access-wj22d\") pod \"calico-kube-controllers-79db8d8ff-l77mg\" (UID: \"df9fadf4-abe7-46bf-a41a-4f54249b3de9\") " pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" Dec 16 16:22:27.545604 kubelet[2885]: I1216 16:22:27.543689 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8-calico-apiserver-certs\") pod \"calico-apiserver-79b6bb8fb6-gt22s\" (UID: \"27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8\") " pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" Dec 16 16:22:27.545604 kubelet[2885]: I1216 16:22:27.543738 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585-config\") pod \"goldmane-666569f655-57n7d\" (UID: \"c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585\") " pod="calico-system/goldmane-666569f655-57n7d" Dec 16 16:22:27.545604 kubelet[2885]: I1216 16:22:27.543767 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585-goldmane-key-pair\") pod \"goldmane-666569f655-57n7d\" (UID: \"c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585\") " pod="calico-system/goldmane-666569f655-57n7d" Dec 16 16:22:27.545604 kubelet[2885]: I1216 16:22:27.543803 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65e40112-221a-4b95-a8b2-4f1ac0bbad0f-config-volume\") pod \"coredns-668d6bf9bc-p5zp4\" (UID: \"65e40112-221a-4b95-a8b2-4f1ac0bbad0f\") " pod="kube-system/coredns-668d6bf9bc-p5zp4" Dec 16 16:22:27.545604 kubelet[2885]: I1216 16:22:27.543837 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh92g\" (UniqueName: \"kubernetes.io/projected/c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585-kube-api-access-sh92g\") pod \"goldmane-666569f655-57n7d\" (UID: \"c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585\") " pod="calico-system/goldmane-666569f655-57n7d" Dec 16 16:22:27.743906 containerd[1574]: time="2025-12-16T16:22:27.743844274Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-p5zp4,Uid:65e40112-221a-4b95-a8b2-4f1ac0bbad0f,Namespace:kube-system,Attempt:0,}" Dec 16 16:22:27.789352 containerd[1574]: time="2025-12-16T16:22:27.789265692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jnqzj,Uid:720712db-dca2-4214-827a-7cbbbdd0f811,Namespace:kube-system,Attempt:0,}" Dec 16 16:22:27.802892 containerd[1574]: time="2025-12-16T16:22:27.802005152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79db8d8ff-l77mg,Uid:df9fadf4-abe7-46bf-a41a-4f54249b3de9,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:27.829640 containerd[1574]: time="2025-12-16T16:22:27.827997032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-57n7d,Uid:c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:28.023226 containerd[1574]: time="2025-12-16T16:22:28.023114662Z" level=error msg="Failed to destroy network for sandbox \"76eb09dc245c538470804275f7778b943a7ce19b5c58efc0cf4b025aceced95f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.045598 systemd[1]: Created slice kubepods-besteffort-pod79791809_4f39_420a_be3e_00a912b46628.slice - libcontainer container kubepods-besteffort-pod79791809_4f39_420a_be3e_00a912b46628.slice. 
Dec 16 16:22:28.051701 containerd[1574]: time="2025-12-16T16:22:28.051646862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-44s4j,Uid:79791809-4f39-420a-be3e-00a912b46628,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:28.076190 containerd[1574]: time="2025-12-16T16:22:28.075816095Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jnqzj,Uid:720712db-dca2-4214-827a-7cbbbdd0f811,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"76eb09dc245c538470804275f7778b943a7ce19b5c58efc0cf4b025aceced95f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.081837 kubelet[2885]: E1216 16:22:28.081751 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76eb09dc245c538470804275f7778b943a7ce19b5c58efc0cf4b025aceced95f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.082025 kubelet[2885]: E1216 16:22:28.081882 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76eb09dc245c538470804275f7778b943a7ce19b5c58efc0cf4b025aceced95f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jnqzj" Dec 16 16:22:28.082025 kubelet[2885]: E1216 16:22:28.081927 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"76eb09dc245c538470804275f7778b943a7ce19b5c58efc0cf4b025aceced95f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-jnqzj" Dec 16 16:22:28.082025 kubelet[2885]: E1216 16:22:28.081997 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-jnqzj_kube-system(720712db-dca2-4214-827a-7cbbbdd0f811)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-jnqzj_kube-system(720712db-dca2-4214-827a-7cbbbdd0f811)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76eb09dc245c538470804275f7778b943a7ce19b5c58efc0cf4b025aceced95f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-jnqzj" podUID="720712db-dca2-4214-827a-7cbbbdd0f811" Dec 16 16:22:28.095749 containerd[1574]: time="2025-12-16T16:22:28.095684522Z" level=error msg="Failed to destroy network for sandbox \"884c06e4f327fc9d8da1ca1cdcff8a35b49f79785938ae04f5cbeed916b8dd16\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.102454 containerd[1574]: time="2025-12-16T16:22:28.102289263Z" level=error msg="Failed to destroy network for sandbox \"f3a997aee701dfbb50f3140f8c45c68a677d6de1f0830ad2f1a455d8625a94a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.102791 containerd[1574]: time="2025-12-16T16:22:28.102707736Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-p5zp4,Uid:65e40112-221a-4b95-a8b2-4f1ac0bbad0f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"884c06e4f327fc9d8da1ca1cdcff8a35b49f79785938ae04f5cbeed916b8dd16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.105143 containerd[1574]: time="2025-12-16T16:22:28.104410969Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-57n7d,Uid:c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3a997aee701dfbb50f3140f8c45c68a677d6de1f0830ad2f1a455d8625a94a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.105305 kubelet[2885]: E1216 16:22:28.104810 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3a997aee701dfbb50f3140f8c45c68a677d6de1f0830ad2f1a455d8625a94a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.105305 kubelet[2885]: E1216 16:22:28.104878 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3a997aee701dfbb50f3140f8c45c68a677d6de1f0830ad2f1a455d8625a94a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-57n7d" Dec 16 
16:22:28.105305 kubelet[2885]: E1216 16:22:28.104907 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3a997aee701dfbb50f3140f8c45c68a677d6de1f0830ad2f1a455d8625a94a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-57n7d" Dec 16 16:22:28.105515 kubelet[2885]: E1216 16:22:28.104978 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-57n7d_calico-system(c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-57n7d_calico-system(c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f3a997aee701dfbb50f3140f8c45c68a677d6de1f0830ad2f1a455d8625a94a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:22:28.106504 kubelet[2885]: E1216 16:22:28.106330 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"884c06e4f327fc9d8da1ca1cdcff8a35b49f79785938ae04f5cbeed916b8dd16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.106679 kubelet[2885]: E1216 16:22:28.106649 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"884c06e4f327fc9d8da1ca1cdcff8a35b49f79785938ae04f5cbeed916b8dd16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-p5zp4" Dec 16 16:22:28.109046 kubelet[2885]: E1216 16:22:28.107635 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"884c06e4f327fc9d8da1ca1cdcff8a35b49f79785938ae04f5cbeed916b8dd16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-p5zp4" Dec 16 16:22:28.109046 kubelet[2885]: E1216 16:22:28.107696 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-p5zp4_kube-system(65e40112-221a-4b95-a8b2-4f1ac0bbad0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-p5zp4_kube-system(65e40112-221a-4b95-a8b2-4f1ac0bbad0f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"884c06e4f327fc9d8da1ca1cdcff8a35b49f79785938ae04f5cbeed916b8dd16\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-p5zp4" podUID="65e40112-221a-4b95-a8b2-4f1ac0bbad0f" Dec 16 16:22:28.124565 containerd[1574]: time="2025-12-16T16:22:28.124496176Z" level=error msg="Failed to destroy network for sandbox \"13526337fd5435434e2f86e806dab51afd6e718ed3ba809500baed763ba7bbda\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.126550 
containerd[1574]: time="2025-12-16T16:22:28.126465519Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79db8d8ff-l77mg,Uid:df9fadf4-abe7-46bf-a41a-4f54249b3de9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"13526337fd5435434e2f86e806dab51afd6e718ed3ba809500baed763ba7bbda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.127031 kubelet[2885]: E1216 16:22:28.126978 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13526337fd5435434e2f86e806dab51afd6e718ed3ba809500baed763ba7bbda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.127256 kubelet[2885]: E1216 16:22:28.127226 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13526337fd5435434e2f86e806dab51afd6e718ed3ba809500baed763ba7bbda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" Dec 16 16:22:28.127432 kubelet[2885]: E1216 16:22:28.127370 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13526337fd5435434e2f86e806dab51afd6e718ed3ba809500baed763ba7bbda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" Dec 16 16:22:28.128070 kubelet[2885]: E1216 16:22:28.127904 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13526337fd5435434e2f86e806dab51afd6e718ed3ba809500baed763ba7bbda\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:22:28.179186 containerd[1574]: time="2025-12-16T16:22:28.179050937Z" level=error msg="Failed to destroy network for sandbox \"1981d4ed195924ab0697a03bd02f1ad18daa1ba4f11f460a95f6bf75d7da832f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.181143 containerd[1574]: time="2025-12-16T16:22:28.181038829Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-44s4j,Uid:79791809-4f39-420a-be3e-00a912b46628,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1981d4ed195924ab0697a03bd02f1ad18daa1ba4f11f460a95f6bf75d7da832f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.183123 kubelet[2885]: E1216 16:22:28.181433 2885 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1981d4ed195924ab0697a03bd02f1ad18daa1ba4f11f460a95f6bf75d7da832f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:28.183123 kubelet[2885]: E1216 16:22:28.181507 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1981d4ed195924ab0697a03bd02f1ad18daa1ba4f11f460a95f6bf75d7da832f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-44s4j" Dec 16 16:22:28.183123 kubelet[2885]: E1216 16:22:28.181551 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1981d4ed195924ab0697a03bd02f1ad18daa1ba4f11f460a95f6bf75d7da832f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-44s4j" Dec 16 16:22:28.183352 kubelet[2885]: E1216 16:22:28.181610 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1981d4ed195924ab0697a03bd02f1ad18daa1ba4f11f460a95f6bf75d7da832f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:28.299503 containerd[1574]: time="2025-12-16T16:22:28.296024237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 16:22:28.646786 kubelet[2885]: E1216 16:22:28.646402 2885 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Dec 16 16:22:28.646786 kubelet[2885]: E1216 16:22:28.646579 2885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8-calico-apiserver-certs podName:27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8 nodeName:}" failed. No retries permitted until 2025-12-16 16:22:29.146541573 +0000 UTC m=+39.353488779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8-calico-apiserver-certs") pod "calico-apiserver-79b6bb8fb6-gt22s" (UID: "27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8") : failed to sync secret cache: timed out waiting for the condition Dec 16 16:22:28.648265 kubelet[2885]: E1216 16:22:28.648143 2885 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.648265 kubelet[2885]: E1216 16:22:28.648207 2885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-ca-bundle podName:00e405ab-5218-434e-9f1a-e07a6a584faa nodeName:}" failed. No retries permitted until 2025-12-16 16:22:29.148191559 +0000 UTC m=+39.355138766 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-ca-bundle") pod "whisker-7796b4f945-xfw6t" (UID: "00e405ab-5218-434e-9f1a-e07a6a584faa") : failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.648265 kubelet[2885]: E1216 16:22:28.648240 2885 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Dec 16 16:22:28.648732 kubelet[2885]: E1216 16:22:28.648315 2885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/475547d4-6717-4e55-a04e-817534e2d535-calico-apiserver-certs podName:475547d4-6717-4e55-a04e-817534e2d535 nodeName:}" failed. No retries permitted until 2025-12-16 16:22:29.14830159 +0000 UTC m=+39.355248801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/475547d4-6717-4e55-a04e-817534e2d535-calico-apiserver-certs") pod "calico-apiserver-79b6bb8fb6-vmcnq" (UID: "475547d4-6717-4e55-a04e-817534e2d535") : failed to sync secret cache: timed out waiting for the condition Dec 16 16:22:28.683913 kubelet[2885]: E1216 16:22:28.683819 2885 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.683913 kubelet[2885]: E1216 16:22:28.683888 2885 projected.go:194] Error preparing data for projected volume kube-api-access-sn5l2 for pod calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s: failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.684221 kubelet[2885]: E1216 16:22:28.683969 2885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8-kube-api-access-sn5l2 podName:27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8 nodeName:}" failed. 
No retries permitted until 2025-12-16 16:22:29.183947151 +0000 UTC m=+39.390894352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sn5l2" (UniqueName: "kubernetes.io/projected/27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8-kube-api-access-sn5l2") pod "calico-apiserver-79b6bb8fb6-gt22s" (UID: "27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8") : failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.684221 kubelet[2885]: E1216 16:22:28.683821 2885 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.684394 kubelet[2885]: E1216 16:22:28.684222 2885 projected.go:194] Error preparing data for projected volume kube-api-access-2dn65 for pod calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq: failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:28.684394 kubelet[2885]: E1216 16:22:28.684265 2885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/475547d4-6717-4e55-a04e-817534e2d535-kube-api-access-2dn65 podName:475547d4-6717-4e55-a04e-817534e2d535 nodeName:}" failed. No retries permitted until 2025-12-16 16:22:29.184249585 +0000 UTC m=+39.391196789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2dn65" (UniqueName: "kubernetes.io/projected/475547d4-6717-4e55-a04e-817534e2d535-kube-api-access-2dn65") pod "calico-apiserver-79b6bb8fb6-vmcnq" (UID: "475547d4-6717-4e55-a04e-817534e2d535") : failed to sync configmap cache: timed out waiting for the condition Dec 16 16:22:29.304160 containerd[1574]: time="2025-12-16T16:22:29.304103677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7796b4f945-xfw6t,Uid:00e405ab-5218-434e-9f1a-e07a6a584faa,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:29.317885 containerd[1574]: time="2025-12-16T16:22:29.317703828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-vmcnq,Uid:475547d4-6717-4e55-a04e-817534e2d535,Namespace:calico-apiserver,Attempt:0,}" Dec 16 16:22:29.416009 containerd[1574]: time="2025-12-16T16:22:29.415546372Z" level=error msg="Failed to destroy network for sandbox \"df6afe9c2ff45d4003940e7bb7529f395f0ed110d39df3418f9050844281da63\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.418345 containerd[1574]: time="2025-12-16T16:22:29.418275541Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7796b4f945-xfw6t,Uid:00e405ab-5218-434e-9f1a-e07a6a584faa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"df6afe9c2ff45d4003940e7bb7529f395f0ed110d39df3418f9050844281da63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.418972 kubelet[2885]: E1216 16:22:29.418850 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"df6afe9c2ff45d4003940e7bb7529f395f0ed110d39df3418f9050844281da63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.419260 kubelet[2885]: E1216 16:22:29.418941 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df6afe9c2ff45d4003940e7bb7529f395f0ed110d39df3418f9050844281da63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7796b4f945-xfw6t" Dec 16 16:22:29.419260 kubelet[2885]: E1216 16:22:29.419120 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df6afe9c2ff45d4003940e7bb7529f395f0ed110d39df3418f9050844281da63\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7796b4f945-xfw6t" Dec 16 16:22:29.419677 kubelet[2885]: E1216 16:22:29.419459 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7796b4f945-xfw6t_calico-system(00e405ab-5218-434e-9f1a-e07a6a584faa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7796b4f945-xfw6t_calico-system(00e405ab-5218-434e-9f1a-e07a6a584faa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df6afe9c2ff45d4003940e7bb7529f395f0ed110d39df3418f9050844281da63\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7796b4f945-xfw6t" 
podUID="00e405ab-5218-434e-9f1a-e07a6a584faa" Dec 16 16:22:29.421004 systemd[1]: run-netns-cni\x2daeade9fa\x2d9c69\x2d0c4a\x2dce39\x2d61d1d3ea5553.mount: Deactivated successfully. Dec 16 16:22:29.428874 containerd[1574]: time="2025-12-16T16:22:29.428788069Z" level=error msg="Failed to destroy network for sandbox \"980b9c7b229d11881d82b1c9ad03df8a9c7d2287900e99a6c1dfcb7158aedaf1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.431841 containerd[1574]: time="2025-12-16T16:22:29.431768591Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-vmcnq,Uid:475547d4-6717-4e55-a04e-817534e2d535,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"980b9c7b229d11881d82b1c9ad03df8a9c7d2287900e99a6c1dfcb7158aedaf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.433205 kubelet[2885]: E1216 16:22:29.432072 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"980b9c7b229d11881d82b1c9ad03df8a9c7d2287900e99a6c1dfcb7158aedaf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.433205 kubelet[2885]: E1216 16:22:29.432164 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"980b9c7b229d11881d82b1c9ad03df8a9c7d2287900e99a6c1dfcb7158aedaf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" Dec 16 16:22:29.433205 kubelet[2885]: E1216 16:22:29.432202 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"980b9c7b229d11881d82b1c9ad03df8a9c7d2287900e99a6c1dfcb7158aedaf1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" Dec 16 16:22:29.433628 kubelet[2885]: E1216 16:22:29.432281 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-79b6bb8fb6-vmcnq_calico-apiserver(475547d4-6717-4e55-a04e-817534e2d535)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-79b6bb8fb6-vmcnq_calico-apiserver(475547d4-6717-4e55-a04e-817534e2d535)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"980b9c7b229d11881d82b1c9ad03df8a9c7d2287900e99a6c1dfcb7158aedaf1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:22:29.433895 systemd[1]: run-netns-cni\x2dab627779\x2d3542\x2d92b7\x2d90d4\x2d1d1f3c346d53.mount: Deactivated successfully. 
Dec 16 16:22:29.555692 containerd[1574]: time="2025-12-16T16:22:29.555438087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-gt22s,Uid:27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8,Namespace:calico-apiserver,Attempt:0,}" Dec 16 16:22:29.667479 containerd[1574]: time="2025-12-16T16:22:29.667350910Z" level=error msg="Failed to destroy network for sandbox \"afd04243760c0ef1dbe2cadcc84f91001388ec385dd872ae3c88277664e913f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.672097 containerd[1574]: time="2025-12-16T16:22:29.671574282Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-gt22s,Uid:27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"afd04243760c0ef1dbe2cadcc84f91001388ec385dd872ae3c88277664e913f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.671954 systemd[1]: run-netns-cni\x2d4e2b4da1\x2d3b2f\x2d79df\x2dff5f\x2dfbcb1552749a.mount: Deactivated successfully. 
Dec 16 16:22:29.672524 kubelet[2885]: E1216 16:22:29.672466 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afd04243760c0ef1dbe2cadcc84f91001388ec385dd872ae3c88277664e913f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:29.672914 kubelet[2885]: E1216 16:22:29.672564 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afd04243760c0ef1dbe2cadcc84f91001388ec385dd872ae3c88277664e913f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" Dec 16 16:22:29.673370 kubelet[2885]: E1216 16:22:29.673332 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"afd04243760c0ef1dbe2cadcc84f91001388ec385dd872ae3c88277664e913f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" Dec 16 16:22:29.674404 kubelet[2885]: E1216 16:22:29.673457 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-79b6bb8fb6-gt22s_calico-apiserver(27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-79b6bb8fb6-gt22s_calico-apiserver(27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"afd04243760c0ef1dbe2cadcc84f91001388ec385dd872ae3c88277664e913f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:22:38.410166 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount480198187.mount: Deactivated successfully. Dec 16 16:22:38.499113 containerd[1574]: time="2025-12-16T16:22:38.487826338Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:38.502853 containerd[1574]: time="2025-12-16T16:22:38.502729601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 16 16:22:38.502999 containerd[1574]: time="2025-12-16T16:22:38.502961002Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:38.505103 containerd[1574]: time="2025-12-16T16:22:38.504409128Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 16:22:38.507761 containerd[1574]: time="2025-12-16T16:22:38.507636408Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 10.204764087s" Dec 16 16:22:38.507761 containerd[1574]: time="2025-12-16T16:22:38.507700029Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 16:22:38.544519 containerd[1574]: time="2025-12-16T16:22:38.544451053Z" level=info msg="CreateContainer within sandbox \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 16:22:38.605134 containerd[1574]: time="2025-12-16T16:22:38.601328236Z" level=info msg="Container ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:22:38.606012 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3857779926.mount: Deactivated successfully. Dec 16 16:22:38.669498 containerd[1574]: time="2025-12-16T16:22:38.668860665Z" level=info msg="CreateContainer within sandbox \"22b71f85b8a08a4600a98fcaf07ceed2afe0d5fb8c2c9b4b17eb8a7d38036b4f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e\"" Dec 16 16:22:38.670433 containerd[1574]: time="2025-12-16T16:22:38.669741639Z" level=info msg="StartContainer for \"ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e\"" Dec 16 16:22:38.678056 containerd[1574]: time="2025-12-16T16:22:38.678015342Z" level=info msg="connecting to shim ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e" address="unix:///run/containerd/s/7fc8168771acdadc103dc1dde30da3e76b2c95d04b87ad11eab00fc2993d55e1" protocol=ttrpc version=3 Dec 16 16:22:38.755322 systemd[1]: Started cri-containerd-ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e.scope - libcontainer container ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e. 
Dec 16 16:22:38.910524 containerd[1574]: time="2025-12-16T16:22:38.910474129Z" level=info msg="StartContainer for \"ecf8b52471f641581a58c7052b69fa6c2927e8dc43e3e894a4c514dc0e19248e\" returns successfully" Dec 16 16:22:39.022210 containerd[1574]: time="2025-12-16T16:22:39.021686612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79db8d8ff-l77mg,Uid:df9fadf4-abe7-46bf-a41a-4f54249b3de9,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:39.182611 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 16:22:39.186825 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 16:22:39.191910 containerd[1574]: time="2025-12-16T16:22:39.191852284Z" level=error msg="Failed to destroy network for sandbox \"76cde1904cd04d03dd6e9dc5d0167185e31c162b3319abbb071b348d5324c4e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:39.194775 containerd[1574]: time="2025-12-16T16:22:39.194545016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79db8d8ff-l77mg,Uid:df9fadf4-abe7-46bf-a41a-4f54249b3de9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"76cde1904cd04d03dd6e9dc5d0167185e31c162b3319abbb071b348d5324c4e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:39.196429 kubelet[2885]: E1216 16:22:39.195856 2885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76cde1904cd04d03dd6e9dc5d0167185e31c162b3319abbb071b348d5324c4e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 16:22:39.196429 kubelet[2885]: E1216 16:22:39.195968 2885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76cde1904cd04d03dd6e9dc5d0167185e31c162b3319abbb071b348d5324c4e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" Dec 16 16:22:39.196429 kubelet[2885]: E1216 16:22:39.196012 2885 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76cde1904cd04d03dd6e9dc5d0167185e31c162b3319abbb071b348d5324c4e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" Dec 16 16:22:39.200374 kubelet[2885]: E1216 16:22:39.196094 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76cde1904cd04d03dd6e9dc5d0167185e31c162b3319abbb071b348d5324c4e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:22:39.426723 
kubelet[2885]: I1216 16:22:39.426199 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vm24s" podStartSLOduration=1.836882334 podStartE2EDuration="27.419272765s" podCreationTimestamp="2025-12-16 16:22:12 +0000 UTC" firstStartedPulling="2025-12-16 16:22:12.926348477 +0000 UTC m=+23.133295682" lastFinishedPulling="2025-12-16 16:22:38.508738907 +0000 UTC m=+48.715686113" observedRunningTime="2025-12-16 16:22:39.416766513 +0000 UTC m=+49.623713747" watchObservedRunningTime="2025-12-16 16:22:39.419272765 +0000 UTC m=+49.626219987" Dec 16 16:22:39.648390 kubelet[2885]: I1216 16:22:39.648330 2885 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-backend-key-pair\") pod \"00e405ab-5218-434e-9f1a-e07a6a584faa\" (UID: \"00e405ab-5218-434e-9f1a-e07a6a584faa\") " Dec 16 16:22:39.648390 kubelet[2885]: I1216 16:22:39.648421 2885 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-ca-bundle\") pod \"00e405ab-5218-434e-9f1a-e07a6a584faa\" (UID: \"00e405ab-5218-434e-9f1a-e07a6a584faa\") " Dec 16 16:22:39.649556 kubelet[2885]: I1216 16:22:39.648457 2885 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ptd\" (UniqueName: \"kubernetes.io/projected/00e405ab-5218-434e-9f1a-e07a6a584faa-kube-api-access-j4ptd\") pod \"00e405ab-5218-434e-9f1a-e07a6a584faa\" (UID: \"00e405ab-5218-434e-9f1a-e07a6a584faa\") " Dec 16 16:22:39.656515 kubelet[2885]: I1216 16:22:39.656243 2885 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "00e405ab-5218-434e-9f1a-e07a6a584faa" (UID: 
"00e405ab-5218-434e-9f1a-e07a6a584faa"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 16:22:39.670647 systemd[1]: var-lib-kubelet-pods-00e405ab\x2d5218\x2d434e\x2d9f1a\x2de07a6a584faa-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 16:22:39.672429 kubelet[2885]: I1216 16:22:39.672022 2885 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "00e405ab-5218-434e-9f1a-e07a6a584faa" (UID: "00e405ab-5218-434e-9f1a-e07a6a584faa"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 16:22:39.672429 kubelet[2885]: I1216 16:22:39.672277 2885 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e405ab-5218-434e-9f1a-e07a6a584faa-kube-api-access-j4ptd" (OuterVolumeSpecName: "kube-api-access-j4ptd") pod "00e405ab-5218-434e-9f1a-e07a6a584faa" (UID: "00e405ab-5218-434e-9f1a-e07a6a584faa"). InnerVolumeSpecName "kube-api-access-j4ptd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 16:22:39.676584 systemd[1]: var-lib-kubelet-pods-00e405ab\x2d5218\x2d434e\x2d9f1a\x2de07a6a584faa-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dj4ptd.mount: Deactivated successfully. 
Dec 16 16:22:39.751116 kubelet[2885]: I1216 16:22:39.750901 2885 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-backend-key-pair\") on node \"srv-bfhb9.gb1.brightbox.com\" DevicePath \"\"" Dec 16 16:22:39.751116 kubelet[2885]: I1216 16:22:39.750970 2885 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00e405ab-5218-434e-9f1a-e07a6a584faa-whisker-ca-bundle\") on node \"srv-bfhb9.gb1.brightbox.com\" DevicePath \"\"" Dec 16 16:22:39.751116 kubelet[2885]: I1216 16:22:39.750988 2885 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4ptd\" (UniqueName: \"kubernetes.io/projected/00e405ab-5218-434e-9f1a-e07a6a584faa-kube-api-access-j4ptd\") on node \"srv-bfhb9.gb1.brightbox.com\" DevicePath \"\"" Dec 16 16:22:40.022711 containerd[1574]: time="2025-12-16T16:22:40.022548075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-57n7d,Uid:c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:40.025198 containerd[1574]: time="2025-12-16T16:22:40.022968986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jnqzj,Uid:720712db-dca2-4214-827a-7cbbbdd0f811,Namespace:kube-system,Attempt:0,}" Dec 16 16:22:40.025198 containerd[1574]: time="2025-12-16T16:22:40.024116697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-vmcnq,Uid:475547d4-6717-4e55-a04e-817534e2d535,Namespace:calico-apiserver,Attempt:0,}" Dec 16 16:22:40.066240 systemd[1]: Removed slice kubepods-besteffort-pod00e405ab_5218_434e_9f1a_e07a6a584faa.slice - libcontainer container kubepods-besteffort-pod00e405ab_5218_434e_9f1a_e07a6a584faa.slice. 
Dec 16 16:22:40.681780 systemd[1]: Created slice kubepods-besteffort-pod9c3bf940_f9bd_460f_a7db_19023b314640.slice - libcontainer container kubepods-besteffort-pod9c3bf940_f9bd_460f_a7db_19023b314640.slice. Dec 16 16:22:40.765707 kubelet[2885]: I1216 16:22:40.765632 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c3bf940-f9bd-460f-a7db-19023b314640-whisker-ca-bundle\") pod \"whisker-6bd47f9749-fdxr4\" (UID: \"9c3bf940-f9bd-460f-a7db-19023b314640\") " pod="calico-system/whisker-6bd47f9749-fdxr4" Dec 16 16:22:40.765707 kubelet[2885]: I1216 16:22:40.765713 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9c3bf940-f9bd-460f-a7db-19023b314640-whisker-backend-key-pair\") pod \"whisker-6bd47f9749-fdxr4\" (UID: \"9c3bf940-f9bd-460f-a7db-19023b314640\") " pod="calico-system/whisker-6bd47f9749-fdxr4" Dec 16 16:22:40.766379 kubelet[2885]: I1216 16:22:40.765751 2885 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w6d6\" (UniqueName: \"kubernetes.io/projected/9c3bf940-f9bd-460f-a7db-19023b314640-kube-api-access-9w6d6\") pod \"whisker-6bd47f9749-fdxr4\" (UID: \"9c3bf940-f9bd-460f-a7db-19023b314640\") " pod="calico-system/whisker-6bd47f9749-fdxr4" Dec 16 16:22:40.803748 systemd-networkd[1505]: cali64361c4cdc2: Link UP Dec 16 16:22:40.808242 systemd-networkd[1505]: cali64361c4cdc2: Gained carrier Dec 16 16:22:40.868131 containerd[1574]: 2025-12-16 16:22:40.172 [INFO][4033] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 16:22:40.868131 containerd[1574]: 2025-12-16 16:22:40.219 [INFO][4033] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0 
calico-apiserver-79b6bb8fb6- calico-apiserver 475547d4-6717-4e55-a04e-817534e2d535 852 0 2025-12-16 16:22:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:79b6bb8fb6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com calico-apiserver-79b6bb8fb6-vmcnq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali64361c4cdc2 [] [] }} ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-" Dec 16 16:22:40.868131 containerd[1574]: 2025-12-16 16:22:40.219 [INFO][4033] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.868131 containerd[1574]: 2025-12-16 16:22:40.596 [INFO][4072] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" HandleID="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.598 [INFO][4072] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" HandleID="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, 
Num6:0, HandleID:(*string)(0xc00004edd0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"calico-apiserver-79b6bb8fb6-vmcnq", "timestamp":"2025-12-16 16:22:40.596466174 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.598 [INFO][4072] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.598 [INFO][4072] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.599 [INFO][4072] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.633 [INFO][4072] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.655 [INFO][4072] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.699 [INFO][4072] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.710 [INFO][4072] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.868593 containerd[1574]: 2025-12-16 16:22:40.718 [INFO][4072] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.719 [INFO][4072] ipam/ipam.go 1219: 
Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.724 [INFO][4072] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.734 [INFO][4072] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.767 [INFO][4072] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.65/26] block=192.168.87.64/26 handle="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.767 [INFO][4072] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.65/26] handle="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.767 [INFO][4072] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 16:22:40.874538 containerd[1574]: 2025-12-16 16:22:40.767 [INFO][4072] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.65/26] IPv6=[] ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" HandleID="k8s-pod-network.785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.877408 containerd[1574]: 2025-12-16 16:22:40.775 [INFO][4033] cni-plugin/k8s.go 418: Populated endpoint ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0", GenerateName:"calico-apiserver-79b6bb8fb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"475547d4-6717-4e55-a04e-817534e2d535", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79b6bb8fb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-79b6bb8fb6-vmcnq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.87.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali64361c4cdc2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:40.877534 containerd[1574]: 2025-12-16 16:22:40.775 [INFO][4033] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.65/32] ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.877534 containerd[1574]: 2025-12-16 16:22:40.775 [INFO][4033] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali64361c4cdc2 ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.877534 containerd[1574]: 2025-12-16 16:22:40.802 [INFO][4033] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.877681 containerd[1574]: 2025-12-16 16:22:40.808 [INFO][4033] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0", GenerateName:"calico-apiserver-79b6bb8fb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"475547d4-6717-4e55-a04e-817534e2d535", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79b6bb8fb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a", Pod:"calico-apiserver-79b6bb8fb6-vmcnq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.87.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali64361c4cdc2", MAC:"2e:fa:5d:4e:f5:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:40.877790 containerd[1574]: 2025-12-16 16:22:40.850 [INFO][4033] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-vmcnq" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--vmcnq-eth0" Dec 16 16:22:40.990840 containerd[1574]: time="2025-12-16T16:22:40.990326202Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bd47f9749-fdxr4,Uid:9c3bf940-f9bd-460f-a7db-19023b314640,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:41.048817 systemd-networkd[1505]: cali550688d933a: Link UP Dec 16 16:22:41.059211 systemd-networkd[1505]: cali550688d933a: Gained carrier Dec 16 16:22:41.092309 containerd[1574]: 2025-12-16 16:22:40.130 [INFO][4027] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 16:22:41.092309 containerd[1574]: 2025-12-16 16:22:40.221 [INFO][4027] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0 coredns-668d6bf9bc- kube-system 720712db-dca2-4214-827a-7cbbbdd0f811 848 0 2025-12-16 16:21:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com coredns-668d6bf9bc-jnqzj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali550688d933a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-" Dec 16 16:22:41.092309 containerd[1574]: 2025-12-16 16:22:40.221 [INFO][4027] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.092309 containerd[1574]: 2025-12-16 16:22:40.598 [INFO][4073] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" 
HandleID="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Workload="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.606 [INFO][4073] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" HandleID="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Workload="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030e240), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-jnqzj", "timestamp":"2025-12-16 16:22:40.598961021 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.606 [INFO][4073] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.768 [INFO][4073] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.769 [INFO][4073] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.787 [INFO][4073] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.832 [INFO][4073] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.909 [INFO][4073] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.918 [INFO][4073] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.093079 containerd[1574]: 2025-12-16 16:22:40.925 [INFO][4073] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:40.927 [INFO][4073] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:40.932 [INFO][4073] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935 Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:40.950 [INFO][4073] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:41.014 [INFO][4073] 
ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.66/26] block=192.168.87.64/26 handle="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:41.017 [INFO][4073] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.66/26] handle="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:41.017 [INFO][4073] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 16:22:41.096248 containerd[1574]: 2025-12-16 16:22:41.019 [INFO][4073] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.66/26] IPv6=[] ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" HandleID="k8s-pod-network.b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Workload="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.096537 containerd[1574]: 2025-12-16 16:22:41.041 [INFO][4027] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"720712db-dca2-4214-827a-7cbbbdd0f811", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 21, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-jnqzj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.87.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali550688d933a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:41.096537 containerd[1574]: 2025-12-16 16:22:41.042 [INFO][4027] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.66/32] ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.096537 containerd[1574]: 2025-12-16 16:22:41.042 [INFO][4027] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali550688d933a ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.096537 containerd[1574]: 
2025-12-16 16:22:41.059 [INFO][4027] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.096537 containerd[1574]: 2025-12-16 16:22:41.060 [INFO][4027] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"720712db-dca2-4214-827a-7cbbbdd0f811", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 21, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935", Pod:"coredns-668d6bf9bc-jnqzj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.87.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali550688d933a", MAC:"3e:6f:6e:1e:1f:f2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:41.096537 containerd[1574]: 2025-12-16 16:22:41.086 [INFO][4027] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" Namespace="kube-system" Pod="coredns-668d6bf9bc-jnqzj" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--jnqzj-eth0" Dec 16 16:22:41.225324 systemd-networkd[1505]: calic3a4c520009: Link UP Dec 16 16:22:41.230349 systemd-networkd[1505]: calic3a4c520009: Gained carrier Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:40.183 [INFO][4024] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:40.226 [INFO][4024] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0 goldmane-666569f655- calico-system c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585 850 0 2025-12-16 16:22:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com goldmane-666569f655-57n7d eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calic3a4c520009 [] [] }} 
ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:40.228 [INFO][4024] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:40.606 [INFO][4075] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" HandleID="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Workload="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:40.608 [INFO][4075] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" HandleID="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Workload="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000343900), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"goldmane-666569f655-57n7d", "timestamp":"2025-12-16 16:22:40.606381222 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:40.608 [INFO][4075] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.035 [INFO][4075] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.035 [INFO][4075] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.082 [INFO][4075] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.104 [INFO][4075] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.127 [INFO][4075] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.135 [INFO][4075] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.146 [INFO][4075] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.146 [INFO][4075] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.157 [INFO][4075] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68 Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.177 [INFO][4075] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 
handle="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.201 [INFO][4075] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.67/26] block=192.168.87.64/26 handle="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.201 [INFO][4075] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.67/26] handle="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.201 [INFO][4075] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 16:22:41.260383 containerd[1574]: 2025-12-16 16:22:41.205 [INFO][4075] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.67/26] IPv6=[] ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" HandleID="k8s-pod-network.28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Workload="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.264453 containerd[1574]: 2025-12-16 16:22:41.215 [INFO][4024] cni-plugin/k8s.go 418: Populated endpoint ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.December, 
16, 16, 22, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-666569f655-57n7d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.87.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic3a4c520009", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:41.264453 containerd[1574]: 2025-12-16 16:22:41.216 [INFO][4024] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.67/32] ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.264453 containerd[1574]: 2025-12-16 16:22:41.216 [INFO][4024] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3a4c520009 ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.264453 containerd[1574]: 2025-12-16 16:22:41.229 [INFO][4024] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" 
Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.264453 containerd[1574]: 2025-12-16 16:22:41.229 [INFO][4024] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68", Pod:"goldmane-666569f655-57n7d", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.87.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calic3a4c520009", MAC:"32:a5:39:1b:91:db", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:41.264453 containerd[1574]: 2025-12-16 16:22:41.244 [INFO][4024] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" Namespace="calico-system" Pod="goldmane-666569f655-57n7d" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-goldmane--666569f655--57n7d-eth0" Dec 16 16:22:41.531346 containerd[1574]: time="2025-12-16T16:22:41.530167164Z" level=info msg="connecting to shim 28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68" address="unix:///run/containerd/s/6e3134e392e9dc7f1b3e8d8ee3d4c476162809c81bfb2001583842e359ca74d0" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:41.533106 containerd[1574]: time="2025-12-16T16:22:41.532905749Z" level=info msg="connecting to shim 785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a" address="unix:///run/containerd/s/2c8bdbee1c1f9af054b97c00bda0a4559786d2032fad7a3fbbab740b34a84e5f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:41.542343 containerd[1574]: time="2025-12-16T16:22:41.542286763Z" level=info msg="connecting to shim b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935" address="unix:///run/containerd/s/01550cd990b2bd7d56175a59d23ae1d764cf6c4ac1c2df527138ef4c4196545e" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:41.584983 systemd-networkd[1505]: cali280f8df44c0: Link UP Dec 16 16:22:41.587753 systemd-networkd[1505]: cali280f8df44c0: Gained carrier Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.247 [INFO][4124] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.310 [INFO][4124] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0 whisker-6bd47f9749- calico-system 
9c3bf940-f9bd-460f-a7db-19023b314640 930 0 2025-12-16 16:22:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6bd47f9749 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com whisker-6bd47f9749-fdxr4 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali280f8df44c0 [] [] }} ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.310 [INFO][4124] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.451 [INFO][4190] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" HandleID="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Workload="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.451 [INFO][4190] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" HandleID="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Workload="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103660), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"whisker-6bd47f9749-fdxr4", "timestamp":"2025-12-16 
16:22:41.451273022 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.451 [INFO][4190] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.451 [INFO][4190] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.451 [INFO][4190] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.475 [INFO][4190] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.505 [INFO][4190] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.523 [INFO][4190] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.529 [INFO][4190] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.539 [INFO][4190] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.539 [INFO][4190] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" host="srv-bfhb9.gb1.brightbox.com" Dec 16 
16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.543 [INFO][4190] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13 Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.553 [INFO][4190] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.566 [INFO][4190] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.68/26] block=192.168.87.64/26 handle="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.566 [INFO][4190] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.68/26] handle="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.566 [INFO][4190] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 16:22:41.669108 containerd[1574]: 2025-12-16 16:22:41.566 [INFO][4190] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.68/26] IPv6=[] ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" HandleID="k8s-pod-network.fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Workload="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.672107 containerd[1574]: 2025-12-16 16:22:41.572 [INFO][4124] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0", GenerateName:"whisker-6bd47f9749-", Namespace:"calico-system", SelfLink:"", UID:"9c3bf940-f9bd-460f-a7db-19023b314640", ResourceVersion:"930", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6bd47f9749", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"whisker-6bd47f9749-fdxr4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.87.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali280f8df44c0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:41.672107 containerd[1574]: 2025-12-16 16:22:41.572 [INFO][4124] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.68/32] ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.672107 containerd[1574]: 2025-12-16 16:22:41.572 [INFO][4124] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali280f8df44c0 ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.672107 containerd[1574]: 2025-12-16 16:22:41.590 [INFO][4124] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.672107 containerd[1574]: 2025-12-16 16:22:41.593 [INFO][4124] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0", GenerateName:"whisker-6bd47f9749-", Namespace:"calico-system", SelfLink:"", 
UID:"9c3bf940-f9bd-460f-a7db-19023b314640", ResourceVersion:"930", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6bd47f9749", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13", Pod:"whisker-6bd47f9749-fdxr4", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.87.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali280f8df44c0", MAC:"82:ec:bd:5d:ad:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:41.672107 containerd[1574]: 2025-12-16 16:22:41.655 [INFO][4124] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" Namespace="calico-system" Pod="whisker-6bd47f9749-fdxr4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-whisker--6bd47f9749--fdxr4-eth0" Dec 16 16:22:41.718470 systemd[1]: Started cri-containerd-785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a.scope - libcontainer container 785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a. 
Dec 16 16:22:41.740919 systemd[1]: Started cri-containerd-b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935.scope - libcontainer container b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935. Dec 16 16:22:41.797330 containerd[1574]: time="2025-12-16T16:22:41.796772317Z" level=info msg="connecting to shim fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13" address="unix:///run/containerd/s/84c9bfd0e16b782ef07d8182e863f0fa987fab8c318f03fdad6e242fc58a3e22" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:41.845378 systemd[1]: Started cri-containerd-28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68.scope - libcontainer container 28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68. Dec 16 16:22:41.950422 systemd[1]: Started cri-containerd-fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13.scope - libcontainer container fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13. Dec 16 16:22:42.002417 containerd[1574]: time="2025-12-16T16:22:42.002321940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-jnqzj,Uid:720712db-dca2-4214-827a-7cbbbdd0f811,Namespace:kube-system,Attempt:0,} returns sandbox id \"b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935\"" Dec 16 16:22:42.010793 containerd[1574]: time="2025-12-16T16:22:42.009956028Z" level=info msg="CreateContainer within sandbox \"b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 16:22:42.032002 systemd-networkd[1505]: cali64361c4cdc2: Gained IPv6LL Dec 16 16:22:42.036178 containerd[1574]: time="2025-12-16T16:22:42.035832951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-44s4j,Uid:79791809-4f39-420a-be3e-00a912b46628,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:42.037248 containerd[1574]: time="2025-12-16T16:22:42.037160408Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-p5zp4,Uid:65e40112-221a-4b95-a8b2-4f1ac0bbad0f,Namespace:kube-system,Attempt:0,}" Dec 16 16:22:42.045396 kubelet[2885]: I1216 16:22:42.043910 2885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e405ab-5218-434e-9f1a-e07a6a584faa" path="/var/lib/kubelet/pods/00e405ab-5218-434e-9f1a-e07a6a584faa/volumes" Dec 16 16:22:42.045933 containerd[1574]: time="2025-12-16T16:22:42.045463700Z" level=info msg="Container e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:22:42.131615 containerd[1574]: time="2025-12-16T16:22:42.131468591Z" level=info msg="CreateContainer within sandbox \"b5655c17c2826478367decb3bc1aadf8ef427aa0a64deebb3b465063a8327935\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5\"" Dec 16 16:22:42.138750 containerd[1574]: time="2025-12-16T16:22:42.138689290Z" level=info msg="StartContainer for \"e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5\"" Dec 16 16:22:42.150157 containerd[1574]: time="2025-12-16T16:22:42.149746710Z" level=info msg="connecting to shim e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5" address="unix:///run/containerd/s/01550cd990b2bd7d56175a59d23ae1d764cf6c4ac1c2df527138ef4c4196545e" protocol=ttrpc version=3 Dec 16 16:22:42.221289 systemd[1]: Started cri-containerd-e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5.scope - libcontainer container e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5. 
Dec 16 16:22:42.436666 containerd[1574]: time="2025-12-16T16:22:42.436531493Z" level=info msg="StartContainer for \"e6178b1423326de731aabf21ab6f662ef39a891d5c15edf27788b73d01e6aeb5\" returns successfully" Dec 16 16:22:42.578252 systemd-networkd[1505]: cali2dd89d83458: Link UP Dec 16 16:22:42.585353 containerd[1574]: time="2025-12-16T16:22:42.585292511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-vmcnq,Uid:475547d4-6717-4e55-a04e-817534e2d535,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"785a2c98545a968238c4d32bea1480b23cfc723058f6f77240603619fd82727a\"" Dec 16 16:22:42.594694 systemd-networkd[1505]: cali2dd89d83458: Gained carrier Dec 16 16:22:42.600287 containerd[1574]: time="2025-12-16T16:22:42.600218474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 16:22:42.608670 systemd-networkd[1505]: calic3a4c520009: Gained IPv6LL Dec 16 16:22:42.641908 containerd[1574]: time="2025-12-16T16:22:42.641435577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-57n7d,Uid:c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585,Namespace:calico-system,Attempt:0,} returns sandbox id \"28cb8fffc76639a5c922b87ac7aa844726078ad8ed016a79d038684747e3ee68\"" Dec 16 16:22:42.649141 containerd[1574]: time="2025-12-16T16:22:42.648439889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bd47f9749-fdxr4,Uid:9c3bf940-f9bd-460f-a7db-19023b314640,Namespace:calico-system,Attempt:0,} returns sandbox id \"fe46f70534adf064bcbbd1e04d145b65427f9f7f577dcc911f12f7a5a1bead13\"" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.167 [INFO][4406] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.220 [INFO][4406] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0 coredns-668d6bf9bc- 
kube-system 65e40112-221a-4b95-a8b2-4f1ac0bbad0f 841 0 2025-12-16 16:21:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com coredns-668d6bf9bc-p5zp4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2dd89d83458 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.220 [INFO][4406] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.382 [INFO][4454] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" HandleID="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Workload="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.386 [INFO][4454] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" HandleID="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Workload="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000327500), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-bfhb9.gb1.brightbox.com", 
"pod":"coredns-668d6bf9bc-p5zp4", "timestamp":"2025-12-16 16:22:42.382941472 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.386 [INFO][4454] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.386 [INFO][4454] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.386 [INFO][4454] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.406 [INFO][4454] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.442 [INFO][4454] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.459 [INFO][4454] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.471 [INFO][4454] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.485 [INFO][4454] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.485 [INFO][4454] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 
handle="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.495 [INFO][4454] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234 Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.517 [INFO][4454] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.534 [INFO][4454] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.69/26] block=192.168.87.64/26 handle="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.536 [INFO][4454] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.69/26] handle="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.536 [INFO][4454] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 16:22:42.668623 containerd[1574]: 2025-12-16 16:22:42.537 [INFO][4454] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.69/26] IPv6=[] ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" HandleID="k8s-pod-network.c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Workload="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.670339 containerd[1574]: 2025-12-16 16:22:42.546 [INFO][4406] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"65e40112-221a-4b95-a8b2-4f1ac0bbad0f", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 21, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-p5zp4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.87.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali2dd89d83458", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:42.670339 containerd[1574]: 2025-12-16 16:22:42.546 [INFO][4406] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.69/32] ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.670339 containerd[1574]: 2025-12-16 16:22:42.546 [INFO][4406] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2dd89d83458 ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.670339 containerd[1574]: 2025-12-16 16:22:42.618 [INFO][4406] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.670339 containerd[1574]: 2025-12-16 16:22:42.629 [INFO][4406] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" 
WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"65e40112-221a-4b95-a8b2-4f1ac0bbad0f", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 21, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234", Pod:"coredns-668d6bf9bc-p5zp4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.87.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2dd89d83458", MAC:"8a:42:7c:c8:ff:0c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:42.670339 
containerd[1574]: 2025-12-16 16:22:42.662 [INFO][4406] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" Namespace="kube-system" Pod="coredns-668d6bf9bc-p5zp4" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-coredns--668d6bf9bc--p5zp4-eth0" Dec 16 16:22:42.671633 systemd-networkd[1505]: cali550688d933a: Gained IPv6LL Dec 16 16:22:42.753331 containerd[1574]: time="2025-12-16T16:22:42.752229187Z" level=info msg="connecting to shim c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234" address="unix:///run/containerd/s/fc12157ea78d312386542c31a0ec7f32f3aa7e3cedc5e945833cf67cefa1f2bb" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:42.766385 systemd-networkd[1505]: calieb447833be2: Link UP Dec 16 16:22:42.770583 systemd-networkd[1505]: calieb447833be2: Gained carrier Dec 16 16:22:42.844863 systemd[1]: Started cri-containerd-c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234.scope - libcontainer container c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234. 
Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.255 [INFO][4413] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.294 [INFO][4413] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0 csi-node-driver- calico-system 79791809-4f39-420a-be3e-00a912b46628 721 0 2025-12-16 16:22:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com csi-node-driver-44s4j eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calieb447833be2 [] [] }} ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.294 [INFO][4413] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.402 [INFO][4462] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" HandleID="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Workload="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.402 [INFO][4462] ipam/ipam_plugin.go 
275: Auto assigning IP ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" HandleID="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Workload="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5220), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"csi-node-driver-44s4j", "timestamp":"2025-12-16 16:22:42.402025532 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.402 [INFO][4462] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.537 [INFO][4462] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.537 [INFO][4462] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.614 [INFO][4462] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.639 [INFO][4462] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.674 [INFO][4462] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.684 [INFO][4462] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.695 [INFO][4462] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.695 [INFO][4462] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.701 [INFO][4462] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96 Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.716 [INFO][4462] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.744 [INFO][4462] 
ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.70/26] block=192.168.87.64/26 handle="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.744 [INFO][4462] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.70/26] handle="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.745 [INFO][4462] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 16:22:42.853034 containerd[1574]: 2025-12-16 16:22:42.745 [INFO][4462] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.70/26] IPv6=[] ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" HandleID="k8s-pod-network.8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Workload="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 16 16:22:42.856210 containerd[1574]: 2025-12-16 16:22:42.753 [INFO][4413] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"79791809-4f39-420a-be3e-00a912b46628", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-44s4j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.87.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb447833be2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:42.856210 containerd[1574]: 2025-12-16 16:22:42.753 [INFO][4413] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.70/32] ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 16 16:22:42.856210 containerd[1574]: 2025-12-16 16:22:42.754 [INFO][4413] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb447833be2 ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 16 16:22:42.856210 containerd[1574]: 2025-12-16 16:22:42.774 [INFO][4413] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 
16 16:22:42.856210 containerd[1574]: 2025-12-16 16:22:42.781 [INFO][4413] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"79791809-4f39-420a-be3e-00a912b46628", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96", Pod:"csi-node-driver-44s4j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.87.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb447833be2", MAC:"d6:e0:ff:69:6e:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 
16:22:42.856210 containerd[1574]: 2025-12-16 16:22:42.846 [INFO][4413] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" Namespace="calico-system" Pod="csi-node-driver-44s4j" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-csi--node--driver--44s4j-eth0" Dec 16 16:22:42.886981 containerd[1574]: time="2025-12-16T16:22:42.886911796Z" level=info msg="connecting to shim 8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96" address="unix:///run/containerd/s/5929519409b8d105a13c6579fc68efc97b528d3f861b3f8ac9c93adfc3ec316d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:42.927387 systemd-networkd[1505]: cali280f8df44c0: Gained IPv6LL Dec 16 16:22:42.938693 systemd[1]: Started cri-containerd-8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96.scope - libcontainer container 8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96. Dec 16 16:22:43.025409 containerd[1574]: time="2025-12-16T16:22:43.024306666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-44s4j,Uid:79791809-4f39-420a-be3e-00a912b46628,Namespace:calico-system,Attempt:0,} returns sandbox id \"8191b9912feed85eeddee6a5d3ebdce6e874ee5e5a92abe4e4dacb898b9a8a96\"" Dec 16 16:22:43.025409 containerd[1574]: time="2025-12-16T16:22:43.024800900Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:43.030217 containerd[1574]: time="2025-12-16T16:22:43.029178239Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 16:22:43.030217 containerd[1574]: time="2025-12-16T16:22:43.029313665Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 16:22:43.034054 kubelet[2885]: E1216 16:22:43.034005 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:22:43.034512 kubelet[2885]: E1216 16:22:43.034469 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:22:43.036049 containerd[1574]: time="2025-12-16T16:22:43.035826342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 16:22:43.057046 containerd[1574]: time="2025-12-16T16:22:43.057002632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-p5zp4,Uid:65e40112-221a-4b95-a8b2-4f1ac0bbad0f,Namespace:kube-system,Attempt:0,} returns sandbox id \"c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234\"" Dec 16 16:22:43.061408 kubelet[2885]: E1216 16:22:43.061292 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dn65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-vmcnq_calico-apiserver(475547d4-6717-4e55-a04e-817534e2d535): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:43.066774 containerd[1574]: time="2025-12-16T16:22:43.066631544Z" level=info msg="CreateContainer within sandbox \"c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 16:22:43.079408 kubelet[2885]: E1216 16:22:43.079329 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:22:43.083530 containerd[1574]: time="2025-12-16T16:22:43.083497069Z" level=info msg="Container 6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253: CDI devices from CRI Config.CDIDevices: []" Dec 16 16:22:43.096801 containerd[1574]: time="2025-12-16T16:22:43.096559190Z" level=info msg="CreateContainer within sandbox \"c71248c1bc9429c04c63a3dcf0b8b83a154a869956779bd6810f0f525107d234\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253\"" Dec 16 16:22:43.098107 containerd[1574]: time="2025-12-16T16:22:43.097683739Z" level=info msg="StartContainer for \"6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253\"" Dec 16 16:22:43.101212 containerd[1574]: time="2025-12-16T16:22:43.101180209Z" level=info msg="connecting to shim 6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253" 
address="unix:///run/containerd/s/fc12157ea78d312386542c31a0ec7f32f3aa7e3cedc5e945833cf67cefa1f2bb" protocol=ttrpc version=3 Dec 16 16:22:43.147676 systemd[1]: Started cri-containerd-6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253.scope - libcontainer container 6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253. Dec 16 16:22:43.246332 containerd[1574]: time="2025-12-16T16:22:43.246282327Z" level=info msg="StartContainer for \"6f342fa899d1b48d55953468f7b0d23f3f9623fa6a5c47ffd984e0f3e0c4a253\" returns successfully" Dec 16 16:22:43.361178 containerd[1574]: time="2025-12-16T16:22:43.360060607Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:43.362027 containerd[1574]: time="2025-12-16T16:22:43.361956597Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 16:22:43.362664 containerd[1574]: time="2025-12-16T16:22:43.362054154Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 16:22:43.362818 kubelet[2885]: E1216 16:22:43.362639 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 16:22:43.363744 kubelet[2885]: E1216 16:22:43.362834 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 16:22:43.364009 kubelet[2885]: E1216 16:22:43.363287 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sh92g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-57n7d_calico-system(c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:43.364820 containerd[1574]: time="2025-12-16T16:22:43.363989609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 16:22:43.365312 kubelet[2885]: E1216 16:22:43.365197 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:22:43.453735 kubelet[2885]: E1216 16:22:43.453188 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:22:43.455475 kubelet[2885]: E1216 16:22:43.455427 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:22:43.549887 kubelet[2885]: I1216 16:22:43.547770 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-p5zp4" podStartSLOduration=49.539863204 podStartE2EDuration="49.539863204s" podCreationTimestamp="2025-12-16 16:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:22:43.537672362 +0000 UTC m=+53.744619583" watchObservedRunningTime="2025-12-16 16:22:43.539863204 +0000 UTC m=+53.746810426" Dec 16 16:22:43.716871 
containerd[1574]: time="2025-12-16T16:22:43.716805755Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:43.718281 containerd[1574]: time="2025-12-16T16:22:43.718144839Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 16:22:43.718281 containerd[1574]: time="2025-12-16T16:22:43.718216185Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 16:22:43.718639 kubelet[2885]: E1216 16:22:43.718543 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 16:22:43.718745 kubelet[2885]: E1216 16:22:43.718656 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 16:22:43.719292 kubelet[2885]: E1216 16:22:43.719070 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:32fcb3ddc81d4eaca54ba9fd665b271e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:43.719696 containerd[1574]: time="2025-12-16T16:22:43.719288358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 16:22:44.023161 
containerd[1574]: time="2025-12-16T16:22:44.022435542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-gt22s,Uid:27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8,Namespace:calico-apiserver,Attempt:0,}" Dec 16 16:22:44.055509 containerd[1574]: time="2025-12-16T16:22:44.054990007Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:44.057748 containerd[1574]: time="2025-12-16T16:22:44.057260249Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 16:22:44.057989 containerd[1574]: time="2025-12-16T16:22:44.057417691Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 16:22:44.059762 kubelet[2885]: E1216 16:22:44.059689 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:22:44.061482 kubelet[2885]: E1216 16:22:44.060923 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:22:44.064109 kubelet[2885]: E1216 16:22:44.063241 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:44.068000 containerd[1574]: time="2025-12-16T16:22:44.067967563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 16:22:44.262204 systemd-networkd[1505]: vxlan.calico: Link UP Dec 16 16:22:44.262216 systemd-networkd[1505]: vxlan.calico: Gained carrier Dec 16 16:22:44.343256 systemd-networkd[1505]: cali86fdc732577: Link UP Dec 16 16:22:44.343561 systemd-networkd[1505]: cali86fdc732577: Gained carrier Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.137 [INFO][4689] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0 calico-apiserver-79b6bb8fb6- calico-apiserver 27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8 851 0 2025-12-16 16:22:06 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:79b6bb8fb6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com calico-apiserver-79b6bb8fb6-gt22s eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali86fdc732577 [] [] }} ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.137 [INFO][4689] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.396169 containerd[1574]: 
2025-12-16 16:22:44.256 [INFO][4711] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" HandleID="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.257 [INFO][4711] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" HandleID="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000304f90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"calico-apiserver-79b6bb8fb6-gt22s", "timestamp":"2025-12-16 16:22:44.256388004 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.257 [INFO][4711] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.258 [INFO][4711] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.258 [INFO][4711] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.276 [INFO][4711] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.287 [INFO][4711] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.297 [INFO][4711] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.300 [INFO][4711] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.304 [INFO][4711] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.304 [INFO][4711] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.308 [INFO][4711] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020 Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.314 [INFO][4711] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.325 [INFO][4711] 
ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.71/26] block=192.168.87.64/26 handle="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.326 [INFO][4711] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.71/26] handle="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.326 [INFO][4711] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 16:22:44.396169 containerd[1574]: 2025-12-16 16:22:44.326 [INFO][4711] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.71/26] IPv6=[] ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" HandleID="k8s-pod-network.8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.415797 containerd[1574]: 2025-12-16 16:22:44.332 [INFO][4689] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0", GenerateName:"calico-apiserver-79b6bb8fb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79b6bb8fb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-79b6bb8fb6-gt22s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.87.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali86fdc732577", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:44.415797 containerd[1574]: 2025-12-16 16:22:44.332 [INFO][4689] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.71/32] ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.415797 containerd[1574]: 2025-12-16 16:22:44.332 [INFO][4689] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali86fdc732577 ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.415797 containerd[1574]: 2025-12-16 16:22:44.342 [INFO][4689] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" 
Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.415797 containerd[1574]: 2025-12-16 16:22:44.348 [INFO][4689] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0", GenerateName:"calico-apiserver-79b6bb8fb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79b6bb8fb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020", Pod:"calico-apiserver-79b6bb8fb6-gt22s", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.87.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali86fdc732577", MAC:"e2:ae:d4:7c:28:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:44.415797 containerd[1574]: 2025-12-16 16:22:44.379 [INFO][4689] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" Namespace="calico-apiserver" Pod="calico-apiserver-79b6bb8fb6-gt22s" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--apiserver--79b6bb8fb6--gt22s-eth0" Dec 16 16:22:44.415797 containerd[1574]: time="2025-12-16T16:22:44.397184916Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:44.415797 containerd[1574]: time="2025-12-16T16:22:44.410461118Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 16:22:44.415797 containerd[1574]: time="2025-12-16T16:22:44.410607835Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 16:22:44.415797 containerd[1574]: time="2025-12-16T16:22:44.413661047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 16:22:44.422202 kubelet[2885]: E1216 16:22:44.410819 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 16:22:44.422202 kubelet[2885]: E1216 16:22:44.410886 2885 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 16:22:44.422202 kubelet[2885]: E1216 16:22:44.412621 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,Sec
compProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:44.422202 kubelet[2885]: E1216 16:22:44.415161 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:22:44.464040 systemd-networkd[1505]: cali2dd89d83458: Gained IPv6LL Dec 16 16:22:44.470169 kubelet[2885]: I1216 16:22:44.469053 2885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-jnqzj" podStartSLOduration=50.469026675 podStartE2EDuration="50.469026675s" podCreationTimestamp="2025-12-16 16:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 16:22:43.563479488 +0000 UTC m=+53.770426707" watchObservedRunningTime="2025-12-16 16:22:44.469026675 +0000 UTC m=+54.675973894" Dec 16 16:22:44.495396 containerd[1574]: time="2025-12-16T16:22:44.494780644Z" level=info msg="connecting to shim 8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020" address="unix:///run/containerd/s/2d1b6a0798e1668334830ad59c62e88dbfdd963b973e76c778edc063e2280820" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:44.506682 kubelet[2885]: E1216 16:22:44.505932 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:22:44.507471 kubelet[2885]: E1216 16:22:44.506026 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:22:44.507956 kubelet[2885]: E1216 16:22:44.507913 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:22:44.606540 systemd[1]: Started cri-containerd-8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020.scope - libcontainer container 8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020. 
Dec 16 16:22:44.719237 systemd-networkd[1505]: calieb447833be2: Gained IPv6LL Dec 16 16:22:44.731234 containerd[1574]: time="2025-12-16T16:22:44.731185143Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:44.734103 containerd[1574]: time="2025-12-16T16:22:44.733212163Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 16:22:44.734103 containerd[1574]: time="2025-12-16T16:22:44.733266202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 16:22:44.734996 kubelet[2885]: E1216 16:22:44.734497 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 16:22:44.735159 kubelet[2885]: E1216 16:22:44.735009 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 16:22:44.735758 kubelet[2885]: E1216 16:22:44.735625 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:44.737131 kubelet[2885]: E1216 16:22:44.737068 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:44.947814 containerd[1574]: time="2025-12-16T16:22:44.947393033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79b6bb8fb6-gt22s,Uid:27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8e11fab6178f9936f8feb2606ac7b35d0674ee8aeb0c6147c0eb57a274083020\"" Dec 16 16:22:44.951790 containerd[1574]: time="2025-12-16T16:22:44.951635266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 16:22:45.262189 containerd[1574]: time="2025-12-16T16:22:45.262028798Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:45.263947 containerd[1574]: time="2025-12-16T16:22:45.263804339Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 16:22:45.263947 containerd[1574]: time="2025-12-16T16:22:45.263815252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 16:22:45.264585 kubelet[2885]: E1216 16:22:45.264238 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:22:45.264585 kubelet[2885]: E1216 16:22:45.264303 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:22:45.264585 kubelet[2885]: E1216 16:22:45.264482 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn5l2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-gt22s_calico-apiserver(27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:45.266626 kubelet[2885]: E1216 16:22:45.265742 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:22:45.506072 kubelet[2885]: E1216 16:22:45.506010 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:22:45.506907 kubelet[2885]: E1216 16:22:45.506849 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:22:45.743318 systemd-networkd[1505]: cali86fdc732577: Gained IPv6LL Dec 16 16:22:45.871286 systemd-networkd[1505]: vxlan.calico: Gained IPv6LL Dec 16 16:22:46.509016 kubelet[2885]: E1216 16:22:46.508961 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:22:51.022355 containerd[1574]: time="2025-12-16T16:22:51.022275893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79db8d8ff-l77mg,Uid:df9fadf4-abe7-46bf-a41a-4f54249b3de9,Namespace:calico-system,Attempt:0,}" Dec 16 16:22:51.192403 systemd-networkd[1505]: calic3ba8c1b439: Link UP Dec 16 16:22:51.194119 systemd-networkd[1505]: calic3ba8c1b439: Gained carrier Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.082 [INFO][4856] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0 calico-kube-controllers-79db8d8ff- 
calico-system df9fadf4-abe7-46bf-a41a-4f54249b3de9 844 0 2025-12-16 16:22:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:79db8d8ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-bfhb9.gb1.brightbox.com calico-kube-controllers-79db8d8ff-l77mg eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic3ba8c1b439 [] [] }} ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.082 [INFO][4856] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.133 [INFO][4868] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" HandleID="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.133 [INFO][4868] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" HandleID="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000124750), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-bfhb9.gb1.brightbox.com", "pod":"calico-kube-controllers-79db8d8ff-l77mg", "timestamp":"2025-12-16 16:22:51.133262925 +0000 UTC"}, Hostname:"srv-bfhb9.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.133 [INFO][4868] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.133 [INFO][4868] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.133 [INFO][4868] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-bfhb9.gb1.brightbox.com' Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.143 [INFO][4868] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.150 [INFO][4868] ipam/ipam.go 394: Looking up existing affinities for host host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.158 [INFO][4868] ipam/ipam.go 511: Trying affinity for 192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.161 [INFO][4868] ipam/ipam.go 158: Attempting to load block cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.164 [INFO][4868] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.87.64/26 host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 
16:22:51.164 [INFO][4868] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.87.64/26 handle="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.167 [INFO][4868] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491 Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.173 [INFO][4868] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.87.64/26 handle="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.184 [INFO][4868] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.87.72/26] block=192.168.87.64/26 handle="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.184 [INFO][4868] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.87.72/26] handle="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" host="srv-bfhb9.gb1.brightbox.com" Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.184 [INFO][4868] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 16:22:51.221756 containerd[1574]: 2025-12-16 16:22:51.184 [INFO][4868] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.87.72/26] IPv6=[] ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" HandleID="k8s-pod-network.3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Workload="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.228380 containerd[1574]: 2025-12-16 16:22:51.188 [INFO][4856] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0", GenerateName:"calico-kube-controllers-79db8d8ff-", Namespace:"calico-system", SelfLink:"", UID:"df9fadf4-abe7-46bf-a41a-4f54249b3de9", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79db8d8ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-79db8d8ff-l77mg", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.87.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3ba8c1b439", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:51.228380 containerd[1574]: 2025-12-16 16:22:51.188 [INFO][4856] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.87.72/32] ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.228380 containerd[1574]: 2025-12-16 16:22:51.188 [INFO][4856] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3ba8c1b439 ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.228380 containerd[1574]: 2025-12-16 16:22:51.194 [INFO][4856] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.228380 containerd[1574]: 2025-12-16 16:22:51.194 [INFO][4856] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" Pod="calico-kube-controllers-79db8d8ff-l77mg" 
WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0", GenerateName:"calico-kube-controllers-79db8d8ff-", Namespace:"calico-system", SelfLink:"", UID:"df9fadf4-abe7-46bf-a41a-4f54249b3de9", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 16, 22, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"79db8d8ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-bfhb9.gb1.brightbox.com", ContainerID:"3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491", Pod:"calico-kube-controllers-79db8d8ff-l77mg", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.87.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3ba8c1b439", MAC:"7a:a3:55:31:1e:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 16:22:51.228380 containerd[1574]: 2025-12-16 16:22:51.213 [INFO][4856] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" Namespace="calico-system" 
Pod="calico-kube-controllers-79db8d8ff-l77mg" WorkloadEndpoint="srv--bfhb9.gb1.brightbox.com-k8s-calico--kube--controllers--79db8d8ff--l77mg-eth0" Dec 16 16:22:51.268530 containerd[1574]: time="2025-12-16T16:22:51.268470018Z" level=info msg="connecting to shim 3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491" address="unix:///run/containerd/s/0c3a163ef91a15a4e8ea7fa50ec695b72b6351069f3214762856e88ff0bf45db" namespace=k8s.io protocol=ttrpc version=3 Dec 16 16:22:51.313317 systemd[1]: Started cri-containerd-3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491.scope - libcontainer container 3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491. Dec 16 16:22:51.393522 containerd[1574]: time="2025-12-16T16:22:51.393466503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-79db8d8ff-l77mg,Uid:df9fadf4-abe7-46bf-a41a-4f54249b3de9,Namespace:calico-system,Attempt:0,} returns sandbox id \"3ca1fe6a1e1588609a696eaf9a5576fe49fb8043d9c55625f2f244fbed165491\"" Dec 16 16:22:51.396445 containerd[1574]: time="2025-12-16T16:22:51.396317836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 16:22:51.723742 containerd[1574]: time="2025-12-16T16:22:51.723637610Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:51.726139 containerd[1574]: time="2025-12-16T16:22:51.726036105Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 16:22:51.726563 containerd[1574]: time="2025-12-16T16:22:51.726096726Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 16:22:51.726870 
kubelet[2885]: E1216 16:22:51.726782 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 16:22:51.727550 kubelet[2885]: E1216 16:22:51.726891 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 16:22:51.727550 kubelet[2885]: E1216 16:22:51.727177 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bu
ndle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wj22d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:51.729324 kubelet[2885]: E1216 16:22:51.728741 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:22:52.335633 systemd-networkd[1505]: calic3ba8c1b439: Gained IPv6LL Dec 16 16:22:52.528801 kubelet[2885]: E1216 16:22:52.528120 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:22:56.024181 containerd[1574]: time="2025-12-16T16:22:56.024059046Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 16:22:56.331127 containerd[1574]: time="2025-12-16T16:22:56.330878106Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:56.332666 containerd[1574]: time="2025-12-16T16:22:56.332610658Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 16:22:56.332850 containerd[1574]: time="2025-12-16T16:22:56.332644157Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 16:22:56.333186 kubelet[2885]: E1216 16:22:56.333134 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 16:22:56.333789 kubelet[2885]: E1216 16:22:56.333205 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 16:22:56.333789 kubelet[2885]: E1216 16:22:56.333425 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sh92g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-57n7d_calico-system(c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:56.335144 kubelet[2885]: E1216 16:22:56.335107 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:22:59.024171 containerd[1574]: time="2025-12-16T16:22:59.023522956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 16:22:59.343538 containerd[1574]: time="2025-12-16T16:22:59.343223159Z" level=info msg="fetch failed after status: 404 
Not Found" host=ghcr.io Dec 16 16:22:59.394573 containerd[1574]: time="2025-12-16T16:22:59.394473379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 16:22:59.395044 containerd[1574]: time="2025-12-16T16:22:59.394545332Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 16:22:59.395361 kubelet[2885]: E1216 16:22:59.395257 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:22:59.395842 kubelet[2885]: E1216 16:22:59.395383 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:22:59.395842 kubelet[2885]: E1216 16:22:59.395709 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn5l2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-gt22s_calico-apiserver(27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 16:22:59.397235 containerd[1574]: time="2025-12-16T16:22:59.397173643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 16:22:59.397702 kubelet[2885]: E1216 16:22:59.397494 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:22:59.749503 containerd[1574]: time="2025-12-16T16:22:59.749369789Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:22:59.751220 containerd[1574]: time="2025-12-16T16:22:59.751066523Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 16:22:59.751220 containerd[1574]: time="2025-12-16T16:22:59.751113317Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 16:22:59.751792 kubelet[2885]: E1216 16:22:59.751431 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 16:22:59.751792 kubelet[2885]: E1216 16:22:59.751697 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 16:22:59.752306 containerd[1574]: time="2025-12-16T16:22:59.752256968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 16:22:59.753348 kubelet[2885]: E1216 16:22:59.753260 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:32fcb3ddc81d4eaca54ba9fd665b271e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},A
ppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:00.066251 containerd[1574]: time="2025-12-16T16:23:00.066005912Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:23:00.067367 containerd[1574]: time="2025-12-16T16:23:00.067311702Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 16:23:00.067445 containerd[1574]: time="2025-12-16T16:23:00.067415202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 16:23:00.067644 kubelet[2885]: E1216 16:23:00.067596 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:23:00.067748 kubelet[2885]: E1216 16:23:00.067658 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:23:00.068870 containerd[1574]: time="2025-12-16T16:23:00.068320327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 16:23:00.068948 kubelet[2885]: E1216 16:23:00.068792 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:f
alse,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:00.385118 containerd[1574]: time="2025-12-16T16:23:00.385014389Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:23:00.386759 containerd[1574]: time="2025-12-16T16:23:00.386631454Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 16:23:00.386759 containerd[1574]: time="2025-12-16T16:23:00.386702510Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 16:23:00.387272 kubelet[2885]: E1216 16:23:00.387162 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:23:00.387467 kubelet[2885]: E1216 16:23:00.387254 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:23:00.387737 kubelet[2885]: E1216 16:23:00.387646 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dn65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-vmcnq_calico-apiserver(475547d4-6717-4e55-a04e-817534e2d535): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:00.388384 containerd[1574]: time="2025-12-16T16:23:00.388258844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 16:23:00.388924 kubelet[2885]: E1216 16:23:00.388877 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:23:00.714235 containerd[1574]: 
time="2025-12-16T16:23:00.713594579Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:23:00.715286 containerd[1574]: time="2025-12-16T16:23:00.715203680Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 16:23:00.715286 containerd[1574]: time="2025-12-16T16:23:00.715248356Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 16:23:00.715512 kubelet[2885]: E1216 16:23:00.715460 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 16:23:00.715936 kubelet[2885]: E1216 16:23:00.715522 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 16:23:00.716287 kubelet[2885]: E1216 16:23:00.716200 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:00.717481 containerd[1574]: time="2025-12-16T16:23:00.717444384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 16:23:00.718071 kubelet[2885]: E1216 16:23:00.718006 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:23:01.066460 containerd[1574]: time="2025-12-16T16:23:01.066269828Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:23:01.070751 containerd[1574]: time="2025-12-16T16:23:01.070683739Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 16:23:01.071216 containerd[1574]: time="2025-12-16T16:23:01.070788287Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 16:23:01.071295 kubelet[2885]: E1216 16:23:01.070962 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 16:23:01.071295 kubelet[2885]: E1216 16:23:01.071024 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 16:23:01.072607 kubelet[2885]: E1216 16:23:01.071822 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:01.074039 kubelet[2885]: E1216 16:23:01.073991 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:23:07.024135 containerd[1574]: time="2025-12-16T16:23:07.023961186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 16:23:07.338606 containerd[1574]: time="2025-12-16T16:23:07.338122642Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:23:07.345118 containerd[1574]: time="2025-12-16T16:23:07.344865002Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 16:23:07.345118 containerd[1574]: time="2025-12-16T16:23:07.345055671Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 16:23:07.345467 
kubelet[2885]: E1216 16:23:07.345394 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 16:23:07.345994 kubelet[2885]: E1216 16:23:07.345498 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 16:23:07.345994 kubelet[2885]: E1216 16:23:07.345718 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bu
ndle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wj22d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:07.347647 kubelet[2885]: E1216 16:23:07.347604 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:23:10.028322 kubelet[2885]: E1216 16:23:10.028265 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:23:11.024221 kubelet[2885]: E1216 16:23:11.023992 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:23:12.024969 kubelet[2885]: E1216 16:23:12.024893 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to 
pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:23:13.026214 kubelet[2885]: E1216 16:23:13.024957 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:23:16.025101 kubelet[2885]: E1216 16:23:16.024932 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:23:21.023235 kubelet[2885]: E1216 16:23:21.023142 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:23:23.024412 containerd[1574]: time="2025-12-16T16:23:23.024334219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 16:23:23.358217 containerd[1574]: time="2025-12-16T16:23:23.358013139Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:23:23.360930 containerd[1574]: time="2025-12-16T16:23:23.360827255Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 16:23:23.360930 containerd[1574]: time="2025-12-16T16:23:23.360888885Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 
16:23:23.361298 kubelet[2885]: E1216 16:23:23.361217 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:23:23.362739 kubelet[2885]: E1216 16:23:23.361312 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:23:23.362739 kubelet[2885]: E1216 16:23:23.361616 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,Recurs
iveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 16:23:23.364155 containerd[1574]: time="2025-12-16T16:23:23.362584816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 16:23:23.461832 systemd[1]: Started sshd@9-10.244.29.226:22-139.178.68.195:53984.service - OpenSSH per-connection server daemon (139.178.68.195:53984). 
Dec 16 16:23:23.694375 containerd[1574]: time="2025-12-16T16:23:23.694306874Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:23.695683 containerd[1574]: time="2025-12-16T16:23:23.695626545Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 16 16:23:23.695766 containerd[1574]: time="2025-12-16T16:23:23.695740941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 16 16:23:23.696057 kubelet[2885]: E1216 16:23:23.695991 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 16 16:23:23.696057 kubelet[2885]: E1216 16:23:23.696053 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 16 16:23:23.697413 kubelet[2885]: E1216 16:23:23.696841 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sh92g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-57n7d_calico-system(c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:23.697752 containerd[1574]: time="2025-12-16T16:23:23.697349680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Dec 16 16:23:23.699164 kubelet[2885]: E1216 16:23:23.699116 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585"
Dec 16 16:23:24.015173 containerd[1574]: time="2025-12-16T16:23:24.014575183Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:24.015940 containerd[1574]: time="2025-12-16T16:23:24.015897401Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Dec 16 16:23:24.016148 containerd[1574]: time="2025-12-16T16:23:24.016014823Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Dec 16 16:23:24.016848 kubelet[2885]: E1216 16:23:24.016740 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 16 16:23:24.017100 kubelet[2885]: E1216 16:23:24.016825 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 16 16:23:24.017808 kubelet[2885]: E1216 16:23:24.017567 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:24.019362 kubelet[2885]: E1216 16:23:24.019147 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628"
Dec 16 16:23:24.023350 containerd[1574]: time="2025-12-16T16:23:24.023295203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 16 16:23:24.374218 containerd[1574]: time="2025-12-16T16:23:24.373673464Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:24.376102 containerd[1574]: time="2025-12-16T16:23:24.374880497Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 16 16:23:24.376102 containerd[1574]: time="2025-12-16T16:23:24.374995132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 16 16:23:24.376708 kubelet[2885]: E1216 16:23:24.376418 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 16 16:23:24.376708 kubelet[2885]: E1216 16:23:24.376484 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 16 16:23:24.377188 kubelet[2885]: E1216 16:23:24.376831 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn5l2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-gt22s_calico-apiserver(27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:24.377461 containerd[1574]: time="2025-12-16T16:23:24.377402335Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 16 16:23:24.378704 kubelet[2885]: E1216 16:23:24.378663 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8"
Dec 16 16:23:24.455108 sshd[4984]: Accepted publickey for core from 139.178.68.195 port 53984 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:23:24.461752 sshd-session[4984]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:23:24.474383 systemd-logind[1549]: New session 12 of user core.
Dec 16 16:23:24.482326 systemd[1]: Started session-12.scope - Session 12 of User core.
Dec 16 16:23:24.767299 containerd[1574]: time="2025-12-16T16:23:24.766647476Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:24.768312 containerd[1574]: time="2025-12-16T16:23:24.768208533Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 16 16:23:24.768464 containerd[1574]: time="2025-12-16T16:23:24.768414452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 16 16:23:24.769820 kubelet[2885]: E1216 16:23:24.769131 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 16 16:23:24.769820 kubelet[2885]: E1216 16:23:24.769195 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 16 16:23:24.769820 kubelet[2885]: E1216 16:23:24.769353 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dn65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-vmcnq_calico-apiserver(475547d4-6717-4e55-a04e-817534e2d535): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:24.771034 kubelet[2885]: E1216 16:23:24.770891 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535"
Dec 16 16:23:25.877564 sshd[4987]: Connection closed by 139.178.68.195 port 53984
Dec 16 16:23:25.878876 sshd-session[4984]: pam_unix(sshd:session): session closed for user core
Dec 16 16:23:25.889874 systemd[1]: sshd@9-10.244.29.226:22-139.178.68.195:53984.service: Deactivated successfully.
Dec 16 16:23:25.891817 systemd-logind[1549]: Session 12 logged out. Waiting for processes to exit.
Dec 16 16:23:25.894443 systemd[1]: session-12.scope: Deactivated successfully.
Dec 16 16:23:25.900675 systemd-logind[1549]: Removed session 12.
Dec 16 16:23:30.026107 containerd[1574]: time="2025-12-16T16:23:30.025864896Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 16 16:23:30.360788 containerd[1574]: time="2025-12-16T16:23:30.360611224Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:30.363097 containerd[1574]: time="2025-12-16T16:23:30.362986071Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 16 16:23:30.363298 containerd[1574]: time="2025-12-16T16:23:30.363219355Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 16 16:23:30.363818 kubelet[2885]: E1216 16:23:30.363703 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 16 16:23:30.364664 kubelet[2885]: E1216 16:23:30.363789 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 16 16:23:30.364664 kubelet[2885]: E1216 16:23:30.364568 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:32fcb3ddc81d4eaca54ba9fd665b271e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:30.366811 containerd[1574]: time="2025-12-16T16:23:30.366587553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 16 16:23:30.683184 containerd[1574]: time="2025-12-16T16:23:30.683069481Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:30.686773 containerd[1574]: time="2025-12-16T16:23:30.686703731Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 16 16:23:30.687096 containerd[1574]: time="2025-12-16T16:23:30.686705016Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 16 16:23:30.687790 kubelet[2885]: E1216 16:23:30.687434 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 16 16:23:30.687790 kubelet[2885]: E1216 16:23:30.687503 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 16 16:23:30.687790 kubelet[2885]: E1216 16:23:30.687666 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:30.691302 kubelet[2885]: E1216 16:23:30.690707 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640"
Dec 16 16:23:31.039524 systemd[1]: Started sshd@10-10.244.29.226:22-139.178.68.195:53936.service - OpenSSH per-connection server daemon (139.178.68.195:53936).
Dec 16 16:23:31.975277 sshd[5010]: Accepted publickey for core from 139.178.68.195 port 53936 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:23:31.978776 sshd-session[5010]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:23:31.990360 systemd-logind[1549]: New session 13 of user core.
Dec 16 16:23:31.997505 systemd[1]: Started session-13.scope - Session 13 of User core.
Dec 16 16:23:32.867109 sshd[5015]: Connection closed by 139.178.68.195 port 53936
Dec 16 16:23:32.867951 sshd-session[5010]: pam_unix(sshd:session): session closed for user core
Dec 16 16:23:32.876887 systemd[1]: sshd@10-10.244.29.226:22-139.178.68.195:53936.service: Deactivated successfully.
Dec 16 16:23:32.880979 systemd[1]: session-13.scope: Deactivated successfully.
Dec 16 16:23:32.884012 systemd-logind[1549]: Session 13 logged out. Waiting for processes to exit.
Dec 16 16:23:32.887402 systemd-logind[1549]: Removed session 13.
Dec 16 16:23:35.024641 containerd[1574]: time="2025-12-16T16:23:35.024591801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 16 16:23:35.028568 kubelet[2885]: E1216 16:23:35.027255 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585"
Dec 16 16:23:35.341113 containerd[1574]: time="2025-12-16T16:23:35.340698108Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 16:23:35.343664 containerd[1574]: time="2025-12-16T16:23:35.343590420Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 16 16:23:35.343757 containerd[1574]: time="2025-12-16T16:23:35.343690611Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 16 16:23:35.344065 kubelet[2885]: E1216 16:23:35.343999 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 16 16:23:35.344408 kubelet[2885]: E1216 16:23:35.344066 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 16 16:23:35.344408 kubelet[2885]: E1216 16:23:35.344267 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wj22d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:23:35.345701 kubelet[2885]: E1216 16:23:35.345661 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9"
Dec 16 16:23:36.026700 kubelet[2885]: E1216 16:23:36.026651 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535"
Dec 16 16:23:36.027025 kubelet[2885]: E1216 16:23:36.026865 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8"
Dec 16 16:23:36.027274 kubelet[2885]: E1216 16:23:36.027195 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628"
Dec 16 16:23:38.029881 systemd[1]: Started sshd@11-10.244.29.226:22-139.178.68.195:53944.service - OpenSSH per-connection server daemon (139.178.68.195:53944).
Dec 16 16:23:38.963288 sshd[5028]: Accepted publickey for core from 139.178.68.195 port 53944 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:23:38.966000 sshd-session[5028]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:23:38.976692 systemd-logind[1549]: New session 14 of user core.
Dec 16 16:23:38.983341 systemd[1]: Started session-14.scope - Session 14 of User core.
Dec 16 16:23:39.704820 sshd[5031]: Connection closed by 139.178.68.195 port 53944
Dec 16 16:23:39.705758 sshd-session[5028]: pam_unix(sshd:session): session closed for user core
Dec 16 16:23:39.711521 systemd-logind[1549]: Session 14 logged out. Waiting for processes to exit.
Dec 16 16:23:39.711777 systemd[1]: sshd@11-10.244.29.226:22-139.178.68.195:53944.service: Deactivated successfully.
Dec 16 16:23:39.714846 systemd[1]: session-14.scope: Deactivated successfully.
Dec 16 16:23:39.716587 systemd-logind[1549]: Removed session 14.
Dec 16 16:23:39.864020 systemd[1]: Started sshd@12-10.244.29.226:22-139.178.68.195:53954.service - OpenSSH per-connection server daemon (139.178.68.195:53954).
Dec 16 16:23:40.806417 sshd[5044]: Accepted publickey for core from 139.178.68.195 port 53954 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:23:40.808919 sshd-session[5044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:23:40.819348 systemd-logind[1549]: New session 15 of user core. Dec 16 16:23:40.827524 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 16:23:41.748620 sshd[5072]: Connection closed by 139.178.68.195 port 53954 Dec 16 16:23:41.749169 sshd-session[5044]: pam_unix(sshd:session): session closed for user core Dec 16 16:23:41.757674 systemd[1]: sshd@12-10.244.29.226:22-139.178.68.195:53954.service: Deactivated successfully. Dec 16 16:23:41.761690 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 16:23:41.764962 systemd-logind[1549]: Session 15 logged out. Waiting for processes to exit. Dec 16 16:23:41.768025 systemd-logind[1549]: Removed session 15. Dec 16 16:23:41.910423 systemd[1]: Started sshd@13-10.244.29.226:22-139.178.68.195:51696.service - OpenSSH per-connection server daemon (139.178.68.195:51696). Dec 16 16:23:42.849114 sshd[5082]: Accepted publickey for core from 139.178.68.195 port 51696 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:23:42.851734 sshd-session[5082]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:23:42.862047 systemd-logind[1549]: New session 16 of user core. Dec 16 16:23:42.870805 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 16:23:43.640537 sshd[5085]: Connection closed by 139.178.68.195 port 51696 Dec 16 16:23:43.640400 sshd-session[5082]: pam_unix(sshd:session): session closed for user core Dec 16 16:23:43.648606 systemd[1]: sshd@13-10.244.29.226:22-139.178.68.195:51696.service: Deactivated successfully. Dec 16 16:23:43.654239 systemd[1]: session-16.scope: Deactivated successfully. 
Dec 16 16:23:43.656947 systemd-logind[1549]: Session 16 logged out. Waiting for processes to exit. Dec 16 16:23:43.662631 systemd-logind[1549]: Removed session 16. Dec 16 16:23:45.024611 kubelet[2885]: E1216 16:23:45.024542 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:23:47.023197 kubelet[2885]: E1216 16:23:47.022096 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:23:47.025126 kubelet[2885]: E1216 16:23:47.025025 2885 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:23:48.802251 systemd[1]: Started sshd@14-10.244.29.226:22-139.178.68.195:51706.service - OpenSSH per-connection server daemon (139.178.68.195:51706). Dec 16 16:23:49.748382 sshd[5103]: Accepted publickey for core from 139.178.68.195 port 51706 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:23:49.750232 sshd-session[5103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:23:49.762068 systemd-logind[1549]: New session 17 of user core. Dec 16 16:23:49.767331 systemd[1]: Started session-17.scope - Session 17 of User core. 
Dec 16 16:23:50.025980 kubelet[2885]: E1216 16:23:50.025178 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:23:50.487308 sshd[5106]: Connection closed by 139.178.68.195 port 51706 Dec 16 16:23:50.486368 sshd-session[5103]: pam_unix(sshd:session): session closed for user core Dec 16 16:23:50.493189 systemd[1]: sshd@14-10.244.29.226:22-139.178.68.195:51706.service: Deactivated successfully. Dec 16 16:23:50.498421 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 16:23:50.501851 systemd-logind[1549]: Session 17 logged out. Waiting for processes to exit. Dec 16 16:23:50.505114 systemd-logind[1549]: Removed session 17. 
Dec 16 16:23:51.026102 kubelet[2885]: E1216 16:23:51.025885 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:23:51.026717 kubelet[2885]: E1216 16:23:51.026121 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:23:55.657371 systemd[1]: Started sshd@15-10.244.29.226:22-139.178.68.195:58496.service - OpenSSH per-connection server daemon (139.178.68.195:58496). Dec 16 16:23:56.597105 sshd[5123]: Accepted publickey for core from 139.178.68.195 port 58496 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:23:56.599278 sshd-session[5123]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:23:56.609421 systemd-logind[1549]: New session 18 of user core. Dec 16 16:23:56.617382 systemd[1]: Started session-18.scope - Session 18 of User core. 
Dec 16 16:23:57.402289 sshd[5126]: Connection closed by 139.178.68.195 port 58496 Dec 16 16:23:57.403171 sshd-session[5123]: pam_unix(sshd:session): session closed for user core Dec 16 16:23:57.412809 systemd[1]: sshd@15-10.244.29.226:22-139.178.68.195:58496.service: Deactivated successfully. Dec 16 16:23:57.417988 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 16:23:57.420926 systemd-logind[1549]: Session 18 logged out. Waiting for processes to exit. Dec 16 16:23:57.422980 systemd-logind[1549]: Removed session 18. Dec 16 16:23:58.023678 kubelet[2885]: E1216 16:23:58.023597 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:23:59.025305 kubelet[2885]: E1216 16:23:59.025166 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:24:00.028062 kubelet[2885]: E1216 16:24:00.027905 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:24:02.023601 kubelet[2885]: E1216 16:24:02.023493 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" 
Dec 16 16:24:02.570391 systemd[1]: Started sshd@16-10.244.29.226:22-139.178.68.195:53798.service - OpenSSH per-connection server daemon (139.178.68.195:53798). Dec 16 16:24:03.022667 kubelet[2885]: E1216 16:24:03.022602 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:24:03.528102 sshd[5138]: Accepted publickey for core from 139.178.68.195 port 53798 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:24:03.529639 sshd-session[5138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:24:03.540171 systemd-logind[1549]: New session 19 of user core. Dec 16 16:24:03.545322 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 16:24:04.344108 sshd[5141]: Connection closed by 139.178.68.195 port 53798 Dec 16 16:24:04.343918 sshd-session[5138]: pam_unix(sshd:session): session closed for user core Dec 16 16:24:04.353032 systemd[1]: sshd@16-10.244.29.226:22-139.178.68.195:53798.service: Deactivated successfully. Dec 16 16:24:04.356862 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 16:24:04.361972 systemd-logind[1549]: Session 19 logged out. Waiting for processes to exit. Dec 16 16:24:04.364721 systemd-logind[1549]: Removed session 19. Dec 16 16:24:04.508344 systemd[1]: Started sshd@17-10.244.29.226:22-139.178.68.195:53802.service - OpenSSH per-connection server daemon (139.178.68.195:53802). 
Dec 16 16:24:05.023738 containerd[1574]: time="2025-12-16T16:24:05.023677845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 16:24:05.336937 containerd[1574]: time="2025-12-16T16:24:05.336794606Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:05.338489 containerd[1574]: time="2025-12-16T16:24:05.338446672Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 16:24:05.338577 containerd[1574]: time="2025-12-16T16:24:05.338554787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 16:24:05.339014 kubelet[2885]: E1216 16:24:05.338898 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 16:24:05.339014 kubelet[2885]: E1216 16:24:05.338983 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 16:24:05.340091 kubelet[2885]: E1216 16:24:05.339810 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sh92g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-57n7d_calico-system(c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:05.341726 kubelet[2885]: E1216 16:24:05.341687 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:24:05.466038 sshd[5153]: Accepted publickey for core from 139.178.68.195 port 53802 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:24:05.467847 sshd-session[5153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) 
Dec 16 16:24:05.477439 systemd-logind[1549]: New session 20 of user core. Dec 16 16:24:05.483466 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 16 16:24:06.671439 sshd[5162]: Connection closed by 139.178.68.195 port 53802 Dec 16 16:24:06.678371 sshd-session[5153]: pam_unix(sshd:session): session closed for user core Dec 16 16:24:06.696346 systemd[1]: sshd@17-10.244.29.226:22-139.178.68.195:53802.service: Deactivated successfully. Dec 16 16:24:06.702220 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 16:24:06.703758 systemd-logind[1549]: Session 20 logged out. Waiting for processes to exit. Dec 16 16:24:06.706708 systemd-logind[1549]: Removed session 20. Dec 16 16:24:06.833581 systemd[1]: Started sshd@18-10.244.29.226:22-139.178.68.195:53806.service - OpenSSH per-connection server daemon (139.178.68.195:53806). Dec 16 16:24:07.828173 sshd[5172]: Accepted publickey for core from 139.178.68.195 port 53806 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:24:07.831593 sshd-session[5172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:24:07.843331 systemd-logind[1549]: New session 21 of user core. Dec 16 16:24:07.853334 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 16 16:24:09.427112 sshd[5175]: Connection closed by 139.178.68.195 port 53806 Dec 16 16:24:09.428354 sshd-session[5172]: pam_unix(sshd:session): session closed for user core Dec 16 16:24:09.441113 systemd[1]: sshd@18-10.244.29.226:22-139.178.68.195:53806.service: Deactivated successfully. Dec 16 16:24:09.448284 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 16:24:09.456335 systemd-logind[1549]: Session 21 logged out. Waiting for processes to exit. Dec 16 16:24:09.460133 systemd-logind[1549]: Removed session 21. Dec 16 16:24:09.587879 systemd[1]: Started sshd@19-10.244.29.226:22-139.178.68.195:53812.service - OpenSSH per-connection server daemon (139.178.68.195:53812). 
Dec 16 16:24:10.573106 sshd[5192]: Accepted publickey for core from 139.178.68.195 port 53812 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:24:10.576313 sshd-session[5192]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:24:10.590410 systemd-logind[1549]: New session 22 of user core. Dec 16 16:24:10.595687 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 16 16:24:11.692560 sshd[5221]: Connection closed by 139.178.68.195 port 53812 Dec 16 16:24:11.693481 sshd-session[5192]: pam_unix(sshd:session): session closed for user core Dec 16 16:24:11.701856 systemd[1]: sshd@19-10.244.29.226:22-139.178.68.195:53812.service: Deactivated successfully. Dec 16 16:24:11.706040 systemd[1]: session-22.scope: Deactivated successfully. Dec 16 16:24:11.710678 systemd-logind[1549]: Session 22 logged out. Waiting for processes to exit. Dec 16 16:24:11.713689 systemd-logind[1549]: Removed session 22. Dec 16 16:24:11.854895 systemd[1]: Started sshd@20-10.244.29.226:22-139.178.68.195:58762.service - OpenSSH per-connection server daemon (139.178.68.195:58762). 
Dec 16 16:24:12.029208 kubelet[2885]: E1216 16:24:12.028830 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9" Dec 16 16:24:12.786864 sshd[5234]: Accepted publickey for core from 139.178.68.195 port 58762 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:24:12.787698 sshd-session[5234]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:24:12.801150 systemd-logind[1549]: New session 23 of user core. Dec 16 16:24:12.806273 systemd[1]: Started session-23.scope - Session 23 of User core. 
Dec 16 16:24:13.024741 containerd[1574]: time="2025-12-16T16:24:13.024673205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 16:24:13.374786 containerd[1574]: time="2025-12-16T16:24:13.374453348Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:13.378924 containerd[1574]: time="2025-12-16T16:24:13.378712105Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 16:24:13.378924 containerd[1574]: time="2025-12-16T16:24:13.378866332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 16:24:13.379441 kubelet[2885]: E1216 16:24:13.379311 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 16:24:13.379441 kubelet[2885]: E1216 16:24:13.379407 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 16:24:13.380878 kubelet[2885]: E1216 16:24:13.380816 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:32fcb3ddc81d4eaca54ba9fd665b271e,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:13.385772 containerd[1574]: time="2025-12-16T16:24:13.385723190Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 
16:24:13.605627 sshd[5237]: Connection closed by 139.178.68.195 port 58762 Dec 16 16:24:13.604329 sshd-session[5234]: pam_unix(sshd:session): session closed for user core Dec 16 16:24:13.612407 systemd[1]: sshd@20-10.244.29.226:22-139.178.68.195:58762.service: Deactivated successfully. Dec 16 16:24:13.618991 systemd[1]: session-23.scope: Deactivated successfully. Dec 16 16:24:13.625291 systemd-logind[1549]: Session 23 logged out. Waiting for processes to exit. Dec 16 16:24:13.628466 systemd-logind[1549]: Removed session 23. Dec 16 16:24:13.730033 containerd[1574]: time="2025-12-16T16:24:13.729958713Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:13.731306 containerd[1574]: time="2025-12-16T16:24:13.731226428Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 16:24:13.731514 containerd[1574]: time="2025-12-16T16:24:13.731243245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 16:24:13.733469 kubelet[2885]: E1216 16:24:13.733400 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 16:24:13.733559 kubelet[2885]: E1216 16:24:13.733478 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 16:24:13.733762 kubelet[2885]: E1216 16:24:13.733661 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9w6d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevice
s:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-6bd47f9749-fdxr4_calico-system(9c3bf940-f9bd-460f-a7db-19023b314640): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:13.735215 kubelet[2885]: E1216 16:24:13.735156 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:24:14.028690 containerd[1574]: time="2025-12-16T16:24:14.026326724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 16:24:14.371231 containerd[1574]: time="2025-12-16T16:24:14.370810258Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:14.372665 containerd[1574]: time="2025-12-16T16:24:14.372605912Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 16:24:14.372766 containerd[1574]: time="2025-12-16T16:24:14.372722524Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 16:24:14.373317 kubelet[2885]: E1216 16:24:14.373265 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:24:14.373519 kubelet[2885]: E1216 16:24:14.373463 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 16:24:14.374131 kubelet[2885]: E1216 16:24:14.373838 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:14.374428 containerd[1574]: time="2025-12-16T16:24:14.374056571Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 16:24:14.687712 containerd[1574]: time="2025-12-16T16:24:14.687578405Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:14.694478 containerd[1574]: time="2025-12-16T16:24:14.694313645Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 16:24:14.694478 containerd[1574]: time="2025-12-16T16:24:14.694426174Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 16:24:14.695190 kubelet[2885]: E1216 16:24:14.694697 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:24:14.695190 kubelet[2885]: E1216 16:24:14.694765 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:24:14.695190 kubelet[2885]: E1216 16:24:14.695040 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn5l2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-gt22s_calico-apiserver(27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:14.697135 kubelet[2885]: E1216 16:24:14.696355 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8" Dec 16 16:24:14.697349 containerd[1574]: time="2025-12-16T16:24:14.696524243Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 16:24:15.066008 containerd[1574]: 
time="2025-12-16T16:24:15.064292976Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:15.066590 containerd[1574]: time="2025-12-16T16:24:15.065993717Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 16:24:15.066590 containerd[1574]: time="2025-12-16T16:24:15.066055754Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 16:24:15.067553 kubelet[2885]: E1216 16:24:15.066874 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 16:24:15.067553 kubelet[2885]: E1216 16:24:15.066942 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 16:24:15.067553 kubelet[2885]: E1216 16:24:15.067144 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kfls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-44s4j_calico-system(79791809-4f39-420a-be3e-00a912b46628): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:15.068791 kubelet[2885]: E1216 16:24:15.068573 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628" Dec 16 16:24:17.040100 containerd[1574]: time="2025-12-16T16:24:17.040019991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 16:24:17.391123 containerd[1574]: time="2025-12-16T16:24:17.391037886Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:17.392486 containerd[1574]: time="2025-12-16T16:24:17.392405722Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 16:24:17.392486 containerd[1574]: time="2025-12-16T16:24:17.392450125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 16:24:17.392880 kubelet[2885]: E1216 16:24:17.392816 2885 log.go:32] 
"PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:24:17.393883 kubelet[2885]: E1216 16:24:17.392891 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 16:24:17.394172 kubelet[2885]: E1216 16:24:17.393066 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dn65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-79b6bb8fb6-vmcnq_calico-apiserver(475547d4-6717-4e55-a04e-817534e2d535): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 16:24:17.395510 kubelet[2885]: E1216 16:24:17.395456 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535" Dec 16 16:24:18.024027 kubelet[2885]: E1216 16:24:18.023553 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585" Dec 16 16:24:18.762540 systemd[1]: Started sshd@21-10.244.29.226:22-139.178.68.195:58768.service - OpenSSH per-connection server daemon (139.178.68.195:58768). Dec 16 16:24:19.705892 sshd[5263]: Accepted publickey for core from 139.178.68.195 port 58768 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM Dec 16 16:24:19.711316 sshd-session[5263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 16:24:19.722734 systemd-logind[1549]: New session 24 of user core. Dec 16 16:24:19.733364 systemd[1]: Started session-24.scope - Session 24 of User core. 
Dec 16 16:24:20.501881 sshd[5266]: Connection closed by 139.178.68.195 port 58768 Dec 16 16:24:20.505323 sshd-session[5263]: pam_unix(sshd:session): session closed for user core Dec 16 16:24:20.514001 systemd[1]: sshd@21-10.244.29.226:22-139.178.68.195:58768.service: Deactivated successfully. Dec 16 16:24:20.518760 systemd[1]: session-24.scope: Deactivated successfully. Dec 16 16:24:20.521184 systemd-logind[1549]: Session 24 logged out. Waiting for processes to exit. Dec 16 16:24:20.523091 systemd-logind[1549]: Removed session 24. Dec 16 16:24:25.023353 kubelet[2885]: E1216 16:24:25.023260 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640" Dec 16 16:24:25.661396 systemd[1]: Started sshd@22-10.244.29.226:22-139.178.68.195:39700.service - OpenSSH per-connection server daemon (139.178.68.195:39700). 
Dec 16 16:24:26.027055 containerd[1574]: time="2025-12-16T16:24:26.026569147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 16:24:26.383449 containerd[1574]: time="2025-12-16T16:24:26.383274846Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 16:24:26.385213 containerd[1574]: time="2025-12-16T16:24:26.385158988Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 16:24:26.385327 containerd[1574]: time="2025-12-16T16:24:26.385271084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 16:24:26.387376 kubelet[2885]: E1216 16:24:26.387285 2885 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 16:24:26.387376 kubelet[2885]: E1216 16:24:26.387372 2885 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 16:24:26.387939 kubelet[2885]: E1216 16:24:26.387559 2885 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wj22d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-79db8d8ff-l77mg_calico-system(df9fadf4-abe7-46bf-a41a-4f54249b3de9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 16 16:24:26.389286 kubelet[2885]: E1216 16:24:26.389236 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-79db8d8ff-l77mg" podUID="df9fadf4-abe7-46bf-a41a-4f54249b3de9"
Dec 16 16:24:26.588110 sshd[5287]: Accepted publickey for core from 139.178.68.195 port 39700 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:24:26.591578 sshd-session[5287]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:24:26.603163 systemd-logind[1549]: New session 25 of user core.
Dec 16 16:24:26.610433 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 16 16:24:27.024104 kubelet[2885]: E1216 16:24:27.023709 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-gt22s" podUID="27b8eb94-af6b-44b6-a0c4-8cd0e3a973f8"
Dec 16 16:24:27.320500 sshd[5290]: Connection closed by 139.178.68.195 port 39700
Dec 16 16:24:27.321440 sshd-session[5287]: pam_unix(sshd:session): session closed for user core
Dec 16 16:24:27.330780 systemd-logind[1549]: Session 25 logged out. Waiting for processes to exit.
Dec 16 16:24:27.332346 systemd[1]: sshd@22-10.244.29.226:22-139.178.68.195:39700.service: Deactivated successfully.
Dec 16 16:24:27.337889 systemd[1]: session-25.scope: Deactivated successfully.
Dec 16 16:24:27.340957 systemd-logind[1549]: Removed session 25.
Dec 16 16:24:29.024201 kubelet[2885]: E1216 16:24:29.024140 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-57n7d" podUID="c1794fb1-0f3d-47eb-a0fe-5d4e2aa97585"
Dec 16 16:24:29.025894 kubelet[2885]: E1216 16:24:29.025548 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79b6bb8fb6-vmcnq" podUID="475547d4-6717-4e55-a04e-817534e2d535"
Dec 16 16:24:30.028050 kubelet[2885]: E1216 16:24:30.027805 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-44s4j" podUID="79791809-4f39-420a-be3e-00a912b46628"
Dec 16 16:24:32.484393 systemd[1]: Started sshd@23-10.244.29.226:22-139.178.68.195:48948.service - OpenSSH per-connection server daemon (139.178.68.195:48948).
Dec 16 16:24:33.419285 sshd[5302]: Accepted publickey for core from 139.178.68.195 port 48948 ssh2: RSA SHA256:aWRHM7yqDy00ChHK+O7mKYt3bRdoTZshpl3R3naUTkM
Dec 16 16:24:33.424005 sshd-session[5302]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 16:24:33.434513 systemd-logind[1549]: New session 26 of user core.
Dec 16 16:24:33.440679 systemd[1]: Started session-26.scope - Session 26 of User core.
Dec 16 16:24:34.209276 sshd[5305]: Connection closed by 139.178.68.195 port 48948
Dec 16 16:24:34.210950 sshd-session[5302]: pam_unix(sshd:session): session closed for user core
Dec 16 16:24:34.224258 systemd[1]: sshd@23-10.244.29.226:22-139.178.68.195:48948.service: Deactivated successfully.
Dec 16 16:24:34.227909 systemd[1]: session-26.scope: Deactivated successfully.
Dec 16 16:24:34.232057 systemd-logind[1549]: Session 26 logged out. Waiting for processes to exit.
Dec 16 16:24:34.235444 systemd-logind[1549]: Removed session 26.
Dec 16 16:24:37.026371 kubelet[2885]: E1216 16:24:37.026285 2885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-6bd47f9749-fdxr4" podUID="9c3bf940-f9bd-460f-a7db-19023b314640"
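Every kubelet error above shares one root cause: containerd returns `NotFound` when resolving the `ghcr.io/flatcar/calico/*:v3.30.4` tags, so the pods cycle through ErrImagePull and ImagePullBackOff. When triaging journal output like this, a first step is to collect the distinct image references that failed to resolve. The sketch below is a hypothetical helper (not part of kubelet, containerd, or any Flatcar tooling) that extracts them from raw journal text; it assumes the containerd error wording seen in these entries, where the image reference follows "failed to pull and unpack image" inside one or more levels of backslash-escaped quotes.

```python
import re

# Matches the image reference in containerd pull errors as they appear in
# journal text, e.g.:  failed to pull and unpack image \"ghcr.io/...:v3.30.4\"
# or, when nested inside a kubelet err="..." field:  \\\"ghcr.io/...:v3.30.4\\\"
# The \\+ accepts any depth of backslash escaping before the opening quote.
PULL_NOT_FOUND = re.compile(r'failed to pull and unpack image \\+"([^"\\]+)')


def missing_images(log_text: str) -> list[str]:
    """Return the unique image references that failed to pull, sorted."""
    return sorted(set(PULL_NOT_FOUND.findall(log_text)))
```

Fed the journal excerpt above, this would report each affected reference once (kube-controllers, apiserver, goldmane, csi, node-driver-registrar, whisker, whisker-backend, all at v3.30.4), which narrows the problem to a single question: whether those tags actually exist in the `ghcr.io/flatcar/calico` namespace or the manifests reference the wrong tag.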