Dec 12 20:04:19.979987 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025
Dec 12 20:04:19.980036 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 20:04:19.980050 kernel: BIOS-provided physical RAM map:
Dec 12 20:04:19.980061 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 12 20:04:19.980076 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 12 20:04:19.980086 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 12 20:04:19.980098 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Dec 12 20:04:19.980108 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Dec 12 20:04:19.980119 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Dec 12 20:04:19.980130 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Dec 12 20:04:19.980141 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 12 20:04:19.980151 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 12 20:04:19.980162 kernel: NX (Execute Disable) protection: active
Dec 12 20:04:19.980177 kernel: APIC: Static calls initialized
Dec 12 20:04:19.980189 kernel: SMBIOS 2.8 present.
Dec 12 20:04:19.980201 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Dec 12 20:04:19.980213 kernel: DMI: Memory slots populated: 1/1
Dec 12 20:04:19.980224 kernel: Hypervisor detected: KVM
Dec 12 20:04:19.980235 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Dec 12 20:04:19.980251 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 12 20:04:19.980262 kernel: kvm-clock: using sched offset of 5799560201 cycles
Dec 12 20:04:19.980275 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 12 20:04:19.982065 kernel: tsc: Detected 2499.998 MHz processor
Dec 12 20:04:19.982100 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 12 20:04:19.982115 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 12 20:04:19.982127 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Dec 12 20:04:19.982138 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 12 20:04:19.982150 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 12 20:04:19.982170 kernel: Using GB pages for direct mapping
Dec 12 20:04:19.982182 kernel: ACPI: Early table checksum verification disabled
Dec 12 20:04:19.982194 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 12 20:04:19.982205 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982217 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982229 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982240 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Dec 12 20:04:19.982252 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982264 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982280 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982314 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 12 20:04:19.982327 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Dec 12 20:04:19.982358 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Dec 12 20:04:19.982370 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Dec 12 20:04:19.982382 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Dec 12 20:04:19.982399 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Dec 12 20:04:19.982411 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Dec 12 20:04:19.982423 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Dec 12 20:04:19.982435 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Dec 12 20:04:19.982447 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Dec 12 20:04:19.982459 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Dec 12 20:04:19.982472 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff]
Dec 12 20:04:19.982484 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff]
Dec 12 20:04:19.982501 kernel: Zone ranges:
Dec 12 20:04:19.982513 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 12 20:04:19.982525 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Dec 12 20:04:19.982537 kernel: Normal empty
Dec 12 20:04:19.982549 kernel: Device empty
Dec 12 20:04:19.982561 kernel: Movable zone start for each node
Dec 12 20:04:19.982573 kernel: Early memory node ranges
Dec 12 20:04:19.982585 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Dec 12 20:04:19.982597 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Dec 12 20:04:19.982613 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Dec 12 20:04:19.982626 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 12 20:04:19.982638 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 12 20:04:19.982650 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Dec 12 20:04:19.982662 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 12 20:04:19.982674 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 12 20:04:19.982686 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 12 20:04:19.982698 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 12 20:04:19.982710 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 12 20:04:19.982722 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 12 20:04:19.982739 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 12 20:04:19.982751 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 12 20:04:19.982763 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 12 20:04:19.982775 kernel: TSC deadline timer available
Dec 12 20:04:19.982787 kernel: CPU topo: Max. logical packages: 16
Dec 12 20:04:19.982798 kernel: CPU topo: Max. logical dies: 16
Dec 12 20:04:19.982810 kernel: CPU topo: Max. dies per package: 1
Dec 12 20:04:19.982822 kernel: CPU topo: Max. threads per core: 1
Dec 12 20:04:19.982834 kernel: CPU topo: Num. cores per package: 1
Dec 12 20:04:19.982850 kernel: CPU topo: Num. threads per package: 1
Dec 12 20:04:19.982862 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs
Dec 12 20:04:19.982874 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 12 20:04:19.982886 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Dec 12 20:04:19.982898 kernel: Booting paravirtualized kernel on KVM
Dec 12 20:04:19.982910 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 12 20:04:19.982923 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Dec 12 20:04:19.982935 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Dec 12 20:04:19.982947 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Dec 12 20:04:19.982963 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Dec 12 20:04:19.982975 kernel: kvm-guest: PV spinlocks enabled
Dec 12 20:04:19.982987 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Dec 12 20:04:19.983001 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 20:04:19.983013 kernel: random: crng init done
Dec 12 20:04:19.983025 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 12 20:04:19.983037 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 12 20:04:19.983050 kernel: Fallback order for Node 0: 0
Dec 12 20:04:19.983066 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154
Dec 12 20:04:19.983078 kernel: Policy zone: DMA32
Dec 12 20:04:19.983090 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 12 20:04:19.983114 kernel: software IO TLB: area num 16.
Dec 12 20:04:19.983130 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Dec 12 20:04:19.983143 kernel: Kernel/User page tables isolation: enabled
Dec 12 20:04:19.983155 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 12 20:04:19.983167 kernel: ftrace: allocated 157 pages with 5 groups
Dec 12 20:04:19.983179 kernel: Dynamic Preempt: voluntary
Dec 12 20:04:19.983196 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 12 20:04:19.983210 kernel: rcu: RCU event tracing is enabled.
Dec 12 20:04:19.983222 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Dec 12 20:04:19.983234 kernel: Trampoline variant of Tasks RCU enabled.
Dec 12 20:04:19.983247 kernel: Rude variant of Tasks RCU enabled.
Dec 12 20:04:19.983259 kernel: Tracing variant of Tasks RCU enabled.
Dec 12 20:04:19.983270 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 12 20:04:19.983282 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Dec 12 20:04:19.983310 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Dec 12 20:04:19.983328 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Dec 12 20:04:19.983351 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Dec 12 20:04:19.983363 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Dec 12 20:04:19.983376 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 12 20:04:19.983399 kernel: Console: colour VGA+ 80x25
Dec 12 20:04:19.983416 kernel: printk: legacy console [tty0] enabled
Dec 12 20:04:19.983429 kernel: printk: legacy console [ttyS0] enabled
Dec 12 20:04:19.983441 kernel: ACPI: Core revision 20240827
Dec 12 20:04:19.983454 kernel: APIC: Switch to symmetric I/O mode setup
Dec 12 20:04:19.983466 kernel: x2apic enabled
Dec 12 20:04:19.983479 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 12 20:04:19.983492 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Dec 12 20:04:19.983509 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Dec 12 20:04:19.983522 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 12 20:04:19.983535 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Dec 12 20:04:19.983547 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Dec 12 20:04:19.983559 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 12 20:04:19.983576 kernel: Spectre V2 : Mitigation: Retpolines
Dec 12 20:04:19.983588 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 12 20:04:19.983601 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Dec 12 20:04:19.983614 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 12 20:04:19.983626 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 12 20:04:19.983639 kernel: MDS: Mitigation: Clear CPU buffers
Dec 12 20:04:19.983652 kernel: MMIO Stale Data: Unknown: No mitigations
Dec 12 20:04:19.983664 kernel: SRBDS: Unknown: Dependent on hypervisor status
Dec 12 20:04:19.983676 kernel: active return thunk: its_return_thunk
Dec 12 20:04:19.983689 kernel: ITS: Mitigation: Aligned branch/return thunks
Dec 12 20:04:19.983701 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 12 20:04:19.983719 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 12 20:04:19.983731 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 12 20:04:19.983744 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 12 20:04:19.983756 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 12 20:04:19.983769 kernel: Freeing SMP alternatives memory: 32K
Dec 12 20:04:19.983781 kernel: pid_max: default: 32768 minimum: 301
Dec 12 20:04:19.983794 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 12 20:04:19.983806 kernel: landlock: Up and running.
Dec 12 20:04:19.983819 kernel: SELinux: Initializing.
Dec 12 20:04:19.983831 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 12 20:04:19.983844 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 12 20:04:19.983861 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Dec 12 20:04:19.983874 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Dec 12 20:04:19.983887 kernel: signal: max sigframe size: 1776
Dec 12 20:04:19.983900 kernel: rcu: Hierarchical SRCU implementation.
Dec 12 20:04:19.983913 kernel: rcu: Max phase no-delay instances is 400.
Dec 12 20:04:19.983926 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Dec 12 20:04:19.983938 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Dec 12 20:04:19.983951 kernel: smp: Bringing up secondary CPUs ...
Dec 12 20:04:19.983963 kernel: smpboot: x86: Booting SMP configuration:
Dec 12 20:04:19.983976 kernel: .... node #0, CPUs: #1
Dec 12 20:04:19.983993 kernel: smp: Brought up 1 node, 2 CPUs
Dec 12 20:04:19.984006 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Dec 12 20:04:19.984019 kernel: Memory: 1887484K/2096616K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 203116K reserved, 0K cma-reserved)
Dec 12 20:04:19.984032 kernel: devtmpfs: initialized
Dec 12 20:04:19.984045 kernel: x86/mm: Memory block size: 128MB
Dec 12 20:04:19.984058 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 12 20:04:19.984070 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Dec 12 20:04:19.984083 kernel: pinctrl core: initialized pinctrl subsystem
Dec 12 20:04:19.984100 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 12 20:04:19.984113 kernel: audit: initializing netlink subsys (disabled)
Dec 12 20:04:19.984125 kernel: audit: type=2000 audit(1765569855.482:1): state=initialized audit_enabled=0 res=1
Dec 12 20:04:19.984138 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 12 20:04:19.984151 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 12 20:04:19.984163 kernel: cpuidle: using governor menu
Dec 12 20:04:19.984176 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 12 20:04:19.984189 kernel: dca service started, version 1.12.1
Dec 12 20:04:19.984201 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Dec 12 20:04:19.984218 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Dec 12 20:04:19.984231 kernel: PCI: Using configuration type 1 for base access
Dec 12 20:04:19.984244 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 12 20:04:19.984256 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 12 20:04:19.984270 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 12 20:04:19.984282 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 12 20:04:19.986345 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 12 20:04:19.986362 kernel: ACPI: Added _OSI(Module Device)
Dec 12 20:04:19.986376 kernel: ACPI: Added _OSI(Processor Device)
Dec 12 20:04:19.986397 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 12 20:04:19.986410 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 12 20:04:19.986422 kernel: ACPI: Interpreter enabled
Dec 12 20:04:19.986435 kernel: ACPI: PM: (supports S0 S5)
Dec 12 20:04:19.986448 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 12 20:04:19.986461 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 12 20:04:19.986473 kernel: PCI: Using E820 reservations for host bridge windows
Dec 12 20:04:19.986486 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 12 20:04:19.986499 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 12 20:04:19.986809 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 12 20:04:19.986984 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 12 20:04:19.987150 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 12 20:04:19.987170 kernel: PCI host bridge to bus 0000:00
Dec 12 20:04:19.987390 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 12 20:04:19.987545 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 12 20:04:19.987694 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 12 20:04:19.987851 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Dec 12 20:04:19.987998 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 12 20:04:19.988145 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Dec 12 20:04:19.990358 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 12 20:04:19.990577 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 12 20:04:19.990784 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint
Dec 12 20:04:19.990959 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref]
Dec 12 20:04:19.991122 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff]
Dec 12 20:04:19.991283 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref]
Dec 12 20:04:19.991483 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 12 20:04:19.991669 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:19.991833 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff]
Dec 12 20:04:19.991994 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 12 20:04:19.992162 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Dec 12 20:04:19.994379 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 12 20:04:19.994585 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:19.994754 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff]
Dec 12 20:04:19.994918 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 12 20:04:19.995080 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 12 20:04:19.995241 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 12 20:04:19.995467 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:19.995631 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff]
Dec 12 20:04:19.995793 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 12 20:04:19.995953 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 12 20:04:19.996113 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 12 20:04:19.996281 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:19.997913 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff]
Dec 12 20:04:19.998089 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 12 20:04:19.998252 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 12 20:04:19.998518 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 12 20:04:19.998693 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:19.998855 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff]
Dec 12 20:04:19.999014 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 12 20:04:19.999173 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 12 20:04:19.999378 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 12 20:04:19.999559 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:19.999721 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff]
Dec 12 20:04:19.999882 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 12 20:04:20.000041 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 12 20:04:20.000200 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 12 20:04:20.000407 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:20.000578 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff]
Dec 12 20:04:20.000737 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 12 20:04:20.000896 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Dec 12 20:04:20.001055 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 12 20:04:20.001223 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 12 20:04:20.001416 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff]
Dec 12 20:04:20.001584 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 12 20:04:20.001743 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 12 20:04:20.001902 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 12 20:04:20.002078 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 12 20:04:20.002241 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df]
Dec 12 20:04:20.002437 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff]
Dec 12 20:04:20.002597 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Dec 12 20:04:20.002756 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref]
Dec 12 20:04:20.002934 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 12 20:04:20.003094 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f]
Dec 12 20:04:20.003256 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff]
Dec 12 20:04:20.003453 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfd004000-0xfd007fff 64bit pref]
Dec 12 20:04:20.003629 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 12 20:04:20.003789 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 12 20:04:20.003967 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 12 20:04:20.004126 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff]
Dec 12 20:04:20.004296 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff]
Dec 12 20:04:20.004482 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 12 20:04:20.004644 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Dec 12 20:04:20.004823 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Dec 12 20:04:20.004989 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit]
Dec 12 20:04:20.005161 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 12 20:04:20.005383 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 12 20:04:20.005549 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 12 20:04:20.005738 kernel: pci_bus 0000:02: extended config space not accessible
Dec 12 20:04:20.005921 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint
Dec 12 20:04:20.006093 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f]
Dec 12 20:04:20.006260 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 12 20:04:20.006511 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 12 20:04:20.006678 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit]
Dec 12 20:04:20.006841 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 12 20:04:20.007018 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 12 20:04:20.007183 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Dec 12 20:04:20.007382 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 12 20:04:20.007553 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 12 20:04:20.007715 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 12 20:04:20.007877 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 12 20:04:20.008037 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 12 20:04:20.008197 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 12 20:04:20.008218 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 12 20:04:20.008231 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 12 20:04:20.008251 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 12 20:04:20.008264 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 12 20:04:20.008276 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 12 20:04:20.010328 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 12 20:04:20.010360 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 12 20:04:20.010373 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 12 20:04:20.010386 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 12 20:04:20.010398 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 12 20:04:20.010411 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 12 20:04:20.010433 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 12 20:04:20.010446 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 12 20:04:20.010459 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 12 20:04:20.010472 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 12 20:04:20.010485 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 12 20:04:20.010498 kernel: iommu: Default domain type: Translated
Dec 12 20:04:20.010511 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 12 20:04:20.010523 kernel: PCI: Using ACPI for IRQ routing
Dec 12 20:04:20.010537 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 12 20:04:20.010554 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 12 20:04:20.010567 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Dec 12 20:04:20.010746 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 12 20:04:20.010910 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 12 20:04:20.011070 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 12 20:04:20.011090 kernel: vgaarb: loaded
Dec 12 20:04:20.011104 kernel: clocksource: Switched to clocksource kvm-clock
Dec 12 20:04:20.011117 kernel: VFS: Disk quotas dquot_6.6.0
Dec 12 20:04:20.011129 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 12 20:04:20.011149 kernel: pnp: PnP ACPI init
Dec 12 20:04:20.015380 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Dec 12 20:04:20.015411 kernel: pnp: PnP ACPI: found 5 devices
Dec 12 20:04:20.015426 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 12 20:04:20.015440 kernel: NET: Registered PF_INET protocol family
Dec 12 20:04:20.015453 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 12 20:04:20.015467 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Dec 12 20:04:20.015480 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 12 20:04:20.015502 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 12 20:04:20.015515 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Dec 12 20:04:20.015528 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Dec 12 20:04:20.015541 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 12 20:04:20.015554 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 12 20:04:20.015567 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 12 20:04:20.015580 kernel: NET: Registered PF_XDP protocol family
Dec 12 20:04:20.015757 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Dec 12 20:04:20.015927 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 12 20:04:20.016102 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 12 20:04:20.016273 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 12 20:04:20.016471 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 12 20:04:20.016637 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 12 20:04:20.016802 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 12 20:04:20.016966 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 12 20:04:20.017128 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Dec 12 20:04:20.019328 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Dec 12 20:04:20.019511 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Dec 12 20:04:20.019674 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Dec 12 20:04:20.019836 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Dec 12 20:04:20.019997 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Dec 12 20:04:20.020156 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Dec 12 20:04:20.020345 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Dec 12 20:04:20.020519 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 12 20:04:20.020715 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 12 20:04:20.020877 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 12 20:04:20.021038 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Dec 12 20:04:20.021198 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Dec 12 20:04:20.023433 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 12 20:04:20.023602 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 12 20:04:20.023762 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Dec 12 20:04:20.023923 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 12 20:04:20.024084 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 12 20:04:20.024245 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 12 20:04:20.024448 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Dec 12 20:04:20.024609 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 12 20:04:20.024778 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 12 20:04:20.024940 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 12 20:04:20.025100 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Dec 12 20:04:20.025269 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 12 20:04:20.026614 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 12 20:04:20.026794 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 12 20:04:20.026960 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Dec 12 20:04:20.027124 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 12 20:04:20.027318 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 12 20:04:20.027499 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 12 20:04:20.027666 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Dec 12 20:04:20.027828 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 12 20:04:20.027988 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 12 20:04:20.028151 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 12 20:04:20.028353 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Dec 12 20:04:20.028519 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Dec 12 20:04:20.028680 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 12 20:04:20.028843 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 12 20:04:20.029010 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Dec 12 20:04:20.029174 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 12 20:04:20.029367 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 12 20:04:20.029525 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Dec 12 20:04:20.029674 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Dec 12 20:04:20.029819 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 12 20:04:20.029964 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Dec 12 20:04:20.030109 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Dec 12 20:04:20.030262 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Dec 12 20:04:20.030469 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Dec 12 20:04:20.030624 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Dec 12 20:04:20.030775 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 12 20:04:20.030937 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Dec 12 20:04:20.031100 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Dec 12 20:04:20.031251 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Dec 12 20:04:20.031441 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 12 20:04:20.031604 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Dec 12 20:04:20.031756 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Dec 12 20:04:20.031907 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 12 20:04:20.032066 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Dec 12 20:04:20.032217 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Dec 12 20:04:20.032412 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 12 20:04:20.032591 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Dec 12 20:04:20.032742 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Dec 12 20:04:20.032892 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 12 20:04:20.033059 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Dec 12 20:04:20.033210 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Dec 12 20:04:20.033394 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 12 20:04:20.033563 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Dec 12 20:04:20.033714 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Dec 12 20:04:20.033864 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 12 20:04:20.034023 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Dec 12 20:04:20.034173 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Dec 12 20:04:20.034366 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 12 20:04:20.034389 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Dec 12 20:04:20.034413 kernel: PCI: CLS 0 bytes, default 64
Dec 12 20:04:20.034427 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 12 20:04:20.034441 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
Dec 12 20:04:20.034454 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Dec 12 20:04:20.034468 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Dec 12 20:04:20.034482 kernel: Initialise system trusted keyrings
Dec 12 20:04:20.034495 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Dec 12 20:04:20.034509 kernel: Key type asymmetric registered
Dec 12 20:04:20.034522 kernel: Asymmetric key parser 'x509' registered
Dec 12 20:04:20.034540 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Dec 12 20:04:20.034553 kernel: io scheduler mq-deadline registered
Dec 12 20:04:20.034566 kernel: io scheduler kyber registered
Dec 12 20:04:20.034580 kernel: io scheduler bfq registered
Dec 12 20:04:20.034746 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Dec 12 20:04:20.034908 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Dec 12 20:04:20.035069 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.035230 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Dec 12 20:04:20.035433 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Dec 12 20:04:20.035593 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.035753 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Dec 12 20:04:20.035912 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Dec 12 20:04:20.036070 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.036229 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Dec 12 20:04:20.036439 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Dec 12 20:04:20.036601 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.036761 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Dec 12 20:04:20.036921 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Dec 12 20:04:20.037082 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.037241 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Dec 12 20:04:20.037442 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Dec 12 20:04:20.037604 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.037765 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Dec 12 20:04:20.037924 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Dec 12 20:04:20.038084 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.038243 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Dec 12 20:04:20.038453 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Dec 12 20:04:20.038614 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Dec 12 20:04:20.038636 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Dec 12 20:04:20.038650 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Dec 12 20:04:20.038665 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Dec 12 20:04:20.038678 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 12 20:04:20.038692 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 12 20:04:20.038713 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 12 20:04:20.038727 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 12 20:04:20.038740 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 12 20:04:20.038754 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Dec 12 20:04:20.038916 kernel: rtc_cmos 00:03: RTC can wake from S4
Dec 12 20:04:20.039070 kernel: rtc_cmos 00:03: registered as rtc0
Dec 12 20:04:20.039221 kernel: rtc_cmos 00:03: setting system clock to 2025-12-12T20:04:19 UTC (1765569859)
Dec 12 20:04:20.039406 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Dec 12 20:04:20.039434 kernel: intel_pstate: CPU model not supported
Dec 12 20:04:20.039448 kernel: NET: Registered PF_INET6 protocol family
Dec 12 20:04:20.039462 kernel: Segment Routing with IPv6
Dec 12 20:04:20.039475 kernel: In-situ OAM (IOAM) with IPv6
Dec 12 20:04:20.039488 kernel: NET: Registered PF_PACKET protocol family
Dec 12 20:04:20.039502 kernel: Key type dns_resolver registered
Dec 12 20:04:20.039515 kernel: IPI shorthand broadcast: enabled
Dec 12 20:04:20.039529 kernel: sched_clock: Marking stable (3442004499, 223902180)->(3794618703, -128712024)
Dec 12 20:04:20.039542 kernel: registered taskstats version 1
Dec 12 20:04:20.039560 kernel: Loading compiled-in X.509 certificates
Dec 12 20:04:20.039574 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d'
Dec 12 20:04:20.039587 kernel: Demotion targets for Node 0: null
Dec 12 20:04:20.039600 kernel: Key type .fscrypt registered
Dec 12 20:04:20.039614 kernel: Key type fscrypt-provisioning registered
Dec 12 20:04:20.039627 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 12 20:04:20.039640 kernel: ima: Allocated hash algorithm: sha1
Dec 12 20:04:20.039653 kernel: ima: No architecture policies found
Dec 12 20:04:20.039671 kernel: clk: Disabling unused clocks
Dec 12 20:04:20.039688 kernel: Warning: unable to open an initial console.
Dec 12 20:04:20.039702 kernel: Freeing unused kernel image (initmem) memory: 46188K
Dec 12 20:04:20.039716 kernel: Write protecting the kernel read-only data: 40960k
Dec 12 20:04:20.039730 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K
Dec 12 20:04:20.039743 kernel: Run /init as init process
Dec 12 20:04:20.039756 kernel: with arguments:
Dec 12 20:04:20.039769 kernel: /init
Dec 12 20:04:20.039782 kernel: with environment:
Dec 12 20:04:20.039795 kernel: HOME=/
Dec 12 20:04:20.039808 kernel: TERM=linux
Dec 12 20:04:20.039838 systemd[1]: Successfully made /usr/ read-only.
Dec 12 20:04:20.039857 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 12 20:04:20.039872 systemd[1]: Detected virtualization kvm.
Dec 12 20:04:20.039886 systemd[1]: Detected architecture x86-64.
Dec 12 20:04:20.039900 systemd[1]: Running in initrd.
Dec 12 20:04:20.039914 systemd[1]: No hostname configured, using default hostname.
Dec 12 20:04:20.039934 systemd[1]: Hostname set to .
Dec 12 20:04:20.039948 systemd[1]: Initializing machine ID from VM UUID.
Dec 12 20:04:20.039962 systemd[1]: Queued start job for default target initrd.target.
Dec 12 20:04:20.039976 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 12 20:04:20.039990 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 12 20:04:20.040005 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 12 20:04:20.040020 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 12 20:04:20.040035 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 12 20:04:20.040055 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 12 20:04:20.040070 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 12 20:04:20.040085 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 12 20:04:20.040100 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 12 20:04:20.040114 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 12 20:04:20.040128 systemd[1]: Reached target paths.target - Path Units.
Dec 12 20:04:20.040143 systemd[1]: Reached target slices.target - Slice Units.
Dec 12 20:04:20.040157 systemd[1]: Reached target swap.target - Swaps.
Dec 12 20:04:20.040175 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 20:04:20.040190 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 12 20:04:20.040204 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 12 20:04:20.040219 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 12 20:04:20.040233 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 12 20:04:20.040248 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 12 20:04:20.040262 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 12 20:04:20.040276 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 12 20:04:20.040323 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 20:04:20.040350 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 12 20:04:20.040365 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 12 20:04:20.040379 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 12 20:04:20.040394 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 12 20:04:20.040408 systemd[1]: Starting systemd-fsck-usr.service...
Dec 12 20:04:20.040422 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 12 20:04:20.040436 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 12 20:04:20.040450 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 20:04:20.040471 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 12 20:04:20.040486 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 12 20:04:20.040500 systemd[1]: Finished systemd-fsck-usr.service.
Dec 12 20:04:20.040571 systemd-journald[212]: Collecting audit messages is disabled.
Dec 12 20:04:20.040614 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 12 20:04:20.040629 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 12 20:04:20.040645 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 12 20:04:20.040659 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 12 20:04:20.040678 kernel: Bridge firewalling registered
Dec 12 20:04:20.040695 systemd-journald[212]: Journal started
Dec 12 20:04:20.040735 systemd-journald[212]: Runtime Journal (/run/log/journal/08eba51d7783407ead7cdfdb321eac76) is 4.7M, max 37.8M, 33.1M free.
Dec 12 20:04:19.958968 systemd-modules-load[213]: Inserted module 'overlay'
Dec 12 20:04:20.021449 systemd-modules-load[213]: Inserted module 'br_netfilter'
Dec 12 20:04:20.094732 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 12 20:04:20.095550 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 12 20:04:20.096549 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 20:04:20.104500 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 12 20:04:20.107618 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 12 20:04:20.115599 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 12 20:04:20.118236 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 12 20:04:20.134058 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 12 20:04:20.140846 systemd-tmpfiles[233]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 12 20:04:20.143682 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 12 20:04:20.145701 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 12 20:04:20.151369 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 20:04:20.155478 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 20:04:20.186255 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 12 20:04:20.210648 systemd-resolved[250]: Positive Trust Anchors:
Dec 12 20:04:20.211811 systemd-resolved[250]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 20:04:20.212821 systemd-resolved[250]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 20:04:20.220869 systemd-resolved[250]: Defaulting to hostname 'linux'.
Dec 12 20:04:20.223529 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 20:04:20.225232 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 20:04:20.296352 kernel: SCSI subsystem initialized
Dec 12 20:04:20.308371 kernel: Loading iSCSI transport class v2.0-870.
Dec 12 20:04:20.321367 kernel: iscsi: registered transport (tcp)
Dec 12 20:04:20.347890 kernel: iscsi: registered transport (qla4xxx)
Dec 12 20:04:20.347986 kernel: QLogic iSCSI HBA Driver
Dec 12 20:04:20.373125 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 12 20:04:20.392110 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 12 20:04:20.395537 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 12 20:04:20.458953 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 12 20:04:20.462819 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 12 20:04:20.521417 kernel: raid6: sse2x4 gen() 13420 MB/s
Dec 12 20:04:20.539347 kernel: raid6: sse2x2 gen() 9495 MB/s
Dec 12 20:04:20.558020 kernel: raid6: sse2x1 gen() 9798 MB/s
Dec 12 20:04:20.558107 kernel: raid6: using algorithm sse2x4 gen() 13420 MB/s
Dec 12 20:04:20.577024 kernel: raid6: .... xor() 7764 MB/s, rmw enabled
Dec 12 20:04:20.577123 kernel: raid6: using ssse3x2 recovery algorithm
Dec 12 20:04:20.603356 kernel: xor: automatically using best checksumming function avx
Dec 12 20:04:20.792341 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 12 20:04:20.801209 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 12 20:04:20.804188 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 20:04:20.840426 systemd-udevd[459]: Using default interface naming scheme 'v255'.
Dec 12 20:04:20.849962 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 20:04:20.856021 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 12 20:04:20.892449 dracut-pre-trigger[466]: rd.md=0: removing MD RAID activation
Dec 12 20:04:20.934401 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 12 20:04:20.937357 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 12 20:04:21.057265 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 12 20:04:21.062264 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 12 20:04:21.182456 kernel: ACPI: bus type USB registered
Dec 12 20:04:21.182553 kernel: usbcore: registered new interface driver usbfs
Dec 12 20:04:21.201894 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Dec 12 20:04:21.210318 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Dec 12 20:04:21.218533 kernel: usbcore: registered new interface driver hub
Dec 12 20:04:21.218558 kernel: cryptd: max_cpu_qlen set to 1000
Dec 12 20:04:21.218576 kernel: usbcore: registered new device driver usb
Dec 12 20:04:21.234528 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 12 20:04:21.234593 kernel: GPT:17805311 != 125829119
Dec 12 20:04:21.234634 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 12 20:04:21.234651 kernel: GPT:17805311 != 125829119
Dec 12 20:04:21.234667 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 12 20:04:21.234695 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 12 20:04:21.262320 kernel: libata version 3.00 loaded.
Dec 12 20:04:21.270326 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
Dec 12 20:04:21.280815 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 12 20:04:21.281976 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Dec 12 20:04:21.282455 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 20:04:21.284222 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 20:04:21.292363 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1
Dec 12 20:04:21.291105 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 20:04:21.292248 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Dec 12 20:04:21.309331 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 12 20:04:21.319307 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Dec 12 20:04:21.325316 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2
Dec 12 20:04:21.332342 kernel: AES CTR mode by8 optimization enabled
Dec 12 20:04:21.381341 kernel: ahci 0000:00:1f.2: version 3.0
Dec 12 20:04:21.383335 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Dec 12 20:04:21.385319 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Dec 12 20:04:21.385533 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Dec 12 20:04:21.385729 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Dec 12 20:04:21.395271 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 20:04:21.491432 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Dec 12 20:04:21.491714 kernel: hub 1-0:1.0: USB hub found Dec 12 20:04:21.491940 kernel: hub 1-0:1.0: 4 ports detected Dec 12 20:04:21.492135 kernel: scsi host0: ahci Dec 12 20:04:21.492392 kernel: scsi host1: ahci Dec 12 20:04:21.492586 kernel: scsi host2: ahci Dec 12 20:04:21.492795 kernel: scsi host3: ahci Dec 12 20:04:21.492995 kernel: scsi host4: ahci Dec 12 20:04:21.493179 kernel: scsi host5: ahci Dec 12 20:04:21.493408 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 lpm-pol 1 Dec 12 20:04:21.493437 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 lpm-pol 1 Dec 12 20:04:21.493457 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 lpm-pol 1 Dec 12 20:04:21.493475 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 lpm-pol 1 Dec 12 20:04:21.493492 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 lpm-pol 1 Dec 12 20:04:21.493510 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 lpm-pol 1 Dec 12 20:04:21.493528 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 12 20:04:21.493744 kernel: hub 2-0:1.0: USB hub found Dec 12 20:04:21.493949 kernel: hub 2-0:1.0: 4 ports detected Dec 12 20:04:21.490439 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 20:04:21.504972 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 12 20:04:21.517859 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 12 20:04:21.528048 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Dec 12 20:04:21.528889 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Dec 12 20:04:21.532070 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 12 20:04:21.552070 disk-uuid[609]: Primary Header is updated. Dec 12 20:04:21.552070 disk-uuid[609]: Secondary Entries is updated. Dec 12 20:04:21.552070 disk-uuid[609]: Secondary Header is updated. 
Dec 12 20:04:21.558343 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 12 20:04:21.567330 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 12 20:04:21.645319 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 12 20:04:21.715339 kernel: ata4: SATA link down (SStatus 0 SControl 300) Dec 12 20:04:21.715414 kernel: ata3: SATA link down (SStatus 0 SControl 300) Dec 12 20:04:21.715434 kernel: ata5: SATA link down (SStatus 0 SControl 300) Dec 12 20:04:21.715451 kernel: ata6: SATA link down (SStatus 0 SControl 300) Dec 12 20:04:21.715468 kernel: ata2: SATA link down (SStatus 0 SControl 300) Dec 12 20:04:21.717888 kernel: ata1: SATA link down (SStatus 0 SControl 300) Dec 12 20:04:21.792367 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 12 20:04:21.799080 kernel: usbcore: registered new interface driver usbhid Dec 12 20:04:21.799132 kernel: usbhid: USB HID core driver Dec 12 20:04:21.806580 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Dec 12 20:04:21.806642 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Dec 12 20:04:21.821943 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 12 20:04:21.823891 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 20:04:21.824749 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 20:04:21.826556 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 12 20:04:21.829236 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 12 20:04:21.855296 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 12 20:04:22.573322 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 12 20:04:22.573575 disk-uuid[610]: The operation has completed successfully. Dec 12 20:04:22.634989 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 12 20:04:22.636372 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 12 20:04:22.681966 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Dec 12 20:04:22.697670 sh[636]: Success Dec 12 20:04:22.722786 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 12 20:04:22.722876 kernel: device-mapper: uevent: version 1.0.3 Dec 12 20:04:22.725405 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 12 20:04:22.738442 kernel: device-mapper: verity: sha256 using shash "sha256-avx" Dec 12 20:04:22.792192 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Dec 12 20:04:22.794130 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Dec 12 20:04:22.803695 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
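verity-setup has just brought up /dev/mapper/usr as a dm-verity target: every 4 KiB block of the read-only /usr partition is checked on read against a precomputed hash tree whose root is pinned by the kernel command line. A toy sketch of the hash-tree idea; the real on-disk dm-verity format also includes a superblock and a per-device salt, so this is illustrative only:

```python
# Toy illustration of the dm-verity idea: a hash tree over fixed-size blocks
# whose root must match a trusted value. This sketch only shows the tree
# construction, not the on-disk format.
import hashlib

BLOCK = 4096  # dm-verity's default data/hash block size

def hash_tree_root(data: bytes) -> bytes:
    level = [hashlib.sha256(data[i:i + BLOCK]).digest()
             for i in range(0, len(data), BLOCK)]
    while len(level) > 1:
        # Hash adjacent pairs of digests to build the next level up.
        level = [hashlib.sha256(b"".join(level[i:i + 2])).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

root = hash_tree_root(b"\x00" * (8 * BLOCK))
print("root hash:", root.hex())  # analogue of the pinned usrhash value
```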
Dec 12 20:04:22.817339 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (648) Dec 12 20:04:22.821347 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8 Dec 12 20:04:22.821389 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Dec 12 20:04:22.832185 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 12 20:04:22.832240 kernel: BTRFS info (device dm-0): enabling free space tree Dec 12 20:04:22.834625 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Dec 12 20:04:22.836634 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 12 20:04:22.838375 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 12 20:04:22.840652 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 12 20:04:22.844435 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 12 20:04:22.874510 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (681) Dec 12 20:04:22.877613 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 20:04:22.880315 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 20:04:22.887518 kernel: BTRFS info (device vda6): turning on async discard Dec 12 20:04:22.887581 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 20:04:22.895352 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 20:04:22.896525 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 12 20:04:22.899409 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Dec 12 20:04:22.982436 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 20:04:22.986460 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 12 20:04:23.043159 systemd-networkd[818]: lo: Link UP Dec 12 20:04:23.044267 systemd-networkd[818]: lo: Gained carrier Dec 12 20:04:23.047065 systemd-networkd[818]: Enumeration completed Dec 12 20:04:23.047256 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 12 20:04:23.049450 systemd[1]: Reached target network.target - Network. Dec 12 20:04:23.051770 systemd-networkd[818]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 20:04:23.051777 systemd-networkd[818]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 12 20:04:23.054067 systemd-networkd[818]: eth0: Link UP Dec 12 20:04:23.054367 systemd-networkd[818]: eth0: Gained carrier Dec 12 20:04:23.054381 systemd-networkd[818]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 12 20:04:23.101407 systemd-networkd[818]: eth0: DHCPv4 address 10.244.19.234/30, gateway 10.244.19.233 acquired from 10.244.19.233 Dec 12 20:04:23.130117 ignition[742]: Ignition 2.22.0 Dec 12 20:04:23.130152 ignition[742]: Stage: fetch-offline Dec 12 20:04:23.130229 ignition[742]: no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:23.133449 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
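The DHCPv4 lease above (10.244.19.234/30, gateway 10.244.19.233) is a point-to-point-sized subnet: a /30 leaves exactly two usable host addresses. The standard library confirms the layout:

```python
# The /30 lease from the log, checked with the standard library.
import ipaddress

iface = ipaddress.ip_interface("10.244.19.234/30")
print(iface.network)                    # 10.244.19.232/30
print(list(iface.network.hosts()))      # [10.244.19.233, 10.244.19.234]
print(iface.network.broadcast_address)  # 10.244.19.235
```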
Dec 12 20:04:23.130246 ignition[742]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:23.130439 ignition[742]: parsed url from cmdline: "" Dec 12 20:04:23.130450 ignition[742]: no config URL provided Dec 12 20:04:23.137106 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Dec 12 20:04:23.130467 ignition[742]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 20:04:23.130483 ignition[742]: no config at "/usr/lib/ignition/user.ign" Dec 12 20:04:23.130499 ignition[742]: failed to fetch config: resource requires networking Dec 12 20:04:23.130727 ignition[742]: Ignition finished successfully Dec 12 20:04:23.170224 ignition[828]: Ignition 2.22.0 Dec 12 20:04:23.170247 ignition[828]: Stage: fetch Dec 12 20:04:23.170509 ignition[828]: no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:23.170528 ignition[828]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:23.170651 ignition[828]: parsed url from cmdline: "" Dec 12 20:04:23.170658 ignition[828]: no config URL provided Dec 12 20:04:23.170668 ignition[828]: reading system config file "/usr/lib/ignition/user.ign" Dec 12 20:04:23.170683 ignition[828]: no config at "/usr/lib/ignition/user.ign" Dec 12 20:04:23.170873 ignition[828]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 12 20:04:23.171917 ignition[828]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 12 20:04:23.171974 ignition[828]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 12 20:04:23.188614 ignition[828]: GET result: OK Dec 12 20:04:23.189486 ignition[828]: parsing config with SHA512: 4121ca67ce2053cacc9fa43b82b039d9ccadeb29959003cea80777fc8c985afc3829891914b4109b63a55459e0f58310f5e01636ac1d52697e181126c47a6675 Dec 12 20:04:23.194334 unknown[828]: fetched base config from "system" Dec 12 20:04:23.194351 unknown[828]: fetched base config from "system" Dec 12 20:04:23.195035 ignition[828]: fetch: fetch complete Dec 12 20:04:23.194360 unknown[828]: fetched user config from "openstack" Dec 12 20:04:23.195048 ignition[828]: fetch: fetch passed Dec 12 20:04:23.195125 ignition[828]: Ignition finished successfully Dec 12 20:04:23.199151 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 12 20:04:23.201466 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 12 20:04:23.247796 ignition[834]: Ignition 2.22.0 Dec 12 20:04:23.247830 ignition[834]: Stage: kargs Dec 12 20:04:23.248026 ignition[834]: no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:23.248044 ignition[834]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:23.249084 ignition[834]: kargs: kargs passed Dec 12 20:04:23.249154 ignition[834]: Ignition finished successfully Dec 12 20:04:23.252201 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 12 20:04:23.254957 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 12 20:04:23.290742 ignition[840]: Ignition 2.22.0 Dec 12 20:04:23.291782 ignition[840]: Stage: disks Dec 12 20:04:23.292017 ignition[840]: no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:23.292036 ignition[840]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:23.293175 ignition[840]: disks: disks passed Dec 12 20:04:23.295597 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
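On OpenStack, Ignition's fetch stage waits for a config drive and otherwise pulls user_data from the link-local metadata service, logging a SHA-512 of whatever it parses. A minimal sketch of that flow, assuming only the URL shown in the log; retries, the config-drive path, and error handling are omitted:

```python
# Sketch of what the Ignition "fetch" stage logs above: pull the OpenStack
# user_data from the link-local metadata service and report a SHA-512 of
# the payload. Only runs inside a VM that has such a metadata service.
import hashlib
import urllib.request

URL = "http://169.254.169.254/openstack/latest/user_data"

with urllib.request.urlopen(URL, timeout=5) as resp:  # attempt #1
    payload = resp.read()

print("parsing config with SHA512:", hashlib.sha512(payload).hexdigest())
```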
Dec 12 20:04:23.293244 ignition[840]: Ignition finished successfully Dec 12 20:04:23.297595 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 12 20:04:23.298419 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 12 20:04:23.299796 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 20:04:23.301440 systemd[1]: Reached target sysinit.target - System Initialization. Dec 12 20:04:23.302992 systemd[1]: Reached target basic.target - Basic System. Dec 12 20:04:23.306443 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 12 20:04:23.351068 systemd-fsck[849]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Dec 12 20:04:23.355610 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 12 20:04:23.359084 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 12 20:04:23.497315 kernel: EXT4-fs (vda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none. Dec 12 20:04:23.498493 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 12 20:04:23.500559 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 12 20:04:23.502954 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 20:04:23.504782 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 12 20:04:23.507077 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 12 20:04:23.511454 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 12 20:04:23.512366 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 12 20:04:23.512414 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 20:04:23.527421 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (857) Dec 12 20:04:23.533376 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 20:04:23.532637 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 12 20:04:23.536308 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 20:04:23.536624 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 12 20:04:23.560552 kernel: BTRFS info (device vda6): turning on async discard Dec 12 20:04:23.560612 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 20:04:23.564944 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 12 20:04:23.624318 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 20:04:23.634513 initrd-setup-root[885]: cut: /sysroot/etc/passwd: No such file or directory Dec 12 20:04:23.644154 initrd-setup-root[892]: cut: /sysroot/etc/group: No such file or directory Dec 12 20:04:23.652984 initrd-setup-root[899]: cut: /sysroot/etc/shadow: No such file or directory Dec 12 20:04:23.659066 initrd-setup-root[906]: cut: /sysroot/etc/gshadow: No such file or directory Dec 12 20:04:23.769823 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 12 20:04:23.772737 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 12 20:04:23.775493 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
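The fsck summary above ("15/1628000 files, 120826/1617920 blocks") shows an essentially empty root filesystem on first boot. Reduced to percentages, assuming ext4's default 4 KiB block size (the block size itself is not printed in the log):

```python
# First-boot filesystem usage from the fsck summary above.
files_used, files_total = 15, 1_628_000
blocks_used, blocks_total = 120_826, 1_617_920
print(f"inodes: {files_used / files_total:.4%} used")
print(f"blocks: {blocks_used / blocks_total:.2%} used "
      f"of ~{blocks_total * 4096 / 2**30:.1f} GiB")
```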
Dec 12 20:04:23.795352 kernel: BTRFS info (device vda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 20:04:23.815103 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 12 20:04:23.820576 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 12 20:04:23.842463 ignition[975]: INFO : Ignition 2.22.0 Dec 12 20:04:23.845378 ignition[975]: INFO : Stage: mount Dec 12 20:04:23.845378 ignition[975]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:23.845378 ignition[975]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:23.845378 ignition[975]: INFO : mount: mount passed Dec 12 20:04:23.845378 ignition[975]: INFO : Ignition finished successfully Dec 12 20:04:23.847706 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 12 20:04:24.418754 systemd-networkd[818]: eth0: Gained IPv6LL Dec 12 20:04:24.663418 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 20:04:25.928928 systemd-networkd[818]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:4fa:24:19ff:fef4:13ea/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:4fa:24:19ff:fef4:13ea/64 assigned by NDisc. Dec 12 20:04:25.928942 systemd-networkd[818]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Dec 12 20:04:26.670321 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 20:04:30.683333 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 12 20:04:30.690218 coreos-metadata[859]: Dec 12 20:04:30.690 WARN failed to locate config-drive, using the metadata service API instead Dec 12 20:04:30.719586 coreos-metadata[859]: Dec 12 20:04:30.719 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 12 20:04:30.734567 coreos-metadata[859]: Dec 12 20:04:30.734 INFO Fetch successful Dec 12 20:04:30.735685 coreos-metadata[859]: Dec 12 20:04:30.735 INFO wrote hostname srv-n0ssy.gb1.brightbox.com to /sysroot/etc/hostname Dec 12 20:04:30.739880 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 12 20:04:30.740144 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 12 20:04:30.744588 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 12 20:04:30.790261 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 12 20:04:30.820339 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (990) Dec 12 20:04:30.826275 kernel: BTRFS info (device vda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 12 20:04:30.826368 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Dec 12 20:04:30.832826 kernel: BTRFS info (device vda6): turning on async discard Dec 12 20:04:30.832877 kernel: BTRFS info (device vda6): enabling free space tree Dec 12 20:04:30.837176 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
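The repeated "Can't lookup blockdev" lines and the coreos-metadata warning show the hostname agent polling for a config drive for several seconds before falling back to the metadata API. A sketch of that fallback, using the device path and URL from the log; the retry loop and the actual mount step are omitted:

```python
# Sketch of the fallback the hostname agent logs above: prefer a config
# drive (filesystem labeled "config-2"), else query the metadata service.
import os
import urllib.request

CONFIG_DRIVE = "/dev/disk/by-label/config-2"
HOSTNAME_URL = "http://169.254.169.254/latest/meta-data/hostname"

if os.path.exists(CONFIG_DRIVE):
    print("config drive present; would mount it and read its metadata")
else:
    # "failed to locate config-drive, using the metadata service API instead"
    with urllib.request.urlopen(HOSTNAME_URL, timeout=5) as resp:
        hostname = resp.read().decode().strip()
    with open("/sysroot/etc/hostname", "w", encoding="ascii") as f:
        f.write(hostname + "\n")
    print(f"wrote hostname {hostname} to /sysroot/etc/hostname")
```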
Dec 12 20:04:30.886257 ignition[1007]: INFO : Ignition 2.22.0 Dec 12 20:04:30.886257 ignition[1007]: INFO : Stage: files Dec 12 20:04:30.888210 ignition[1007]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:30.888210 ignition[1007]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:30.889962 ignition[1007]: DEBUG : files: compiled without relabeling support, skipping Dec 12 20:04:30.889962 ignition[1007]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 12 20:04:30.889962 ignition[1007]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 12 20:04:30.898781 ignition[1007]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 12 20:04:30.898781 ignition[1007]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 12 20:04:30.898781 ignition[1007]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 12 20:04:30.898101 unknown[1007]: wrote ssh authorized keys file for user: core Dec 12 20:04:30.902817 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 12 20:04:30.902817 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Dec 12 20:04:31.100890 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 12 20:04:31.344491 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 20:04:31.346204 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 12 20:04:31.356907 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 20:04:31.356907 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 12 20:04:31.356907 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 12 20:04:31.356907 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: 
op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 12 20:04:31.356907 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 12 20:04:31.356907 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Dec 12 20:04:31.670789 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 12 20:04:33.233250 ignition[1007]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Dec 12 20:04:33.233250 ignition[1007]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 12 20:04:33.236663 ignition[1007]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 12 20:04:33.239941 ignition[1007]: INFO : files: files passed Dec 12 20:04:33.239941 ignition[1007]: INFO : Ignition finished successfully Dec 12 20:04:33.241624 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 12 20:04:33.247477 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 12 20:04:33.250458 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 12 20:04:33.269724 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 12 20:04:33.270807 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 12 20:04:33.279580 initrd-setup-root-after-ignition[1037]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 20:04:33.279580 initrd-setup-root-after-ignition[1037]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 12 20:04:33.282582 initrd-setup-root-after-ignition[1041]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 12 20:04:33.283190 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 20:04:33.285377 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 12 20:04:33.287789 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 12 20:04:33.350618 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 12 20:04:33.350927 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
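The files stage logged above (files and a symlink written, prepare-helm.service installed and preset to enabled) corresponds to an Ignition v3 config along the following lines. This is an illustrative reconstruction, not the actual config; unit contents are abridged and file modes are assumed:

```python
# Illustrative reconstruction of an Ignition v3 config that would produce
# the files-stage operations logged above (expressed as a Python dict).
ignition_config = {
    "ignition": {"version": "3.4.0"},
    "storage": {
        "files": [
            {"path": "/opt/helm-v3.17.3-linux-amd64.tar.gz",
             "contents": {"source": "https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz"}},
            {"path": "/home/core/install.sh", "mode": 0o755},  # mode assumed
            {"path": "/home/core/nginx.yaml"},
            {"path": "/home/core/nfs-pod.yaml"},
            {"path": "/home/core/nfs-pvc.yaml"},
            {"path": "/etc/flatcar/update.conf"},
            {"path": "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw",
             "contents": {"source": "https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw"}},
        ],
        "links": [
            {"path": "/etc/extensions/kubernetes.raw",
             "target": "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"},
        ],
    },
    "systemd": {
        "units": [
            # Contents abridged; "enabled" produces the preset seen in the log.
            {"name": "prepare-helm.service", "enabled": True,
             "contents": "[Unit]\nDescription=...\n"},
        ],
    },
}
```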
Dec 12 20:04:33.353240 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 12 20:04:33.354426 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 12 20:04:33.355959 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 12 20:04:33.357454 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 12 20:04:33.387513 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 20:04:33.390564 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 12 20:04:33.418759 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 12 20:04:33.419793 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 20:04:33.421563 systemd[1]: Stopped target timers.target - Timer Units. Dec 12 20:04:33.423185 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 12 20:04:33.423482 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 12 20:04:33.425243 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 12 20:04:33.426206 systemd[1]: Stopped target basic.target - Basic System. Dec 12 20:04:33.427737 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 12 20:04:33.429453 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 12 20:04:33.430813 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 12 20:04:33.432545 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 12 20:04:33.433991 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 12 20:04:33.451724 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 12 20:04:33.452772 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 12 20:04:33.454441 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 12 20:04:33.455934 systemd[1]: Stopped target swap.target - Swaps. Dec 12 20:04:33.458444 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 12 20:04:33.459892 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 12 20:04:33.462067 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 12 20:04:33.463013 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 20:04:33.464636 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 12 20:04:33.465218 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 20:04:33.466393 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 12 20:04:33.466624 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 12 20:04:33.467684 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 12 20:04:33.467886 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 12 20:04:33.468843 systemd[1]: ignition-files.service: Deactivated successfully. Dec 12 20:04:33.469020 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 12 20:04:33.473416 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 12 20:04:33.478865 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Dec 12 20:04:33.479160 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 20:04:33.484533 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 12 20:04:33.485594 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 12 20:04:33.485792 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 20:04:33.490062 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 12 20:04:33.490275 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 12 20:04:33.503737 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 12 20:04:33.503890 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 12 20:04:33.529718 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 12 20:04:33.539766 ignition[1061]: INFO : Ignition 2.22.0 Dec 12 20:04:33.541041 ignition[1061]: INFO : Stage: umount Dec 12 20:04:33.541041 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 12 20:04:33.541041 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 12 20:04:33.543533 ignition[1061]: INFO : umount: umount passed Dec 12 20:04:33.543533 ignition[1061]: INFO : Ignition finished successfully Dec 12 20:04:33.543800 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 12 20:04:33.543991 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 12 20:04:33.545734 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 12 20:04:33.545882 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 12 20:04:33.547349 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 12 20:04:33.547443 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 12 20:04:33.548639 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 12 20:04:33.548705 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 12 20:04:33.550011 systemd[1]: Stopped target network.target - Network. Dec 12 20:04:33.551396 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 12 20:04:33.551482 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 12 20:04:33.552950 systemd[1]: Stopped target paths.target - Path Units. Dec 12 20:04:33.554375 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 12 20:04:33.558374 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 20:04:33.559167 systemd[1]: Stopped target slices.target - Slice Units. Dec 12 20:04:33.560569 systemd[1]: Stopped target sockets.target - Socket Units. Dec 12 20:04:33.562200 systemd[1]: iscsid.socket: Deactivated successfully. Dec 12 20:04:33.562272 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 12 20:04:33.564683 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 12 20:04:33.564749 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 12 20:04:33.566035 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 12 20:04:33.566141 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 12 20:04:33.569460 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 12 20:04:33.569536 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 12 20:04:33.571007 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Dec 12 20:04:33.573482 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 12 20:04:33.575763 systemd-networkd[818]: eth0: DHCPv6 lease lost Dec 12 20:04:33.580023 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 12 20:04:33.580261 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 12 20:04:33.586547 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 12 20:04:33.586973 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 12 20:04:33.587169 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 12 20:04:33.590008 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Dec 12 20:04:33.591071 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 12 20:04:33.591995 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 12 20:04:33.592071 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 12 20:04:33.594686 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 12 20:04:33.596626 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 12 20:04:33.596698 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 12 20:04:33.598737 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 12 20:04:33.598813 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 12 20:04:33.601836 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 12 20:04:33.601908 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 12 20:04:33.605069 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 12 20:04:33.605151 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 12 20:04:33.607448 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 12 20:04:33.613362 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 12 20:04:33.613485 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 12 20:04:33.624234 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 12 20:04:33.626103 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 12 20:04:33.627406 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 12 20:04:33.627471 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 12 20:04:33.629278 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 12 20:04:33.629350 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 20:04:33.630918 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 12 20:04:33.631001 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 12 20:04:33.633147 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 12 20:04:33.633231 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 12 20:04:33.634693 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 12 20:04:33.634776 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 12 20:04:33.638486 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
Dec 12 20:04:33.639716 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 12 20:04:33.639808 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 20:04:33.646471 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 12 20:04:33.646558 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 20:04:33.649262 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 12 20:04:33.649360 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 12 20:04:33.656460 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Dec 12 20:04:33.656581 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Dec 12 20:04:33.656674 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 12 20:04:33.657436 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 12 20:04:33.657590 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 12 20:04:33.665965 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 12 20:04:33.666142 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 12 20:04:33.667692 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 12 20:04:33.667828 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 12 20:04:33.670640 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 12 20:04:33.671497 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 12 20:04:33.671589 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 12 20:04:33.676453 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 12 20:04:33.699174 systemd[1]: Switching root. Dec 12 20:04:33.736867 systemd-journald[212]: Journal stopped Dec 12 20:04:35.365217 systemd-journald[212]: Received SIGTERM from PID 1 (systemd). Dec 12 20:04:35.369478 kernel: SELinux: policy capability network_peer_controls=1 Dec 12 20:04:35.369541 kernel: SELinux: policy capability open_perms=1 Dec 12 20:04:35.369589 kernel: SELinux: policy capability extended_socket_class=1 Dec 12 20:04:35.369619 kernel: SELinux: policy capability always_check_network=0 Dec 12 20:04:35.369656 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 12 20:04:35.369686 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 12 20:04:35.369712 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 12 20:04:35.369739 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 12 20:04:35.369764 kernel: SELinux: policy capability userspace_initial_context=0 Dec 12 20:04:35.369789 kernel: audit: type=1403 audit(1765569874.041:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 12 20:04:35.369830 systemd[1]: Successfully loaded SELinux policy in 83.377ms. Dec 12 20:04:35.369884 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 13.093ms. Dec 12 20:04:35.369908 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 12 20:04:35.369942 systemd[1]: Detected virtualization kvm. 
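The systemd banner above encodes its compile-time options as a +/- feature string. Splitting it mechanically:

```python
# Split systemd's compile-time feature string (copied from the banner above)
# into enabled (+) and disabled (-) build options.
features = ("+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT "
            "-GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN "
            "+IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK "
            "+PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ "
            "+ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE")
enabled = [f[1:] for f in features.split() if f.startswith("+")]
disabled = [f[1:] for f in features.split() if f.startswith("-")]
print(f"{len(enabled)} features enabled, {len(disabled)} disabled")
print("SELINUX enabled:", "SELINUX" in enabled)  # matches the policy load above
```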
Dec 12 20:04:35.369970 systemd[1]: Detected architecture x86-64. Dec 12 20:04:35.369996 systemd[1]: Detected first boot. Dec 12 20:04:35.370023 systemd[1]: Hostname set to <srv-n0ssy.gb1.brightbox.com>. Dec 12 20:04:35.370043 systemd[1]: Initializing machine ID from VM UUID. Dec 12 20:04:35.370069 zram_generator::config[1104]: No configuration found. Dec 12 20:04:35.370143 kernel: Guest personality initialized and is inactive Dec 12 20:04:35.370175 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Dec 12 20:04:35.370214 kernel: Initialized host personality Dec 12 20:04:35.370233 kernel: NET: Registered PF_VSOCK protocol family Dec 12 20:04:35.370259 systemd[1]: Populated /etc with preset unit settings. Dec 12 20:04:35.376528 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Dec 12 20:04:35.376589 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 12 20:04:35.376623 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 12 20:04:35.376659 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 12 20:04:35.376681 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Dec 12 20:04:35.376725 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 12 20:04:35.376753 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 12 20:04:35.376775 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 12 20:04:35.376795 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 12 20:04:35.376825 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 12 20:04:35.376855 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 12 20:04:35.376877 systemd[1]: Created slice user.slice - User and Session Slice. Dec 12 20:04:35.376918 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 12 20:04:35.376956 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 12 20:04:35.376979 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 12 20:04:35.377000 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 12 20:04:35.377022 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 12 20:04:35.377055 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 12 20:04:35.377078 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Dec 12 20:04:35.377110 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 12 20:04:35.377133 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 12 20:04:35.377167 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 12 20:04:35.377188 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 12 20:04:35.377208 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 12 20:04:35.377229 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 12 20:04:35.377248 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 12 20:04:35.381850 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 12 20:04:35.381903 systemd[1]: Reached target slices.target - Slice Units. Dec 12 20:04:35.381925 systemd[1]: Reached target swap.target - Swaps. Dec 12 20:04:35.381957 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 12 20:04:35.381987 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 12 20:04:35.382009 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 12 20:04:35.382030 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 12 20:04:35.382052 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 12 20:04:35.382073 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 12 20:04:35.382110 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 12 20:04:35.382160 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 12 20:04:35.382190 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 12 20:04:35.382211 systemd[1]: Mounting media.mount - External Media Directory... Dec 12 20:04:35.382232 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 20:04:35.382253 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 12 20:04:35.382280 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 12 20:04:35.382334 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 12 20:04:35.382358 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 12 20:04:35.382393 systemd[1]: Reached target machines.target - Containers. Dec 12 20:04:35.382415 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 12 20:04:35.382436 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 12 20:04:35.382456 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 12 20:04:35.382483 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 12 20:04:35.382510 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 12 20:04:35.382541 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 12 20:04:35.382580 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 12 20:04:35.382615 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 12 20:04:35.382637 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 12 20:04:35.382665 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 12 20:04:35.382693 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 12 20:04:35.382715 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 12 20:04:35.382736 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 12 20:04:35.382756 systemd[1]: Stopped systemd-fsck-usr.service. 
Dec 12 20:04:35.382777 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 12 20:04:35.382805 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 12 20:04:35.382839 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 12 20:04:35.382862 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 12 20:04:35.382893 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 12 20:04:35.382915 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 12 20:04:35.382948 kernel: ACPI: bus type drm_connector registered Dec 12 20:04:35.382970 kernel: loop: module loaded Dec 12 20:04:35.382997 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 12 20:04:35.383035 systemd[1]: verity-setup.service: Deactivated successfully. Dec 12 20:04:35.383057 systemd[1]: Stopped verity-setup.service. Dec 12 20:04:35.383106 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Dec 12 20:04:35.383131 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 12 20:04:35.383152 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 12 20:04:35.383174 systemd[1]: Mounted media.mount - External Media Directory. Dec 12 20:04:35.383195 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 12 20:04:35.387342 systemd-journald[1194]: Collecting audit messages is disabled. Dec 12 20:04:35.387437 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 12 20:04:35.387480 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 12 20:04:35.387504 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 12 20:04:35.387525 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 12 20:04:35.387546 kernel: fuse: init (API version 7.41) Dec 12 20:04:35.387567 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 12 20:04:35.387587 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 12 20:04:35.387608 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 12 20:04:35.387630 systemd-journald[1194]: Journal started Dec 12 20:04:35.387684 systemd-journald[1194]: Runtime Journal (/run/log/journal/08eba51d7783407ead7cdfdb321eac76) is 4.7M, max 37.8M, 33.1M free. Dec 12 20:04:34.913643 systemd[1]: Queued start job for default target multi-user.target. Dec 12 20:04:35.394954 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 12 20:04:35.395063 systemd[1]: Started systemd-journald.service - Journal Service. Dec 12 20:04:34.939173 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Dec 12 20:04:34.940024 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 12 20:04:35.398041 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 12 20:04:35.398441 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 12 20:04:35.401302 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Dec 12 20:04:35.401632 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 12 20:04:35.402829 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 12 20:04:35.404200 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 12 20:04:35.405837 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 12 20:04:35.406172 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 12 20:04:35.407729 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 12 20:04:35.409172 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 12 20:04:35.410692 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 12 20:04:35.412046 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 12 20:04:35.430538 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 12 20:04:35.436424 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 12 20:04:35.439511 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 12 20:04:35.441382 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 12 20:04:35.441441 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 12 20:04:35.443753 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 12 20:04:35.450467 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 12 20:04:35.451542 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 12 20:04:35.456797 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 12 20:04:35.465705 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 12 20:04:35.466667 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 12 20:04:35.470576 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 12 20:04:35.471379 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 12 20:04:35.475460 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 12 20:04:35.480601 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 12 20:04:35.483764 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 12 20:04:35.501098 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 12 20:04:35.502078 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 12 20:04:35.510622 systemd-journald[1194]: Time spent on flushing to /var/log/journal/08eba51d7783407ead7cdfdb321eac76 is 183.131ms for 1163 entries. Dec 12 20:04:35.510622 systemd-journald[1194]: System Journal (/var/log/journal/08eba51d7783407ead7cdfdb321eac76) is 8M, max 584.8M, 576.8M free. Dec 12 20:04:35.733504 systemd-journald[1194]: Received client request to flush runtime journal. 
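journald's self-reported flush statistics above reduce to a per-entry cost:

```python
# Flush statistics from the journald lines above, per entry.
entries = 1163
flush_ms = 183.131
print(f"{flush_ms / entries * 1000:.0f} µs per entry")  # ~157 µs
```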
Dec 12 20:04:35.733598 kernel: loop0: detected capacity change from 0 to 8 Dec 12 20:04:35.733644 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 12 20:04:35.733679 kernel: loop1: detected capacity change from 0 to 128560 Dec 12 20:04:35.554172 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 12 20:04:35.559898 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 12 20:04:35.572706 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 12 20:04:35.584873 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 12 20:04:35.598542 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 12 20:04:35.643702 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 12 20:04:35.695400 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. Dec 12 20:04:35.695424 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. Dec 12 20:04:35.720733 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Dec 12 20:04:35.731824 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 12 20:04:35.740962 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 12 20:04:35.757323 kernel: loop2: detected capacity change from 0 to 110984 Dec 12 20:04:35.806365 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 12 20:04:35.812883 kernel: loop3: detected capacity change from 0 to 229808 Dec 12 20:04:35.871467 kernel: loop4: detected capacity change from 0 to 8 Dec 12 20:04:35.879327 kernel: loop5: detected capacity change from 0 to 128560 Dec 12 20:04:35.900326 kernel: loop6: detected capacity change from 0 to 110984 Dec 12 20:04:35.924349 kernel: loop7: detected capacity change from 0 to 229808 Dec 12 20:04:35.940696 (sd-merge)[1267]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Dec 12 20:04:35.942195 (sd-merge)[1267]: Merged extensions into '/usr'. Dec 12 20:04:35.942705 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 12 20:04:35.952708 systemd[1]: Reload requested from client PID 1242 ('systemd-sysext') (unit systemd-sysext.service)... Dec 12 20:04:35.952743 systemd[1]: Reloading... Dec 12 20:04:36.104526 zram_generator::config[1293]: No configuration found. Dec 12 20:04:36.372624 ldconfig[1237]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 12 20:04:36.426476 systemd[1]: Reloading finished in 470 ms. Dec 12 20:04:36.442043 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 12 20:04:36.447448 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 12 20:04:36.458504 systemd[1]: Starting ensure-sysext.service... Dec 12 20:04:36.461570 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 12 20:04:36.496306 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 12 20:04:36.496953 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 12 20:04:36.497490 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
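The (sd-merge) lines show systemd-sysext stacking four extension images over /usr, which is why each loop-device capacity appears twice (loop0-3 during the scan, loop4-7 during the merge). The merge is a read-only overlayfs; a sketch of the equivalent mount, where the staging paths under /run are hypothetical stand-ins for sysext's internal layout:

```python
# Sketch of the read-only overlayfs stacking that systemd-sysext performs
# for /usr. Extension names come from the (sd-merge) line above; the
# staging paths are hypothetical.
extensions = ["containerd-flatcar", "docker-flatcar", "kubernetes", "oem-openstack"]

# overlayfs treats the leftmost lowerdir as the topmost layer, so list the
# extensions in reverse order and put the base /usr at the bottom.
lowerdirs = [f"/run/sysext-staging/{name}/usr" for name in reversed(extensions)]
lowerdirs.append("/usr")
print("mount -t overlay overlay -o ro,lowerdir=" + ":".join(lowerdirs) + " /usr")
```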
Dec 12 20:04:36.497902 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 12 20:04:36.499438 systemd[1]: Reload requested from client PID 1349 ('systemctl') (unit ensure-sysext.service)...
Dec 12 20:04:36.499462 systemd[1]: Reloading...
Dec 12 20:04:36.499787 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 12 20:04:36.500343 systemd-tmpfiles[1350]: ACLs are not supported, ignoring.
Dec 12 20:04:36.500550 systemd-tmpfiles[1350]: ACLs are not supported, ignoring.
Dec 12 20:04:36.506674 systemd-tmpfiles[1350]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 20:04:36.506840 systemd-tmpfiles[1350]: Skipping /boot
Dec 12 20:04:36.521897 systemd-tmpfiles[1350]: Detected autofs mount point /boot during canonicalization of boot.
Dec 12 20:04:36.522095 systemd-tmpfiles[1350]: Skipping /boot
Dec 12 20:04:36.615397 zram_generator::config[1377]: No configuration found.
Dec 12 20:04:36.920020 systemd[1]: Reloading finished in 419 ms.
Dec 12 20:04:36.934145 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 12 20:04:36.954393 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 12 20:04:36.965963 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 12 20:04:36.970623 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 12 20:04:36.979257 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 12 20:04:36.988707 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 12 20:04:36.996906 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 12 20:04:37.004709 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 12 20:04:37.011919 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 12 20:04:37.012248 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 20:04:37.016573 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 12 20:04:37.028798 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 12 20:04:37.032803 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 12 20:04:37.033902 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 20:04:37.034074 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 20:04:37.034236 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 12 20:04:37.042811 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 12 20:04:37.050019 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 12 20:04:37.050421 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 20:04:37.050758 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 20:04:37.050997 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 20:04:37.051243 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 12 20:04:37.066849 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 12 20:04:37.076006 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 12 20:04:37.076481 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 12 20:04:37.080586 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 12 20:04:37.081643 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 12 20:04:37.081836 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 12 20:04:37.082068 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 12 20:04:37.083452 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 12 20:04:37.090316 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 12 20:04:37.091929 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 12 20:04:37.092213 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 12 20:04:37.101677 systemd[1]: Finished ensure-sysext.service.
Dec 12 20:04:37.103208 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 12 20:04:37.106577 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Dec 12 20:04:37.114177 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 12 20:04:37.115465 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 12 20:04:37.116778 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 12 20:04:37.124840 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 12 20:04:37.125188 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 12 20:04:37.138366 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 12 20:04:37.144445 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 12 20:04:37.146491 augenrules[1472]: No rules
Dec 12 20:04:37.150891 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 12 20:04:37.151351 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 12 20:04:37.172700 systemd-udevd[1445]: Using default interface naming scheme 'v255'.
Dec 12 20:04:37.191142 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 12 20:04:37.205353 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 12 20:04:37.206893 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 12 20:04:37.216125 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 12 20:04:37.217212 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 12 20:04:37.222578 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 12 20:04:37.341878 systemd-resolved[1443]: Positive Trust Anchors:
Dec 12 20:04:37.341901 systemd-resolved[1443]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 12 20:04:37.341946 systemd-resolved[1443]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 12 20:04:37.351072 systemd-resolved[1443]: Using system hostname 'srv-n0ssy.gb1.brightbox.com'.
Dec 12 20:04:37.353729 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 12 20:04:37.356461 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 12 20:04:37.441798 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Dec 12 20:04:37.443506 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 12 20:04:37.452309 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 12 20:04:37.453119 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 12 20:04:37.453888 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Dec 12 20:04:37.454644 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 12 20:04:37.455435 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 12 20:04:37.455482 systemd[1]: Reached target paths.target - Path Units.
Dec 12 20:04:37.456113 systemd[1]: Reached target time-set.target - System Time Set.
Dec 12 20:04:37.456976 systemd-networkd[1490]: lo: Link UP
Dec 12 20:04:37.457040 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 12 20:04:37.457856 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 12 20:04:37.458628 systemd[1]: Reached target timers.target - Timer Units.
Dec 12 20:04:37.460327 systemd-networkd[1490]: lo: Gained carrier
Dec 12 20:04:37.461042 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 12 20:04:37.463416 systemd-networkd[1490]: Enumeration completed
Dec 12 20:04:37.465526 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 12 20:04:37.475964 systemd-timesyncd[1468]: No network connectivity, watching for changes.
Dec 12 20:04:37.478471 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 12 20:04:37.480079 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 12 20:04:37.480916 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 12 20:04:37.490242 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 12 20:04:37.492199 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 12 20:04:37.494376 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 12 20:04:37.495616 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 12 20:04:37.497710 systemd[1]: Reached target network.target - Network.
Dec 12 20:04:37.498390 systemd[1]: Reached target sockets.target - Socket Units.
Dec 12 20:04:37.499111 systemd[1]: Reached target basic.target - Basic System.
Dec 12 20:04:37.499859 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 12 20:04:37.499916 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 12 20:04:37.501999 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 12 20:04:37.508480 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 12 20:04:37.510770 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 12 20:04:37.516079 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 12 20:04:37.519506 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 12 20:04:37.521596 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 12 20:04:37.523373 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 12 20:04:37.528587 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Dec 12 20:04:37.533565 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 12 20:04:37.543527 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 12 20:04:37.551957 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 12 20:04:37.557950 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 12 20:04:37.568998 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 12 20:04:37.573587 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 12 20:04:37.578710 jq[1521]: false
Dec 12 20:04:37.584562 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 12 20:04:37.588700 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 12 20:04:37.589502 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 12 20:04:37.593046 systemd[1]: Starting update-engine.service - Update Engine...
Dec 12 20:04:37.598682 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:37.601853 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 12 20:04:37.609369 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Dec 12 20:04:37.606618 oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Dec 12 20:04:37.618364 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 12 20:04:37.619811 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 12 20:04:37.620172 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 12 20:04:37.627847 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 12 20:04:37.628451 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 12 20:04:37.631136 oslogin_cache_refresh[1523]: Failure getting users, quitting
Dec 12 20:04:37.631481 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting users, quitting
Dec 12 20:04:37.631481 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Dec 12 20:04:37.631481 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing group entry cache
Dec 12 20:04:37.631168 oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Dec 12 20:04:37.631244 oslogin_cache_refresh[1523]: Refreshing group entry cache
Dec 12 20:04:37.650062 oslogin_cache_refresh[1523]: Failure getting groups, quitting
Dec 12 20:04:37.652530 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting groups, quitting
Dec 12 20:04:37.652530 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Dec 12 20:04:37.650080 oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Dec 12 20:04:37.662774 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Dec 12 20:04:37.663139 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Dec 12 20:04:37.672364 jq[1533]: true
Dec 12 20:04:37.682984 systemd[1]: motdgen.service: Deactivated successfully.
Dec 12 20:04:37.683447 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 12 20:04:37.698578 extend-filesystems[1522]: Found /dev/vda6
Dec 12 20:04:37.708714 extend-filesystems[1522]: Found /dev/vda9
Dec 12 20:04:37.726323 tar[1542]: linux-amd64/LICENSE
Dec 12 20:04:37.726323 tar[1542]: linux-amd64/helm
Dec 12 20:04:37.729248 extend-filesystems[1522]: Checking size of /dev/vda9
Dec 12 20:04:37.748693 update_engine[1531]: I20251212 20:04:37.748059 1531 main.cc:92] Flatcar Update Engine starting
Dec 12 20:04:37.752610 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 12 20:04:37.752776 (ntainerd)[1555]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Dec 12 20:04:37.790172 jq[1554]: true
Dec 12 20:04:37.791587 dbus-daemon[1519]: [system] SELinux support is enabled
Dec 12 20:04:37.796858 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 12 20:04:37.802995 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 12 20:04:37.803057 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 12 20:04:37.803942 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 12 20:04:37.803979 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 12 20:04:37.817499 extend-filesystems[1522]: Resized partition /dev/vda9
Dec 12 20:04:37.823596 systemd[1]: Started update-engine.service - Update Engine.
Dec 12 20:04:37.828880 update_engine[1531]: I20251212 20:04:37.828801 1531 update_check_scheduler.cc:74] Next update check in 4m9s
Dec 12 20:04:37.847774 extend-filesystems[1570]: resize2fs 1.47.3 (8-Jul-2025)
Dec 12 20:04:37.861410 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks
Dec 12 20:04:37.874745 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 12 20:04:38.003270 systemd-logind[1528]: New seat seat0.
Dec 12 20:04:38.007001 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 12 20:04:38.093824 bash[1585]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 20:04:38.098347 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 12 20:04:38.106603 systemd[1]: Starting sshkeys.service...
Dec 12 20:04:38.181218 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Dec 12 20:04:38.192681 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 12 20:04:38.198989 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 12 20:04:38.219303 kernel: EXT4-fs (vda9): resized filesystem to 15121403
Dec 12 20:04:38.227588 containerd[1555]: time="2025-12-12T20:04:38Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 12 20:04:38.255548 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:38.263321 extend-filesystems[1570]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Dec 12 20:04:38.263321 extend-filesystems[1570]: old_desc_blocks = 1, new_desc_blocks = 8
Dec 12 20:04:38.263321 extend-filesystems[1570]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long.
Dec 12 20:04:38.264879 systemd[1]: extend-filesystems.service: Deactivated successfully.
Dec 12 20:04:38.286538 containerd[1555]: time="2025-12-12T20:04:38.276650271Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Dec 12 20:04:38.286603 extend-filesystems[1522]: Resized filesystem in /dev/vda9
Dec 12 20:04:38.265350 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Dec 12 20:04:38.325076 kernel: mousedev: PS/2 mouse device common for all mice
Dec 12 20:04:38.347455 locksmithd[1571]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 12 20:04:38.354653 containerd[1555]: time="2025-12-12T20:04:38.354602465Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="39.872µs"
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.356571073Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.356622856Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.356912441Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.356946116Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357000092Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357118319Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357139753Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357489577Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357515133Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357532977Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357547535Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359054 containerd[1555]: time="2025-12-12T20:04:38.357705956Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359530 containerd[1555]: time="2025-12-12T20:04:38.358089004Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359530 containerd[1555]: time="2025-12-12T20:04:38.358188219Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 12 20:04:38.359530 containerd[1555]: time="2025-12-12T20:04:38.358210490Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 12 20:04:38.365700 containerd[1555]: time="2025-12-12T20:04:38.364398813Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 12 20:04:38.365700 containerd[1555]: time="2025-12-12T20:04:38.364914277Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 12 20:04:38.365700 containerd[1555]: time="2025-12-12T20:04:38.365021178Z" level=info msg="metadata content store policy set" policy=shared
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.372823320Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.372934689Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.372961192Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373053320Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373080590Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373099515Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373139713Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373165004Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373183260Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373199696Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373216419Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Dec 12 20:04:38.373457 containerd[1555]: time="2025-12-12T20:04:38.373238908Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.373958078Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374009710Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374050426Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374090314Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374112690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374141402Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374162266Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374181584Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374199509Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374216779Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Dec 12 20:04:38.374408 containerd[1555]: time="2025-12-12T20:04:38.374234518Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Dec 12 20:04:38.388316 containerd[1555]: time="2025-12-12T20:04:38.378366786Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Dec 12 20:04:38.388316 containerd[1555]: time="2025-12-12T20:04:38.378419071Z" level=info msg="Start snapshots syncer"
Dec 12 20:04:38.388316 containerd[1555]: time="2025-12-12T20:04:38.378506706Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Dec 12 20:04:38.380029 systemd-networkd[1490]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 20:04:38.380047 systemd-networkd[1490]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 12 20:04:38.382873 systemd-networkd[1490]: eth0: Link UP
Dec 12 20:04:38.384103 systemd-networkd[1490]: eth0: Gained carrier
Dec 12 20:04:38.384168 systemd-networkd[1490]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 12 20:04:38.393595 containerd[1555]: time="2025-12-12T20:04:38.386257703Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Dec 12 20:04:38.393595 containerd[1555]: time="2025-12-12T20:04:38.390221527Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390364329Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390583767Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390620377Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390641953Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390659947Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390679555Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390697129Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390726967Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390761433Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390780710Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390799555Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390859445Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390886247Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Dec 12 20:04:38.393891 containerd[1555]: time="2025-12-12T20:04:38.390901720Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.390918997Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.390936750Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.390953758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.390979364Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.391005115Z" level=info msg="runtime interface created"
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.391015832Z" level=info msg="created NRI interface"
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.391028892Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.391062654Z" level=info msg="Connect containerd service"
Dec 12 20:04:38.394412 containerd[1555]: time="2025-12-12T20:04:38.391092506Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Dec 12 20:04:38.405314 containerd[1555]: time="2025-12-12T20:04:38.401147512Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Dec 12 20:04:38.416642 systemd-networkd[1490]: eth0: DHCPv4 address 10.244.19.234/30, gateway 10.244.19.233 acquired from 10.244.19.233
Dec 12 20:04:38.417725 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Dec 12 20:04:38.421451 dbus-daemon[1519]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1490 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Dec 12 20:04:38.432599 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Dec 12 20:04:38.437019 systemd-timesyncd[1468]: Network configuration changed, trying to establish connection.
Dec 12 20:04:38.521193 kernel: ACPI: button: Power Button [PWRF]
Dec 12 20:04:38.688975 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 12 20:04:38.703717 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747202072Z" level=info msg="Start subscribing containerd event"
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747323870Z" level=info msg="Start recovering state"
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747549089Z" level=info msg="Start event monitor"
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747583307Z" level=info msg="Start cni network conf syncer for default"
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747601698Z" level=info msg="Start streaming server"
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747623093Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747635550Z" level=info msg="runtime interface starting up..."
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747645813Z" level=info msg="starting plugins..."
Dec 12 20:04:38.749558 containerd[1555]: time="2025-12-12T20:04:38.747669136Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Dec 12 20:04:38.757576 containerd[1555]: time="2025-12-12T20:04:38.755327037Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Dec 12 20:04:38.757576 containerd[1555]: time="2025-12-12T20:04:38.757408637Z" level=info msg=serving... address=/run/containerd/containerd.sock
Dec 12 20:04:38.757972 systemd[1]: Started containerd.service - containerd container runtime.
Dec 12 20:04:38.759251 containerd[1555]: time="2025-12-12T20:04:38.759223852Z" level=info msg="containerd successfully booted in 0.532731s"
Dec 12 20:04:38.785358 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Dec 12 20:04:38.811817 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Dec 12 20:04:38.842835 systemd[1]: Started systemd-hostnamed.service - Hostname Service.
Dec 12 20:04:38.844961 dbus-daemon[1519]: [system] Successfully activated service 'org.freedesktop.hostname1'
Dec 12 20:04:38.847861 dbus-daemon[1519]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1611 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0")
Dec 12 20:04:38.857260 systemd[1]: Starting polkit.service - Authorization Manager...
Dec 12 20:04:38.880323 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Dec 12 20:04:38.886318 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 12 20:04:39.036852 tar[1542]: linux-amd64/README.md
Dec 12 20:04:39.075430 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Dec 12 20:04:39.099181 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 12 20:04:39.413412 polkitd[1629]: Started polkitd version 126
Dec 12 20:04:39.435793 polkitd[1629]: Loading rules from directory /etc/polkit-1/rules.d
Dec 12 20:04:39.436173 systemd-timesyncd[1468]: Contacted time server 91.109.118.94:123 (3.flatcar.pool.ntp.org).
Dec 12 20:04:39.436267 systemd-timesyncd[1468]: Initial clock synchronization to Fri 2025-12-12 20:04:39.111428 UTC.
Dec 12 20:04:39.445267 polkitd[1629]: Loading rules from directory /run/polkit-1/rules.d
Dec 12 20:04:39.445373 polkitd[1629]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4)
Dec 12 20:04:39.445754 polkitd[1629]: Loading rules from directory /usr/local/share/polkit-1/rules.d
Dec 12 20:04:39.445794 polkitd[1629]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4)
Dec 12 20:04:39.445859 polkitd[1629]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 12 20:04:39.450396 polkitd[1629]: Finished loading, compiling and executing 2 rules
Dec 12 20:04:39.455007 dbus-daemon[1519]: [system] Successfully activated service 'org.freedesktop.PolicyKit1'
Dec 12 20:04:39.455497 systemd[1]: Started polkit.service - Authorization Manager.
Dec 12 20:04:39.458330 polkitd[1629]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 12 20:04:39.466946 systemd-logind[1528]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Dec 12 20:04:39.507575 systemd-logind[1528]: Watching system buttons on /dev/input/event3 (Power Button)
Dec 12 20:04:39.511499 systemd-hostnamed[1611]: Hostname set to <srv-n0ssy.gb1.brightbox.com> (static)
Dec 12 20:04:39.713035 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 12 20:04:39.781558 systemd-networkd[1490]: eth0: Gained IPv6LL
Dec 12 20:04:39.787596 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Dec 12 20:04:39.789582 systemd[1]: Reached target network-online.target - Network is Online.
Dec 12 20:04:39.798954 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:04:39.804752 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Dec 12 20:04:39.882093 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Dec 12 20:04:39.984225 sshd_keygen[1561]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Dec 12 20:04:40.022237 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Dec 12 20:04:40.034141 systemd[1]: Starting issuegen.service - Generate /run/issue...
Dec 12 20:04:40.039178 systemd[1]: Started sshd@0-10.244.19.234:22-147.75.109.163:49462.service - OpenSSH per-connection server daemon (147.75.109.163:49462).
Dec 12 20:04:40.076173 systemd[1]: issuegen.service: Deactivated successfully.
Dec 12 20:04:40.076642 systemd[1]: Finished issuegen.service - Generate /run/issue.
Dec 12 20:04:40.084710 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Dec 12 20:04:40.123333 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Dec 12 20:04:40.129199 systemd[1]: Started getty@tty1.service - Getty on tty1.
Dec 12 20:04:40.133853 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Dec 12 20:04:40.134934 systemd[1]: Reached target getty.target - Login Prompts.
Dec 12 20:04:40.860905 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:40.866325 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:40.938270 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:04:40.949832 (kubelet)[1699]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 20:04:40.960260 sshd[1682]: Accepted publickey for core from 147.75.109.163 port 49462 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:04:40.962090 sshd-session[1682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:04:40.975919 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Dec 12 20:04:40.978425 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Dec 12 20:04:41.004576 systemd-logind[1528]: New session 1 of user core.
Dec 12 20:04:41.019700 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Dec 12 20:04:41.027481 systemd[1]: Starting user@500.service - User Manager for UID 500...
Dec 12 20:04:41.047411 (systemd)[1702]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Dec 12 20:04:41.054998 systemd-logind[1528]: New session c1 of user core.
Dec 12 20:04:41.245153 systemd[1702]: Queued start job for default target default.target.
Dec 12 20:04:41.252143 systemd[1702]: Created slice app.slice - User Application Slice.
Dec 12 20:04:41.252398 systemd[1702]: Reached target paths.target - Paths.
Dec 12 20:04:41.252595 systemd[1702]: Reached target timers.target - Timers.
Dec 12 20:04:41.254719 systemd[1702]: Starting dbus.socket - D-Bus User Message Bus Socket...
Dec 12 20:04:41.292047 systemd-networkd[1490]: eth0: Ignoring DHCPv6 address 2a02:1348:17d:4fa:24:19ff:fef4:13ea/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17d:4fa:24:19ff:fef4:13ea/64 assigned by NDisc.
Dec 12 20:04:41.292061 systemd-networkd[1490]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Dec 12 20:04:41.301022 systemd[1702]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Dec 12 20:04:41.301223 systemd[1702]: Reached target sockets.target - Sockets.
Dec 12 20:04:41.301319 systemd[1702]: Reached target basic.target - Basic System.
Dec 12 20:04:41.301396 systemd[1702]: Reached target default.target - Main User Target.
Dec 12 20:04:41.301456 systemd[1702]: Startup finished in 234ms.
Dec 12 20:04:41.301684 systemd[1]: Started user@500.service - User Manager for UID 500.
Dec 12 20:04:41.312616 systemd[1]: Started session-1.scope - Session 1 of User core.
Dec 12 20:04:41.630635 kubelet[1699]: E1212 20:04:41.630468 1699 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 20:04:41.633940 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 20:04:41.634206 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 20:04:41.635183 systemd[1]: kubelet.service: Consumed 1.134s CPU time, 267.9M memory peak.
Dec 12 20:04:41.975520 systemd[1]: Started sshd@1-10.244.19.234:22-147.75.109.163:49468.service - OpenSSH per-connection server daemon (147.75.109.163:49468).
Dec 12 20:04:42.878372 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:42.884313 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:42.949641 sshd[1720]: Accepted publickey for core from 147.75.109.163 port 49468 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:04:42.951672 sshd-session[1720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:04:42.959925 systemd-logind[1528]: New session 2 of user core.
Dec 12 20:04:42.971640 systemd[1]: Started session-2.scope - Session 2 of User core.
Dec 12 20:04:43.617829 sshd[1725]: Connection closed by 147.75.109.163 port 49468
Dec 12 20:04:43.618718 sshd-session[1720]: pam_unix(sshd:session): session closed for user core
Dec 12 20:04:43.624728 systemd[1]: sshd@1-10.244.19.234:22-147.75.109.163:49468.service: Deactivated successfully.
Dec 12 20:04:43.627140 systemd[1]: session-2.scope: Deactivated successfully.
Dec 12 20:04:43.628484 systemd-logind[1528]: Session 2 logged out. Waiting for processes to exit.
Dec 12 20:04:43.630450 systemd-logind[1528]: Removed session 2.
Dec 12 20:04:43.762599 systemd[1]: Started sshd@2-10.244.19.234:22-147.75.109.163:36882.service - OpenSSH per-connection server daemon (147.75.109.163:36882).
Dec 12 20:04:44.659949 sshd[1731]: Accepted publickey for core from 147.75.109.163 port 36882 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:04:44.661620 sshd-session[1731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:04:44.668618 systemd-logind[1528]: New session 3 of user core.
Dec 12 20:04:44.676701 systemd[1]: Started session-3.scope - Session 3 of User core.
Dec 12 20:04:45.235526 login[1690]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Dec 12 20:04:45.238543 login[1691]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Dec 12 20:04:45.245393 systemd-logind[1528]: New session 4 of user core.
Dec 12 20:04:45.253627 systemd[1]: Started session-4.scope - Session 4 of User core.
Dec 12 20:04:45.259547 systemd-logind[1528]: New session 5 of user core.
Dec 12 20:04:45.265624 systemd[1]: Started session-5.scope - Session 5 of User core.
Dec 12 20:04:45.275251 sshd[1734]: Connection closed by 147.75.109.163 port 36882
Dec 12 20:04:45.276633 sshd-session[1731]: pam_unix(sshd:session): session closed for user core
Dec 12 20:04:45.285457 systemd[1]: sshd@2-10.244.19.234:22-147.75.109.163:36882.service: Deactivated successfully.
Dec 12 20:04:45.289828 systemd[1]: session-3.scope: Deactivated successfully.
Dec 12 20:04:45.294248 systemd-logind[1528]: Session 3 logged out. Waiting for processes to exit.
Dec 12 20:04:45.297353 systemd-logind[1528]: Removed session 3.
Dec 12 20:04:46.896329 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:46.899334 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 12 20:04:46.911153 coreos-metadata[1518]: Dec 12 20:04:46.910 WARN failed to locate config-drive, using the metadata service API instead
Dec 12 20:04:46.915491 coreos-metadata[1596]: Dec 12 20:04:46.915 WARN failed to locate config-drive, using the metadata service API instead
Dec 12 20:04:46.936318 coreos-metadata[1518]: Dec 12 20:04:46.935 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Dec 12 20:04:46.938336 coreos-metadata[1596]: Dec 12 20:04:46.938 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Dec 12 20:04:46.946061 coreos-metadata[1518]: Dec 12 20:04:46.946 INFO Fetch failed with 404: resource not found
Dec 12 20:04:46.946350 coreos-metadata[1518]: Dec 12 20:04:46.946 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 12 20:04:46.947034 coreos-metadata[1518]: Dec 12 20:04:46.947 INFO Fetch successful
Dec 12 20:04:46.947418 coreos-metadata[1518]: Dec 12 20:04:46.947 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Dec 12 20:04:46.958254 coreos-metadata[1518]: Dec 12 20:04:46.958 INFO Fetch successful
Dec 12 20:04:46.958702 coreos-metadata[1518]: Dec 12 20:04:46.958 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Dec 12 20:04:46.970662 coreos-metadata[1596]: Dec 12 20:04:46.970 INFO Fetch successful
Dec 12 20:04:46.970868 coreos-metadata[1596]: Dec 12 20:04:46.970 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Dec 12 20:04:46.985927 coreos-metadata[1518]: Dec 12 20:04:46.985 INFO Fetch successful
Dec 12 20:04:46.986214 coreos-metadata[1518]: Dec 12 20:04:46.986 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Dec 12 20:04:47.009417 coreos-metadata[1518]: Dec 12 20:04:47.009 INFO Fetch successful
Dec 12 20:04:47.009730 coreos-metadata[1518]: Dec 12 20:04:47.009 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Dec 12 20:04:47.011627 coreos-metadata[1596]: Dec 12 20:04:47.011 INFO Fetch successful
Dec 12 20:04:47.013796 unknown[1596]: wrote ssh authorized keys file for user: core
Dec 12 20:04:47.030653 coreos-metadata[1518]: Dec 12 20:04:47.030 INFO Fetch successful
Dec 12 20:04:47.041412 update-ssh-keys[1768]: Updated "/home/core/.ssh/authorized_keys"
Dec 12 20:04:47.049754 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Dec 12 20:04:47.055871 systemd[1]: Finished sshkeys.service.
Dec 12 20:04:47.070702 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Dec 12 20:04:47.071498 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Dec 12 20:04:47.071711 systemd[1]: Reached target multi-user.target - Multi-User System.
Dec 12 20:04:47.071936 systemd[1]: Startup finished in 3.518s (kernel) + 14.365s (initrd) + 13.113s (userspace) = 30.997s.
Dec 12 20:04:51.884901 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Dec 12 20:04:51.887451 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:04:52.175744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:04:52.192074 (kubelet)[1785]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Dec 12 20:04:52.251831 kubelet[1785]: E1212 20:04:52.251760 1785 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Dec 12 20:04:52.256920 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Dec 12 20:04:52.257168 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Dec 12 20:04:52.258086 systemd[1]: kubelet.service: Consumed 247ms CPU time, 110.5M memory peak.
Dec 12 20:04:55.312936 systemd[1]: Started sshd@3-10.244.19.234:22-147.75.109.163:44640.service - OpenSSH per-connection server daemon (147.75.109.163:44640).
Dec 12 20:04:56.241628 sshd[1792]: Accepted publickey for core from 147.75.109.163 port 44640 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:04:56.243667 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:04:56.253058 systemd-logind[1528]: New session 6 of user core.
Dec 12 20:04:56.256535 systemd[1]: Started session-6.scope - Session 6 of User core.
Dec 12 20:04:56.866339 sshd[1795]: Connection closed by 147.75.109.163 port 44640
Dec 12 20:04:56.867409 sshd-session[1792]: pam_unix(sshd:session): session closed for user core
Dec 12 20:04:56.872659 systemd[1]: sshd@3-10.244.19.234:22-147.75.109.163:44640.service: Deactivated successfully.
Dec 12 20:04:56.875234 systemd[1]: session-6.scope: Deactivated successfully.
Dec 12 20:04:56.876508 systemd-logind[1528]: Session 6 logged out. Waiting for processes to exit.
Dec 12 20:04:56.879544 systemd-logind[1528]: Removed session 6.
Dec 12 20:04:57.031023 systemd[1]: Started sshd@4-10.244.19.234:22-147.75.109.163:44652.service - OpenSSH per-connection server daemon (147.75.109.163:44652).
Dec 12 20:04:57.954268 sshd[1801]: Accepted publickey for core from 147.75.109.163 port 44652 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:04:57.956481 sshd-session[1801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:04:57.964013 systemd-logind[1528]: New session 7 of user core.
Dec 12 20:04:57.975581 systemd[1]: Started session-7.scope - Session 7 of User core.
Dec 12 20:04:58.570372 sshd[1804]: Connection closed by 147.75.109.163 port 44652
Dec 12 20:04:58.571428 sshd-session[1801]: pam_unix(sshd:session): session closed for user core
Dec 12 20:04:58.578183 systemd[1]: sshd@4-10.244.19.234:22-147.75.109.163:44652.service: Deactivated successfully.
Dec 12 20:04:58.580708 systemd[1]: session-7.scope: Deactivated successfully.
Dec 12 20:04:58.581952 systemd-logind[1528]: Session 7 logged out. Waiting for processes to exit.
Dec 12 20:04:58.584175 systemd-logind[1528]: Removed session 7.
Dec 12 20:04:58.727199 systemd[1]: Started sshd@5-10.244.19.234:22-147.75.109.163:44668.service - OpenSSH per-connection server daemon (147.75.109.163:44668).
Dec 12 20:04:59.642677 sshd[1810]: Accepted publickey for core from 147.75.109.163 port 44668 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:04:59.645041 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:04:59.655387 systemd-logind[1528]: New session 8 of user core.
Dec 12 20:04:59.662515 systemd[1]: Started session-8.scope - Session 8 of User core.
Dec 12 20:05:00.267854 sshd[1813]: Connection closed by 147.75.109.163 port 44668
Dec 12 20:05:00.267644 sshd-session[1810]: pam_unix(sshd:session): session closed for user core
Dec 12 20:05:00.273013 systemd-logind[1528]: Session 8 logged out. Waiting for processes to exit.
Dec 12 20:05:00.273534 systemd[1]: sshd@5-10.244.19.234:22-147.75.109.163:44668.service: Deactivated successfully.
Dec 12 20:05:00.276425 systemd[1]: session-8.scope: Deactivated successfully.
Dec 12 20:05:00.279990 systemd-logind[1528]: Removed session 8.
Dec 12 20:05:00.425889 systemd[1]: Started sshd@6-10.244.19.234:22-147.75.109.163:44682.service - OpenSSH per-connection server daemon (147.75.109.163:44682).
Dec 12 20:05:01.365141 sshd[1819]: Accepted publickey for core from 147.75.109.163 port 44682 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:05:01.367254 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:05:01.376038 systemd-logind[1528]: New session 9 of user core.
Dec 12 20:05:01.386688 systemd[1]: Started session-9.scope - Session 9 of User core.
Dec 12 20:05:01.866840 sudo[1823]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Dec 12 20:05:01.867326 sudo[1823]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Dec 12 20:05:01.879356 sudo[1823]: pam_unix(sudo:session): session closed for user root
Dec 12 20:05:02.027314 sshd[1822]: Connection closed by 147.75.109.163 port 44682
Dec 12 20:05:02.026388 sshd-session[1819]: pam_unix(sshd:session): session closed for user core
Dec 12 20:05:02.031602 systemd[1]: sshd@6-10.244.19.234:22-147.75.109.163:44682.service: Deactivated successfully.
Dec 12 20:05:02.034815 systemd[1]: session-9.scope: Deactivated successfully.
Dec 12 20:05:02.038038 systemd-logind[1528]: Session 9 logged out. Waiting for processes to exit.
Dec 12 20:05:02.040688 systemd-logind[1528]: Removed session 9.
Dec 12 20:05:02.183835 systemd[1]: Started sshd@7-10.244.19.234:22-147.75.109.163:44686.service - OpenSSH per-connection server daemon (147.75.109.163:44686).
Dec 12 20:05:02.298546 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Dec 12 20:05:02.301894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:05:02.495043 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:05:02.507255 (kubelet)[1840]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 20:05:02.572187 kubelet[1840]: E1212 20:05:02.572106 1840 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 20:05:02.576148 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 20:05:02.576451 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 20:05:02.577412 systemd[1]: kubelet.service: Consumed 236ms CPU time, 110.2M memory peak. Dec 12 20:05:03.104360 sshd[1829]: Accepted publickey for core from 147.75.109.163 port 44686 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:05:03.104480 sshd-session[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:05:03.112973 systemd-logind[1528]: New session 10 of user core. Dec 12 20:05:03.122523 systemd[1]: Started session-10.scope - Session 10 of User core. Dec 12 20:05:03.585536 sudo[1849]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 12 20:05:03.586732 sudo[1849]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 20:05:03.594828 sudo[1849]: pam_unix(sudo:session): session closed for user root Dec 12 20:05:03.603885 sudo[1848]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 12 20:05:03.604384 sudo[1848]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 20:05:03.620865 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 12 20:05:03.670447 augenrules[1871]: No rules Dec 12 20:05:03.672182 systemd[1]: audit-rules.service: Deactivated successfully. Dec 12 20:05:03.672643 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 12 20:05:03.674613 sudo[1848]: pam_unix(sudo:session): session closed for user root Dec 12 20:05:03.820337 sshd[1847]: Connection closed by 147.75.109.163 port 44686 Dec 12 20:05:03.821427 sshd-session[1829]: pam_unix(sshd:session): session closed for user core Dec 12 20:05:03.828506 systemd[1]: sshd@7-10.244.19.234:22-147.75.109.163:44686.service: Deactivated successfully. Dec 12 20:05:03.831384 systemd[1]: session-10.scope: Deactivated successfully. Dec 12 20:05:03.833021 systemd-logind[1528]: Session 10 logged out. Waiting for processes to exit. Dec 12 20:05:03.834958 systemd-logind[1528]: Removed session 10. Dec 12 20:05:03.992030 systemd[1]: Started sshd@8-10.244.19.234:22-147.75.109.163:43090.service - OpenSSH per-connection server daemon (147.75.109.163:43090). Dec 12 20:05:04.928902 sshd[1880]: Accepted publickey for core from 147.75.109.163 port 43090 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:05:04.930974 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:05:04.938576 systemd-logind[1528]: New session 11 of user core. Dec 12 20:05:04.947517 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 12 20:05:05.415682 sudo[1884]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 12 20:05:05.416218 sudo[1884]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 12 20:05:05.952046 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 12 20:05:05.968225 (dockerd)[1903]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 12 20:05:06.363409 dockerd[1903]: time="2025-12-12T20:05:06.363220778Z" level=info msg="Starting up" Dec 12 20:05:06.365741 dockerd[1903]: time="2025-12-12T20:05:06.365644765Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 12 20:05:06.384803 dockerd[1903]: time="2025-12-12T20:05:06.384626148Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 12 20:05:06.412709 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2524535573-merged.mount: Deactivated successfully. Dec 12 20:05:06.456183 dockerd[1903]: time="2025-12-12T20:05:06.456095211Z" level=info msg="Loading containers: start." Dec 12 20:05:06.473343 kernel: Initializing XFRM netlink socket Dec 12 20:05:06.831937 systemd-networkd[1490]: docker0: Link UP Dec 12 20:05:06.838444 dockerd[1903]: time="2025-12-12T20:05:06.837404381Z" level=info msg="Loading containers: done." Dec 12 20:05:06.859720 dockerd[1903]: time="2025-12-12T20:05:06.859614286Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 12 20:05:06.859954 dockerd[1903]: time="2025-12-12T20:05:06.859737518Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 12 20:05:06.859954 dockerd[1903]: time="2025-12-12T20:05:06.859875051Z" level=info msg="Initializing buildkit" Dec 12 20:05:06.892236 dockerd[1903]: time="2025-12-12T20:05:06.892169368Z" level=info msg="Completed buildkit initialization" Dec 12 20:05:06.900688 dockerd[1903]: time="2025-12-12T20:05:06.900642397Z" level=info msg="Daemon has completed initialization" Dec 12 20:05:06.900688 dockerd[1903]: time="2025-12-12T20:05:06.900742315Z" level=info msg="API listen on /run/docker.sock" Dec 12 20:05:06.901912 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 12 20:05:07.404869 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2769805777-merged.mount: Deactivated successfully. Dec 12 20:05:08.138322 containerd[1555]: time="2025-12-12T20:05:08.137877079Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Dec 12 20:05:09.037622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2716913127.mount: Deactivated successfully. 
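Once dockerd logs "API listen on /run/docker.sock", the daemon is reachable over that unix socket. A hedged Go sketch (socket path taken from the log; /_ping is the Docker Engine API's health endpoint) that checks liveness without any client library:

package main

import (
	"context"
	"fmt"
	"net"
	"net/http"
)

func main() {
	// Route all HTTP traffic over the unix socket from the log above;
	// the "docker" host name is a placeholder the custom dialer ignores.
	client := &http.Client{Transport: &http.Transport{
		DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
			return (&net.Dialer{}).DialContext(ctx, "unix", "/run/docker.sock")
		},
	}}
	resp, err := client.Get("http://docker/_ping")
	if err != nil {
		fmt.Println("daemon not reachable:", err)
		return
	}
	defer resp.Body.Close()
	// Expect "200 OK" once "Daemon has completed initialization" has been logged.
	fmt.Println("docker daemon:", resp.Status)
}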
Dec 12 20:05:11.082328 containerd[1555]: time="2025-12-12T20:05:11.081851712Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:05:11.084848 containerd[1555]: time="2025-12-12T20:05:11.084794781Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=30114720" Dec 12 20:05:11.085722 containerd[1555]: time="2025-12-12T20:05:11.085668133Z" level=info msg="ImageCreate event name:\"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:05:11.090180 containerd[1555]: time="2025-12-12T20:05:11.090118404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:05:11.092759 containerd[1555]: time="2025-12-12T20:05:11.092717185Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"30111311\" in 2.954720265s" Dec 12 20:05:11.092843 containerd[1555]: time="2025-12-12T20:05:11.092777887Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:021d1ceeffb11df7a9fb9adfa0ad0a30dcd13cb3d630022066f184cdcb93731b\"" Dec 12 20:05:11.093865 containerd[1555]: time="2025-12-12T20:05:11.093819390Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Dec 12 20:05:11.341320 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Dec 12 20:05:12.798530 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 12 20:05:12.803612 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 20:05:13.194569 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 20:05:13.206953 (kubelet)[2192]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 20:05:13.293342 kubelet[2192]: E1212 20:05:13.292404 2192 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 20:05:13.296984 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 20:05:13.297483 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 20:05:13.298549 systemd[1]: kubelet.service: Consumed 265ms CPU time, 110.2M memory peak. 
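The pull figures above permit a quick throughput estimate: 30114720 bytes read for kube-apiserver:v1.33.7 in the reported 2.954720265s. A sketch of the arithmetic:

package main

import (
	"fmt"
	"time"
)

func main() {
	const bytesRead = 30114720                 // "bytes read=30114720" from the log
	d, _ := time.ParseDuration("2.954720265s") // pull time reported by containerd
	mibps := float64(bytesRead) / d.Seconds() / (1 << 20)
	fmt.Printf("~%.1f MiB/s\n", mibps) // ≈ 9.7 MiB/s effective registry throughput
}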
Dec 12 20:05:13.494759 containerd[1555]: time="2025-12-12T20:05:13.494260522Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:13.496674 containerd[1555]: time="2025-12-12T20:05:13.496556392Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=26016789"
Dec 12 20:05:13.497901 containerd[1555]: time="2025-12-12T20:05:13.497149490Z" level=info msg="ImageCreate event name:\"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:13.500780 containerd[1555]: time="2025-12-12T20:05:13.500740417Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:13.502257 containerd[1555]: time="2025-12-12T20:05:13.502220644Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"27673815\" in 2.408355148s"
Dec 12 20:05:13.502437 containerd[1555]: time="2025-12-12T20:05:13.502408250Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:29c7cab9d8e681d047281fd3711baf13c28f66923480fb11c8f22ddb7ca742d1\""
Dec 12 20:05:13.503208 containerd[1555]: time="2025-12-12T20:05:13.503159878Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\""
Dec 12 20:05:15.443343 containerd[1555]: time="2025-12-12T20:05:15.442622216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:15.445011 containerd[1555]: time="2025-12-12T20:05:15.444726009Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=20158110"
Dec 12 20:05:15.445890 containerd[1555]: time="2025-12-12T20:05:15.445847681Z" level=info msg="ImageCreate event name:\"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:15.449516 containerd[1555]: time="2025-12-12T20:05:15.449474503Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:15.453615 containerd[1555]: time="2025-12-12T20:05:15.453567430Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"21815154\" in 1.950359857s"
Dec 12 20:05:15.453763 containerd[1555]: time="2025-12-12T20:05:15.453730992Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:f457f6fcd712acb5b9beef873f6f4a4869182f9eb52ea6e24824fd4ac4eed393\""
Dec 12 20:05:15.456137 containerd[1555]: time="2025-12-12T20:05:15.455608304Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\""
Dec 12 20:05:17.974207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3390242586.mount: Deactivated successfully.
Dec 12 20:05:18.790169 containerd[1555]: time="2025-12-12T20:05:18.790079205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:18.791768 containerd[1555]: time="2025-12-12T20:05:18.791710641Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=31930104"
Dec 12 20:05:18.791973 containerd[1555]: time="2025-12-12T20:05:18.791939766Z" level=info msg="ImageCreate event name:\"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:18.795236 containerd[1555]: time="2025-12-12T20:05:18.795195317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:18.796125 containerd[1555]: time="2025-12-12T20:05:18.796088699Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"31929115\" in 3.340435935s"
Dec 12 20:05:18.796283 containerd[1555]: time="2025-12-12T20:05:18.796254402Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:0929027b17fc30cb9de279f3bdba4e130b991a1dab7978a7db2e5feb2091853c\""
Dec 12 20:05:18.797341 containerd[1555]: time="2025-12-12T20:05:18.797313850Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Dec 12 20:05:19.490146 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3342243534.mount: Deactivated successfully.
Dec 12 20:05:20.925965 containerd[1555]: time="2025-12-12T20:05:20.925874847Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:05:20.927355 containerd[1555]: time="2025-12-12T20:05:20.927321662Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942246" Dec 12 20:05:20.929334 containerd[1555]: time="2025-12-12T20:05:20.928395502Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:05:20.932563 containerd[1555]: time="2025-12-12T20:05:20.931751938Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:05:20.933411 containerd[1555]: time="2025-12-12T20:05:20.933360762Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.135918365s" Dec 12 20:05:20.933502 containerd[1555]: time="2025-12-12T20:05:20.933417878Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Dec 12 20:05:20.935389 containerd[1555]: time="2025-12-12T20:05:20.935355846Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 12 20:05:21.619126 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2672678833.mount: Deactivated successfully. 
Dec 12 20:05:21.626218 containerd[1555]: time="2025-12-12T20:05:21.626118357Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 20:05:21.628235 containerd[1555]: time="2025-12-12T20:05:21.628175461Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Dec 12 20:05:21.629190 containerd[1555]: time="2025-12-12T20:05:21.629114906Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 20:05:21.633389 containerd[1555]: time="2025-12-12T20:05:21.633323115Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 12 20:05:21.634400 containerd[1555]: time="2025-12-12T20:05:21.634340337Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 698.942494ms" Dec 12 20:05:21.634907 containerd[1555]: time="2025-12-12T20:05:21.634866701Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Dec 12 20:05:21.635743 containerd[1555]: time="2025-12-12T20:05:21.635710433Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Dec 12 20:05:22.307586 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1481124215.mount: Deactivated successfully. Dec 12 20:05:23.075713 update_engine[1531]: I20251212 20:05:23.074630 1531 update_attempter.cc:509] Updating boot flags... Dec 12 20:05:23.298123 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 12 20:05:23.303518 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 12 20:05:23.649001 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 12 20:05:23.665221 (kubelet)[2341]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 12 20:05:23.752765 kubelet[2341]: E1212 20:05:23.752689 2341 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 12 20:05:23.756372 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 12 20:05:23.756630 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 12 20:05:23.757197 systemd[1]: kubelet.service: Consumed 230ms CPU time, 105.5M memory peak. 
Dec 12 20:05:26.363758 containerd[1555]: time="2025-12-12T20:05:26.363637778Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:26.365522 containerd[1555]: time="2025-12-12T20:05:26.365484298Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58926235"
Dec 12 20:05:26.367061 containerd[1555]: time="2025-12-12T20:05:26.367014311Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:26.374304 containerd[1555]: time="2025-12-12T20:05:26.374209490Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:26.377819 containerd[1555]: time="2025-12-12T20:05:26.377778912Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 4.7420245s"
Dec 12 20:05:26.377924 containerd[1555]: time="2025-12-12T20:05:26.377824676Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
Dec 12 20:05:32.801086 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:05:32.801578 systemd[1]: kubelet.service: Consumed 230ms CPU time, 105.5M memory peak.
Dec 12 20:05:32.804745 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:05:32.845185 systemd[1]: Reload requested from client PID 2383 ('systemctl') (unit session-11.scope)...
Dec 12 20:05:32.845244 systemd[1]: Reloading...
Dec 12 20:05:33.065325 zram_generator::config[2424]: No configuration found.
Dec 12 20:05:33.370098 systemd[1]: Reloading finished in 524 ms.
Dec 12 20:05:33.454940 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Dec 12 20:05:33.455088 systemd[1]: kubelet.service: Failed with result 'signal'.
Dec 12 20:05:33.455934 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:05:33.456035 systemd[1]: kubelet.service: Consumed 149ms CPU time, 97.7M memory peak.
Dec 12 20:05:33.458657 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:05:33.661966 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:05:33.688034 (kubelet)[2495]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 12 20:05:33.754570 kubelet[2495]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 20:05:33.754570 kubelet[2495]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 12 20:05:33.754570 kubelet[2495]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 20:05:33.757178 kubelet[2495]: I1212 20:05:33.757103 2495 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 12 20:05:34.183371 kubelet[2495]: I1212 20:05:34.182913 2495 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Dec 12 20:05:34.183371 kubelet[2495]: I1212 20:05:34.182957 2495 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 12 20:05:34.183371 kubelet[2495]: I1212 20:05:34.183244 2495 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 12 20:05:34.224041 kubelet[2495]: I1212 20:05:34.223914 2495 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 12 20:05:34.227950 kubelet[2495]: E1212 20:05:34.227884 2495 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.244.19.234:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Dec 12 20:05:34.263202 kubelet[2495]: I1212 20:05:34.263118 2495 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 12 20:05:34.272258 kubelet[2495]: I1212 20:05:34.272130 2495 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 12 20:05:34.277861 kubelet[2495]: I1212 20:05:34.277542 2495 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 12 20:05:34.281857 kubelet[2495]: I1212 20:05:34.277595 2495 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-n0ssy.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 12 20:05:34.289372 kubelet[2495]: I1212 20:05:34.288868 2495 topology_manager.go:138] "Creating topology manager with none policy"
Dec 12 20:05:34.289372 kubelet[2495]: I1212 20:05:34.288908 2495 container_manager_linux.go:303] "Creating device plugin manager"
Dec 12 20:05:34.289372 kubelet[2495]: I1212 20:05:34.289172 2495 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 20:05:34.303744 kubelet[2495]: I1212 20:05:34.303542 2495 kubelet.go:480] "Attempting to sync node with API server"
Dec 12 20:05:34.303744 kubelet[2495]: I1212 20:05:34.303607 2495 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 12 20:05:34.305447 kubelet[2495]: I1212 20:05:34.305323 2495 kubelet.go:386] "Adding apiserver pod source"
Dec 12 20:05:34.307247 kubelet[2495]: I1212 20:05:34.306967 2495 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 12 20:05:34.310792 kubelet[2495]: E1212 20:05:34.310419 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.244.19.234:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-n0ssy.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 12 20:05:34.318462 kubelet[2495]: E1212 20:05:34.318426 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.244.19.234:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 12 20:05:34.319338 kubelet[2495]: I1212 20:05:34.319303 2495 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 12 20:05:34.320107 kubelet[2495]: I1212 20:05:34.320083 2495 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 12 20:05:34.321053 kubelet[2495]: W1212 20:05:34.321031 2495 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
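The nodeConfig blob above embeds kubelet's default hard-eviction thresholds: memory.available < 100Mi, nodefs.available < 10%, imagefs.available < 15%, and inode floors of 5%. A simplified Go sketch (illustrative only, not kubelet's eviction code) of how such a threshold is evaluated: percentage values are taken against capacity, fixed quantities are compared directly:

package main

import "fmt"

type threshold struct {
	signal     string
	percentage float64 // used when non-zero, e.g. 0.1 for nodefs.available
	quantity   int64   // bytes, e.g. 100Mi for memory.available
}

// exceeded reports whether the signal has fallen below its floor.
func exceeded(t threshold, available, capacity int64) bool {
	limit := t.quantity
	if t.percentage > 0 {
		limit = int64(t.percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	memory := threshold{signal: "memory.available", quantity: 100 << 20} // "100Mi" from the log
	fmt.Println(exceeded(memory, 64<<20, 8<<30)) // true: 64Mi available is below the 100Mi floor
}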
Dec 12 20:05:34.328762 kubelet[2495]: I1212 20:05:34.328673 2495 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 12 20:05:34.328889 kubelet[2495]: I1212 20:05:34.328774 2495 server.go:1289] "Started kubelet"
Dec 12 20:05:34.331818 kubelet[2495]: I1212 20:05:34.331497 2495 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 12 20:05:34.333011 kubelet[2495]: I1212 20:05:34.332400 2495 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 12 20:05:34.333011 kubelet[2495]: I1212 20:05:34.332859 2495 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 12 20:05:34.334256 kubelet[2495]: I1212 20:05:34.334231 2495 server.go:317] "Adding debug handlers to kubelet server"
Dec 12 20:05:34.337271 kubelet[2495]: I1212 20:05:34.337248 2495 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 12 20:05:34.342003 kubelet[2495]: E1212 20:05:34.339235 2495 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.244.19.234:6443/api/v1/namespaces/default/events\": dial tcp 10.244.19.234:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-n0ssy.gb1.brightbox.com.1880908039c5f6a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-n0ssy.gb1.brightbox.com,UID:srv-n0ssy.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-n0ssy.gb1.brightbox.com,},FirstTimestamp:2025-12-12 20:05:34.328723111 +0000 UTC m=+0.635502484,LastTimestamp:2025-12-12 20:05:34.328723111 +0000 UTC m=+0.635502484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-n0ssy.gb1.brightbox.com,}"
Dec 12 20:05:34.345371 kubelet[2495]: I1212 20:05:34.345140 2495 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 12 20:05:34.350895 kubelet[2495]: E1212 20:05:34.349863 2495 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-n0ssy.gb1.brightbox.com\" not found"
Dec 12 20:05:34.350895 kubelet[2495]: I1212 20:05:34.349931 2495 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 12 20:05:34.350895 kubelet[2495]: I1212 20:05:34.350235 2495 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 12 20:05:34.350895 kubelet[2495]: I1212 20:05:34.350394 2495 reconciler.go:26] "Reconciler: start to sync state"
Dec 12 20:05:34.350895 kubelet[2495]: E1212 20:05:34.350879 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.244.19.234:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Dec 12 20:05:34.353740 kubelet[2495]: E1212 20:05:34.353674 2495 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.19.234:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-n0ssy.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.19.234:6443: connect: connection refused" interval="200ms"
Dec 12 20:05:34.360622 kubelet[2495]: I1212 20:05:34.360590 2495 factory.go:223] Registration of the systemd container factory successfully
Dec 12 20:05:34.361116 kubelet[2495]: I1212 20:05:34.361077 2495 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 12 20:05:34.362699 kubelet[2495]: E1212 20:05:34.362423 2495 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 12 20:05:34.364480 kubelet[2495]: I1212 20:05:34.364455 2495 factory.go:223] Registration of the containerd container factory successfully
Dec 12 20:05:34.395002 kubelet[2495]: I1212 20:05:34.394661 2495 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 12 20:05:34.395002 kubelet[2495]: I1212 20:05:34.394699 2495 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 12 20:05:34.395002 kubelet[2495]: I1212 20:05:34.394728 2495 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 20:05:34.397785 kubelet[2495]: I1212 20:05:34.397731 2495 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Dec 12 20:05:34.399462 kubelet[2495]: I1212 20:05:34.399356 2495 policy_none.go:49] "None policy: Start"
Dec 12 20:05:34.399462 kubelet[2495]: I1212 20:05:34.399398 2495 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 12 20:05:34.399462 kubelet[2495]: I1212 20:05:34.399426 2495 state_mem.go:35] "Initializing new in-memory state store"
Dec 12 20:05:34.400080 kubelet[2495]: I1212 20:05:34.400041 2495 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Dec 12 20:05:34.400182 kubelet[2495]: I1212 20:05:34.400093 2495 status_manager.go:230] "Starting to sync pod status with apiserver"
Dec 12 20:05:34.400182 kubelet[2495]: I1212 20:05:34.400123 2495 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 12 20:05:34.400182 kubelet[2495]: I1212 20:05:34.400141 2495 kubelet.go:2436] "Starting kubelet main sync loop"
Dec 12 20:05:34.400374 kubelet[2495]: E1212 20:05:34.400207 2495 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 12 20:05:34.403726 kubelet[2495]: E1212 20:05:34.402921 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.244.19.234:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Dec 12 20:05:34.415052 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 12 20:05:34.437531 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Dec 12 20:05:34.444205 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Dec 12 20:05:34.450186 kubelet[2495]: E1212 20:05:34.450146 2495 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"srv-n0ssy.gb1.brightbox.com\" not found"
Dec 12 20:05:34.452859 kubelet[2495]: E1212 20:05:34.452832 2495 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 12 20:05:34.453146 kubelet[2495]: I1212 20:05:34.453125 2495 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 12 20:05:34.453213 kubelet[2495]: I1212 20:05:34.453159 2495 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 12 20:05:34.454268 kubelet[2495]: I1212 20:05:34.453625 2495 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 12 20:05:34.457734 kubelet[2495]: E1212 20:05:34.457489 2495 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 12 20:05:34.457734 kubelet[2495]: E1212 20:05:34.457558 2495 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-n0ssy.gb1.brightbox.com\" not found"
Dec 12 20:05:34.526515 systemd[1]: Created slice kubepods-burstable-pod9ce8762f63fe40ba8d96a6dc22b10b48.slice - libcontainer container kubepods-burstable-pod9ce8762f63fe40ba8d96a6dc22b10b48.slice.
Dec 12 20:05:34.550691 kubelet[2495]: E1212 20:05:34.550648 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.551689 kubelet[2495]: I1212 20:05:34.551663 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9ce8762f63fe40ba8d96a6dc22b10b48-ca-certs\") pod \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" (UID: \"9ce8762f63fe40ba8d96a6dc22b10b48\") " pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.552189 kubelet[2495]: I1212 20:05:34.552157 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9ce8762f63fe40ba8d96a6dc22b10b48-k8s-certs\") pod \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" (UID: \"9ce8762f63fe40ba8d96a6dc22b10b48\") " pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553056 kubelet[2495]: I1212 20:05:34.552729 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9ce8762f63fe40ba8d96a6dc22b10b48-usr-share-ca-certificates\") pod \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" (UID: \"9ce8762f63fe40ba8d96a6dc22b10b48\") " pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553056 kubelet[2495]: I1212 20:05:34.552773 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-flexvolume-dir\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553056 kubelet[2495]: I1212 20:05:34.552812 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-kubeconfig\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553056 kubelet[2495]: I1212 20:05:34.552842 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553056 kubelet[2495]: I1212 20:05:34.552868 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1cfeaa130ff4e4914b0c0ea993735145-kubeconfig\") pod \"kube-scheduler-srv-n0ssy.gb1.brightbox.com\" (UID: \"1cfeaa130ff4e4914b0c0ea993735145\") " pod="kube-system/kube-scheduler-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553363 kubelet[2495]: I1212 20:05:34.552894 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-ca-certs\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.553363 kubelet[2495]: I1212 20:05:34.552984 2495 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-k8s-certs\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.556162 kubelet[2495]: I1212 20:05:34.555911 2495 kubelet_node_status.go:75] "Attempting to register node" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.556672 kubelet[2495]: E1212 20:05:34.556433 2495 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.19.234:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-n0ssy.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.19.234:6443: connect: connection refused" interval="400ms"
Dec 12 20:05:34.557234 systemd[1]: Created slice kubepods-burstable-poda55aa0ee93e7794122aa73d201bae4d2.slice - libcontainer container kubepods-burstable-poda55aa0ee93e7794122aa73d201bae4d2.slice.
Dec 12 20:05:34.557971 kubelet[2495]: E1212 20:05:34.557696 2495 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.19.234:6443/api/v1/nodes\": dial tcp 10.244.19.234:6443: connect: connection refused" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.570869 kubelet[2495]: E1212 20:05:34.570415 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:34.575608 systemd[1]: Created slice kubepods-burstable-pod1cfeaa130ff4e4914b0c0ea993735145.slice - libcontainer container kubepods-burstable-pod1cfeaa130ff4e4914b0c0ea993735145.slice.
Dec 12 20:05:34.578783 kubelet[2495]: E1212 20:05:34.578754 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com" Dec 12 20:05:34.761750 kubelet[2495]: I1212 20:05:34.761615 2495 kubelet_node_status.go:75] "Attempting to register node" node="srv-n0ssy.gb1.brightbox.com" Dec 12 20:05:34.763773 kubelet[2495]: E1212 20:05:34.763733 2495 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.19.234:6443/api/v1/nodes\": dial tcp 10.244.19.234:6443: connect: connection refused" node="srv-n0ssy.gb1.brightbox.com" Dec 12 20:05:34.853276 containerd[1555]: time="2025-12-12T20:05:34.853200870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-n0ssy.gb1.brightbox.com,Uid:9ce8762f63fe40ba8d96a6dc22b10b48,Namespace:kube-system,Attempt:0,}" Dec 12 20:05:34.893625 containerd[1555]: time="2025-12-12T20:05:34.893571588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-n0ssy.gb1.brightbox.com,Uid:1cfeaa130ff4e4914b0c0ea993735145,Namespace:kube-system,Attempt:0,}" Dec 12 20:05:34.908526 containerd[1555]: time="2025-12-12T20:05:34.908152046Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-n0ssy.gb1.brightbox.com,Uid:a55aa0ee93e7794122aa73d201bae4d2,Namespace:kube-system,Attempt:0,}" Dec 12 20:05:34.961325 kubelet[2495]: E1212 20:05:34.959990 2495 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.19.234:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-n0ssy.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.19.234:6443: connect: connection refused" interval="800ms" Dec 12 20:05:35.018246 containerd[1555]: time="2025-12-12T20:05:35.017630657Z" level=info msg="connecting to shim 3e00f2a0d083f4d112f4f8b72ffa906f181da3f839c29f7235966c272dc0db19" address="unix:///run/containerd/s/a708b7f34d6c93c042b1de0af455ece5276027a6b2b283b90bccf08d8e334c05" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:05:35.034169 containerd[1555]: time="2025-12-12T20:05:35.033526137Z" level=info msg="connecting to shim 8a0ff5aba7725d28ebb9f329923a7066e78bc384d26d288e9beb216e0b7bc455" address="unix:///run/containerd/s/ebb56975c5082633447df7a8a6a4f2ceb3feafc022b93046aa46b776dd193349" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:05:35.046982 containerd[1555]: time="2025-12-12T20:05:35.046372570Z" level=info msg="connecting to shim 1185249adb119431dd6b33db0b2396d8af4a5fe9eda32b2c954cc4c1c79dd92c" address="unix:///run/containerd/s/0246a9e3edba2721e101a9e822a4115195f7a851e57b18e43a85ff5c33591f77" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:05:35.167716 kubelet[2495]: I1212 20:05:35.167678 2495 kubelet_node_status.go:75] "Attempting to register node" node="srv-n0ssy.gb1.brightbox.com" Dec 12 20:05:35.168094 kubelet[2495]: E1212 20:05:35.168055 2495 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.19.234:6443/api/v1/nodes\": dial tcp 10.244.19.234:6443: connect: connection refused" node="srv-n0ssy.gb1.brightbox.com" Dec 12 20:05:35.171589 systemd[1]: Started cri-containerd-1185249adb119431dd6b33db0b2396d8af4a5fe9eda32b2c954cc4c1c79dd92c.scope - libcontainer container 1185249adb119431dd6b33db0b2396d8af4a5fe9eda32b2c954cc4c1c79dd92c. 
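The RunPodSandbox entries above are containerd acting as kubelet's CRI runtime. A hedged sketch of a standalone client listing those sandboxes over the stock containerd socket (the endpoint path is an assumption for this host; the API itself is the real CRI v1 RuntimeService from k8s.io/cri-api):

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Plaintext gRPC over the local unix socket; CRI endpoints carry no TLS.
	conn, err := grpc.NewClient("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	resp, err := runtimeapi.NewRuntimeServiceClient(conn).
		ListPodSandbox(ctx, &runtimeapi.ListPodSandboxRequest{})
	if err != nil {
		panic(err)
	}
	for _, sb := range resp.Items {
		// Expect the kube-apiserver/-scheduler/-controller-manager sandboxes created above.
		fmt.Println(sb.Id[:12], sb.Metadata.Namespace+"/"+sb.Metadata.Name)
	}
}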
Dec 12 20:05:35.175198 systemd[1]: Started cri-containerd-3e00f2a0d083f4d112f4f8b72ffa906f181da3f839c29f7235966c272dc0db19.scope - libcontainer container 3e00f2a0d083f4d112f4f8b72ffa906f181da3f839c29f7235966c272dc0db19.
Dec 12 20:05:35.178830 systemd[1]: Started cri-containerd-8a0ff5aba7725d28ebb9f329923a7066e78bc384d26d288e9beb216e0b7bc455.scope - libcontainer container 8a0ff5aba7725d28ebb9f329923a7066e78bc384d26d288e9beb216e0b7bc455.
Dec 12 20:05:35.299782 containerd[1555]: time="2025-12-12T20:05:35.299710715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-n0ssy.gb1.brightbox.com,Uid:1cfeaa130ff4e4914b0c0ea993735145,Namespace:kube-system,Attempt:0,} returns sandbox id \"8a0ff5aba7725d28ebb9f329923a7066e78bc384d26d288e9beb216e0b7bc455\""
Dec 12 20:05:35.317649 containerd[1555]: time="2025-12-12T20:05:35.317568336Z" level=info msg="CreateContainer within sandbox \"8a0ff5aba7725d28ebb9f329923a7066e78bc384d26d288e9beb216e0b7bc455\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Dec 12 20:05:35.327959 containerd[1555]: time="2025-12-12T20:05:35.327816645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-n0ssy.gb1.brightbox.com,Uid:9ce8762f63fe40ba8d96a6dc22b10b48,Namespace:kube-system,Attempt:0,} returns sandbox id \"1185249adb119431dd6b33db0b2396d8af4a5fe9eda32b2c954cc4c1c79dd92c\""
Dec 12 20:05:35.333697 containerd[1555]: time="2025-12-12T20:05:35.333645598Z" level=info msg="CreateContainer within sandbox \"1185249adb119431dd6b33db0b2396d8af4a5fe9eda32b2c954cc4c1c79dd92c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Dec 12 20:05:35.343321 containerd[1555]: time="2025-12-12T20:05:35.343225728Z" level=info msg="Container 857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:05:35.343825 containerd[1555]: time="2025-12-12T20:05:35.343791695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-n0ssy.gb1.brightbox.com,Uid:a55aa0ee93e7794122aa73d201bae4d2,Namespace:kube-system,Attempt:0,} returns sandbox id \"3e00f2a0d083f4d112f4f8b72ffa906f181da3f839c29f7235966c272dc0db19\""
Dec 12 20:05:35.349734 containerd[1555]: time="2025-12-12T20:05:35.349632694Z" level=info msg="CreateContainer within sandbox \"3e00f2a0d083f4d112f4f8b72ffa906f181da3f839c29f7235966c272dc0db19\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Dec 12 20:05:35.354326 containerd[1555]: time="2025-12-12T20:05:35.353813715Z" level=info msg="Container 32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:05:35.358847 containerd[1555]: time="2025-12-12T20:05:35.358812840Z" level=info msg="CreateContainer within sandbox \"8a0ff5aba7725d28ebb9f329923a7066e78bc384d26d288e9beb216e0b7bc455\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6\""
Dec 12 20:05:35.361117 containerd[1555]: time="2025-12-12T20:05:35.361080408Z" level=info msg="StartContainer for \"857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6\""
Dec 12 20:05:35.362837 containerd[1555]: time="2025-12-12T20:05:35.362733807Z" level=info msg="connecting to shim 857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6" address="unix:///run/containerd/s/ebb56975c5082633447df7a8a6a4f2ceb3feafc022b93046aa46b776dd193349" protocol=ttrpc version=3
Dec 12 20:05:35.366099 containerd[1555]: time="2025-12-12T20:05:35.366066037Z" level=info msg="CreateContainer within sandbox \"1185249adb119431dd6b33db0b2396d8af4a5fe9eda32b2c954cc4c1c79dd92c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5\""
Dec 12 20:05:35.367662 containerd[1555]: time="2025-12-12T20:05:35.367356754Z" level=info msg="StartContainer for \"32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5\""
Dec 12 20:05:35.368911 containerd[1555]: time="2025-12-12T20:05:35.368841274Z" level=info msg="Container 570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:05:35.371179 containerd[1555]: time="2025-12-12T20:05:35.370630914Z" level=info msg="connecting to shim 32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5" address="unix:///run/containerd/s/0246a9e3edba2721e101a9e822a4115195f7a851e57b18e43a85ff5c33591f77" protocol=ttrpc version=3
Dec 12 20:05:35.378459 containerd[1555]: time="2025-12-12T20:05:35.378421297Z" level=info msg="CreateContainer within sandbox \"3e00f2a0d083f4d112f4f8b72ffa906f181da3f839c29f7235966c272dc0db19\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359\""
Dec 12 20:05:35.383333 containerd[1555]: time="2025-12-12T20:05:35.382360243Z" level=info msg="StartContainer for \"570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359\""
Dec 12 20:05:35.389645 containerd[1555]: time="2025-12-12T20:05:35.389599916Z" level=info msg="connecting to shim 570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359" address="unix:///run/containerd/s/a708b7f34d6c93c042b1de0af455ece5276027a6b2b283b90bccf08d8e334c05" protocol=ttrpc version=3
Dec 12 20:05:35.408748 systemd[1]: Started cri-containerd-857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6.scope - libcontainer container 857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6.
Dec 12 20:05:35.432805 systemd[1]: Started cri-containerd-32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5.scope - libcontainer container 32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5.
Dec 12 20:05:35.454707 systemd[1]: Started cri-containerd-570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359.scope - libcontainer container 570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359.
Dec 12 20:05:35.577018 containerd[1555]: time="2025-12-12T20:05:35.576766013Z" level=info msg="StartContainer for \"32a7ca0dbfc9a0e7a6428f6f0a6d4147c10bfa65ea880dd48dc4734145e15ee5\" returns successfully"
Dec 12 20:05:35.597727 containerd[1555]: time="2025-12-12T20:05:35.597653156Z" level=info msg="StartContainer for \"857f0567dad5e1036e013f9b566909fac690c4d10d96d6849e5c73a9eca3e7d6\" returns successfully"
Dec 12 20:05:35.606979 containerd[1555]: time="2025-12-12T20:05:35.606930834Z" level=info msg="StartContainer for \"570ad400abb4ef8ab4b9acde893e1e1d4b386fea5d54aa2f5c9d1e6121bcd359\" returns successfully"
Dec 12 20:05:35.623179 kubelet[2495]: E1212 20:05:35.623096 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.244.19.234:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 12 20:05:35.626881 kubelet[2495]: E1212 20:05:35.626820 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.244.19.234:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Dec 12 20:05:35.633742 kubelet[2495]: E1212 20:05:35.633677 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.244.19.234:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-n0ssy.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 12 20:05:35.764760 kubelet[2495]: E1212 20:05:35.764696 2495 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.244.19.234:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-n0ssy.gb1.brightbox.com?timeout=10s\": dial tcp 10.244.19.234:6443: connect: connection refused" interval="1.6s"
Dec 12 20:05:35.943727 kubelet[2495]: E1212 20:05:35.943565 2495 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.244.19.234:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.244.19.234:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Dec 12 20:05:35.972636 kubelet[2495]: I1212 20:05:35.972459 2495 kubelet_node_status.go:75] "Attempting to register node" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:35.973260 kubelet[2495]: E1212 20:05:35.973224 2495 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.244.19.234:6443/api/v1/nodes\": dial tcp 10.244.19.234:6443: connect: connection refused" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:36.456314 kubelet[2495]: E1212 20:05:36.455397 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:36.465501 kubelet[2495]: E1212 20:05:36.465455 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:36.470504 kubelet[2495]: E1212 20:05:36.470474 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:37.473334 kubelet[2495]: E1212 20:05:37.472188 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:37.475473 kubelet[2495]: E1212 20:05:37.475180 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:37.475675 kubelet[2495]: E1212 20:05:37.475652 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:37.577313 kubelet[2495]: I1212 20:05:37.577247 2495 kubelet_node_status.go:75] "Attempting to register node" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.363089 kubelet[2495]: E1212 20:05:38.363034 2495 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.475909 kubelet[2495]: E1212 20:05:38.475835 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.477867 kubelet[2495]: E1212 20:05:38.477603 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.477867 kubelet[2495]: E1212 20:05:38.477703 2495 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-n0ssy.gb1.brightbox.com\" not found" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.576322 kubelet[2495]: I1212 20:05:38.575438 2495 kubelet_node_status.go:78] "Successfully registered node" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.652009 kubelet[2495]: I1212 20:05:38.651524 2495 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.663118 kubelet[2495]: E1212 20:05:38.663045 2495 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.663118 kubelet[2495]: I1212 20:05:38.663117 2495 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.665880 kubelet[2495]: E1212 20:05:38.665652 2495 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.665880 kubelet[2495]: I1212 20:05:38.665684 2495 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:38.668081 kubelet[2495]: E1212 20:05:38.668046 2495 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-n0ssy.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:39.312894 kubelet[2495]: I1212 20:05:39.312812 2495 apiserver.go:52] "Watching apiserver"
Dec 12 20:05:39.350917 kubelet[2495]: I1212 20:05:39.350842 2495 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 12 20:05:39.476014 kubelet[2495]: I1212 20:05:39.475959 2495 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:39.487072 kubelet[2495]: I1212 20:05:39.486504 2495 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Dec 12 20:05:40.684186 systemd[1]: Reload requested from client PID 2778 ('systemctl') (unit session-11.scope)...
Dec 12 20:05:40.684736 systemd[1]: Reloading...
Dec 12 20:05:40.823372 zram_generator::config[2823]: No configuration found.
Dec 12 20:05:41.202209 systemd[1]: Reloading finished in 516 ms.
Dec 12 20:05:41.239654 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:05:41.261122 systemd[1]: kubelet.service: Deactivated successfully.
Dec 12 20:05:41.261810 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:05:41.262041 systemd[1]: kubelet.service: Consumed 1.165s CPU time, 129.2M memory peak.
Dec 12 20:05:41.265666 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 12 20:05:41.549635 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 12 20:05:41.561977 (kubelet)[2887]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 12 20:05:41.657172 kubelet[2887]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 20:05:41.657172 kubelet[2887]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 12 20:05:41.657172 kubelet[2887]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 12 20:05:41.657827 kubelet[2887]: I1212 20:05:41.657194 2887 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 12 20:05:41.674349 kubelet[2887]: I1212 20:05:41.674008 2887 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Dec 12 20:05:41.674349 kubelet[2887]: I1212 20:05:41.674046 2887 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 12 20:05:41.674932 kubelet[2887]: I1212 20:05:41.674881 2887 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 12 20:05:41.677326 kubelet[2887]: I1212 20:05:41.677301 2887 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Dec 12 20:05:41.697005 kubelet[2887]: I1212 20:05:41.696322 2887 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 12 20:05:41.713050 kubelet[2887]: I1212 20:05:41.712987 2887 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 12 20:05:41.723888 kubelet[2887]: I1212 20:05:41.723852 2887 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Dec 12 20:05:41.724432 kubelet[2887]: I1212 20:05:41.724373 2887 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 12 20:05:41.724792 kubelet[2887]: I1212 20:05:41.724539 2887 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-n0ssy.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 12 20:05:41.725314 kubelet[2887]: I1212 20:05:41.724988 2887 topology_manager.go:138] "Creating topology manager with none policy"
Dec 12 20:05:41.725314 kubelet[2887]: I1212 20:05:41.725012 2887 container_manager_linux.go:303] "Creating device plugin manager"
Dec 12 20:05:41.725314 kubelet[2887]: I1212 20:05:41.725097 2887 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 20:05:41.725599 kubelet[2887]: I1212 20:05:41.725568 2887 kubelet.go:480] "Attempting to sync node with API server"
Dec 12 20:05:41.726599 kubelet[2887]: I1212 20:05:41.726490 2887 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 12 20:05:41.730058 kubelet[2887]: I1212 20:05:41.730035 2887 kubelet.go:386] "Adding apiserver pod source"
Dec 12 20:05:41.730466 kubelet[2887]: I1212 20:05:41.730445 2887 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 12 20:05:41.756523 kubelet[2887]: I1212 20:05:41.756484 2887 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 12 20:05:41.758079 kubelet[2887]: I1212 20:05:41.758052 2887 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 12 20:05:41.776405 kubelet[2887]: I1212 20:05:41.776357 2887 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Dec 12 20:05:41.776657 kubelet[2887]: I1212 20:05:41.776637 2887 server.go:1289] "Started kubelet"
Dec 12 20:05:41.784304 kubelet[2887]: I1212 20:05:41.784087 2887 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 12 20:05:41.790647 kubelet[2887]: I1212 20:05:41.781909 2887 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 12 20:05:41.793148 kubelet[2887]: I1212 20:05:41.776816 2887 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 12 20:05:41.793538 kubelet[2887]: I1212 20:05:41.793513 2887 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 12 20:05:41.794918 kubelet[2887]: I1212 20:05:41.781943 2887 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 12 20:05:41.797058 kubelet[2887]: E1212 20:05:41.797032 2887 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 12 20:05:41.798333 kubelet[2887]: I1212 20:05:41.798311 2887 volume_manager.go:297] "Starting Kubelet Volume Manager"
Dec 12 20:05:41.798628 kubelet[2887]: I1212 20:05:41.798528 2887 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Dec 12 20:05:41.800529 kubelet[2887]: I1212 20:05:41.799093 2887 reconciler.go:26] "Reconciler: start to sync state"
Dec 12 20:05:41.801880 kubelet[2887]: I1212 20:05:41.800978 2887 factory.go:223] Registration of the systemd container factory successfully
Dec 12 20:05:41.802119 kubelet[2887]: I1212 20:05:41.802089 2887 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 12 20:05:41.804513 kubelet[2887]: I1212 20:05:41.798172 2887 server.go:317] "Adding debug handlers to kubelet server"
Dec 12 20:05:41.810319 kubelet[2887]: I1212 20:05:41.810030 2887 factory.go:223] Registration of the containerd container factory successfully
Dec 12 20:05:41.860473 kubelet[2887]: I1212 20:05:41.860417 2887 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Dec 12 20:05:41.870694 kubelet[2887]: I1212 20:05:41.870522 2887 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Dec 12 20:05:41.870694 kubelet[2887]: I1212 20:05:41.870561 2887 status_manager.go:230] "Starting to sync pod status with apiserver"
Dec 12 20:05:41.870694 kubelet[2887]: I1212 20:05:41.870615 2887 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 12 20:05:41.870694 kubelet[2887]: I1212 20:05:41.870629 2887 kubelet.go:2436] "Starting kubelet main sync loop"
Dec 12 20:05:41.871042 kubelet[2887]: E1212 20:05:41.870740 2887 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.918944 2887 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.918972 2887 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.918999 2887 state_mem.go:36] "Initialized new in-memory state store"
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.919190 2887 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.919209 2887 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.919233 2887 policy_none.go:49] "None policy: Start"
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.919254 2887 memory_manager.go:186] "Starting memorymanager" policy="None"
Dec 12 20:05:41.919327 kubelet[2887]: I1212 20:05:41.919277 2887 state_mem.go:35] "Initializing new in-memory state store"
Dec 12 20:05:41.919970 kubelet[2887]: I1212 20:05:41.919947 2887 state_mem.go:75] "Updated machine memory state"
Dec 12 20:05:41.927643 kubelet[2887]: E1212 20:05:41.927615 2887 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 12 20:05:41.928614 kubelet[2887]: I1212 20:05:41.928590 2887 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 12 20:05:41.930188 kubelet[2887]: I1212 20:05:41.929753 2887 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 12 20:05:41.932751 kubelet[2887]: I1212 20:05:41.931118 2887 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 12 20:05:41.942702 kubelet[2887]: E1212 20:05:41.942649 2887 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 12 20:05:41.973152 kubelet[2887]: I1212 20:05:41.971967 2887 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:41.974010 kubelet[2887]: I1212 20:05:41.973972 2887 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:41.976755 kubelet[2887]: I1212 20:05:41.976616 2887 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:41.985896 kubelet[2887]: I1212 20:05:41.985853 2887 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Dec 12 20:05:41.989757 kubelet[2887]: I1212 20:05:41.989377 2887 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Dec 12 20:05:41.990309 kubelet[2887]: I1212 20:05:41.989903 2887 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Dec 12 20:05:41.990427 kubelet[2887]: E1212 20:05:41.990276 2887 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.004975 kubelet[2887]: I1212 20:05:42.004932 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-k8s-certs\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005590 kubelet[2887]: I1212 20:05:42.005158 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-kubeconfig\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005590 kubelet[2887]: I1212 20:05:42.005204 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005590 kubelet[2887]: I1212 20:05:42.005237 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1cfeaa130ff4e4914b0c0ea993735145-kubeconfig\") pod \"kube-scheduler-srv-n0ssy.gb1.brightbox.com\" (UID: \"1cfeaa130ff4e4914b0c0ea993735145\") " pod="kube-system/kube-scheduler-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005590 kubelet[2887]: I1212 20:05:42.005272 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9ce8762f63fe40ba8d96a6dc22b10b48-k8s-certs\") pod \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" (UID: \"9ce8762f63fe40ba8d96a6dc22b10b48\") " pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005590 kubelet[2887]: I1212 20:05:42.005341 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-ca-certs\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005858 kubelet[2887]: I1212 20:05:42.005388 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9ce8762f63fe40ba8d96a6dc22b10b48-ca-certs\") pod \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" (UID: \"9ce8762f63fe40ba8d96a6dc22b10b48\") " pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005858 kubelet[2887]: I1212 20:05:42.005448 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9ce8762f63fe40ba8d96a6dc22b10b48-usr-share-ca-certificates\") pod \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" (UID: \"9ce8762f63fe40ba8d96a6dc22b10b48\") " pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.005858 kubelet[2887]: I1212 20:05:42.005486 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a55aa0ee93e7794122aa73d201bae4d2-flexvolume-dir\") pod \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" (UID: \"a55aa0ee93e7794122aa73d201bae4d2\") " pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.056092 kubelet[2887]: I1212 20:05:42.055976 2887 kubelet_node_status.go:75] "Attempting to register node" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.071233 kubelet[2887]: I1212 20:05:42.071119 2887 kubelet_node_status.go:124] "Node was previously registered" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.073740 kubelet[2887]: I1212 20:05:42.072203 2887 kubelet_node_status.go:78] "Successfully registered node" node="srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.734091 kubelet[2887]: I1212 20:05:42.733023 2887 apiserver.go:52] "Watching apiserver"
Dec 12 20:05:42.800974 kubelet[2887]: I1212 20:05:42.800902 2887 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Dec 12 20:05:42.915603 kubelet[2887]: I1212 20:05:42.915558 2887 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.916097 kubelet[2887]: I1212 20:05:42.916067 2887 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.935136 kubelet[2887]: I1212 20:05:42.935067 2887 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Dec 12 20:05:42.935403 kubelet[2887]: E1212 20:05:42.935165 2887 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-n0ssy.gb1.brightbox.com\" already exists" pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.937300 kubelet[2887]: I1212 20:05:42.936362 2887 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Dec 12 20:05:42.937300 kubelet[2887]: E1212 20:05:42.936432 2887 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-n0ssy.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com"
Dec 12 20:05:42.998190 kubelet[2887]: I1212 20:05:42.997880 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-n0ssy.gb1.brightbox.com" podStartSLOduration=3.997835249 podStartE2EDuration="3.997835249s" podCreationTimestamp="2025-12-12 20:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 20:05:42.970074584 +0000 UTC m=+1.398974041" watchObservedRunningTime="2025-12-12 20:05:42.997835249 +0000 UTC m=+1.426734729"
Dec 12 20:05:43.019874 kubelet[2887]: I1212 20:05:43.019739 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-n0ssy.gb1.brightbox.com" podStartSLOduration=2.019717358 podStartE2EDuration="2.019717358s" podCreationTimestamp="2025-12-12 20:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 20:05:42.9981886 +0000 UTC m=+1.427088057" watchObservedRunningTime="2025-12-12 20:05:43.019717358 +0000 UTC m=+1.448616810"
Dec 12 20:05:43.038176 kubelet[2887]: I1212 20:05:43.038092 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-n0ssy.gb1.brightbox.com" podStartSLOduration=2.038062222 podStartE2EDuration="2.038062222s" podCreationTimestamp="2025-12-12 20:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 20:05:43.020144856 +0000 UTC m=+1.449044328" watchObservedRunningTime="2025-12-12 20:05:43.038062222 +0000 UTC m=+1.466961674"
Dec 12 20:05:47.562268 kubelet[2887]: I1212 20:05:47.562209 2887 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Dec 12 20:05:47.563189 containerd[1555]: time="2025-12-12T20:05:47.563119123Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Dec 12 20:05:47.564532 kubelet[2887]: I1212 20:05:47.563357 2887 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Dec 12 20:05:48.529279 systemd[1]: Created slice kubepods-besteffort-pod812c8498_e6f3_456a_ae28_33b9edea0f43.slice - libcontainer container kubepods-besteffort-pod812c8498_e6f3_456a_ae28_33b9edea0f43.slice.
Dec 12 20:05:48.547864 kubelet[2887]: I1212 20:05:48.547811 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/812c8498-e6f3-456a-ae28-33b9edea0f43-xtables-lock\") pod \"kube-proxy-ddwwn\" (UID: \"812c8498-e6f3-456a-ae28-33b9edea0f43\") " pod="kube-system/kube-proxy-ddwwn"
Dec 12 20:05:48.548057 kubelet[2887]: I1212 20:05:48.547878 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/812c8498-e6f3-456a-ae28-33b9edea0f43-kube-proxy\") pod \"kube-proxy-ddwwn\" (UID: \"812c8498-e6f3-456a-ae28-33b9edea0f43\") " pod="kube-system/kube-proxy-ddwwn"
Dec 12 20:05:48.548057 kubelet[2887]: I1212 20:05:48.547917 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/812c8498-e6f3-456a-ae28-33b9edea0f43-lib-modules\") pod \"kube-proxy-ddwwn\" (UID: \"812c8498-e6f3-456a-ae28-33b9edea0f43\") " pod="kube-system/kube-proxy-ddwwn"
Dec 12 20:05:48.548057 kubelet[2887]: I1212 20:05:48.547943 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89c5\" (UniqueName: \"kubernetes.io/projected/812c8498-e6f3-456a-ae28-33b9edea0f43-kube-api-access-w89c5\") pod \"kube-proxy-ddwwn\" (UID: \"812c8498-e6f3-456a-ae28-33b9edea0f43\") " pod="kube-system/kube-proxy-ddwwn"
Dec 12 20:05:48.745549 systemd[1]: Created slice kubepods-besteffort-pod54ae5826_8e53_4e0e_8ea2_f0374174fe08.slice - libcontainer container kubepods-besteffort-pod54ae5826_8e53_4e0e_8ea2_f0374174fe08.slice.
Dec 12 20:05:48.748855 kubelet[2887]: I1212 20:05:48.748817 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/54ae5826-8e53-4e0e-8ea2-f0374174fe08-var-lib-calico\") pod \"tigera-operator-7dcd859c48-j2tpx\" (UID: \"54ae5826-8e53-4e0e-8ea2-f0374174fe08\") " pod="tigera-operator/tigera-operator-7dcd859c48-j2tpx"
Dec 12 20:05:48.750533 kubelet[2887]: I1212 20:05:48.748885 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrmx\" (UniqueName: \"kubernetes.io/projected/54ae5826-8e53-4e0e-8ea2-f0374174fe08-kube-api-access-9zrmx\") pod \"tigera-operator-7dcd859c48-j2tpx\" (UID: \"54ae5826-8e53-4e0e-8ea2-f0374174fe08\") " pod="tigera-operator/tigera-operator-7dcd859c48-j2tpx"
Dec 12 20:05:48.843085 containerd[1555]: time="2025-12-12T20:05:48.843014365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ddwwn,Uid:812c8498-e6f3-456a-ae28-33b9edea0f43,Namespace:kube-system,Attempt:0,}"
Dec 12 20:05:48.879245 containerd[1555]: time="2025-12-12T20:05:48.878911207Z" level=info msg="connecting to shim b764702637c9f871e9b7d9b0e37d152f22be4147bbb3d122a8ce9e7a1457dcaa" address="unix:///run/containerd/s/91e8258d9c9b8b005ba10dfc62d1927f152c8f72aafbdfe540653389fc9e26f2" namespace=k8s.io protocol=ttrpc version=3
Dec 12 20:05:48.922708 systemd[1]: Started cri-containerd-b764702637c9f871e9b7d9b0e37d152f22be4147bbb3d122a8ce9e7a1457dcaa.scope - libcontainer container b764702637c9f871e9b7d9b0e37d152f22be4147bbb3d122a8ce9e7a1457dcaa.
Dec 12 20:05:48.972773 containerd[1555]: time="2025-12-12T20:05:48.972724036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ddwwn,Uid:812c8498-e6f3-456a-ae28-33b9edea0f43,Namespace:kube-system,Attempt:0,} returns sandbox id \"b764702637c9f871e9b7d9b0e37d152f22be4147bbb3d122a8ce9e7a1457dcaa\""
Dec 12 20:05:48.980253 containerd[1555]: time="2025-12-12T20:05:48.979594388Z" level=info msg="CreateContainer within sandbox \"b764702637c9f871e9b7d9b0e37d152f22be4147bbb3d122a8ce9e7a1457dcaa\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Dec 12 20:05:49.010643 containerd[1555]: time="2025-12-12T20:05:49.010592834Z" level=info msg="Container fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:05:49.021828 containerd[1555]: time="2025-12-12T20:05:49.021716338Z" level=info msg="CreateContainer within sandbox \"b764702637c9f871e9b7d9b0e37d152f22be4147bbb3d122a8ce9e7a1457dcaa\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253\""
Dec 12 20:05:49.022700 containerd[1555]: time="2025-12-12T20:05:49.022667076Z" level=info msg="StartContainer for \"fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253\""
Dec 12 20:05:49.025128 containerd[1555]: time="2025-12-12T20:05:49.025092356Z" level=info msg="connecting to shim fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253" address="unix:///run/containerd/s/91e8258d9c9b8b005ba10dfc62d1927f152c8f72aafbdfe540653389fc9e26f2" protocol=ttrpc version=3
Dec 12 20:05:49.053361 containerd[1555]: time="2025-12-12T20:05:49.053057122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-j2tpx,Uid:54ae5826-8e53-4e0e-8ea2-f0374174fe08,Namespace:tigera-operator,Attempt:0,}"
Dec 12 20:05:49.053713 systemd[1]: Started cri-containerd-fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253.scope - libcontainer container fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253.
Dec 12 20:05:49.086341 containerd[1555]: time="2025-12-12T20:05:49.086252935Z" level=info msg="connecting to shim 30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6" address="unix:///run/containerd/s/1ed19c8502e400c62d725003f3d440c67edf0c1a02304064deb644f343d33e43" namespace=k8s.io protocol=ttrpc version=3
Dec 12 20:05:49.139546 systemd[1]: Started cri-containerd-30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6.scope - libcontainer container 30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6.
Dec 12 20:05:49.201951 containerd[1555]: time="2025-12-12T20:05:49.201443742Z" level=info msg="StartContainer for \"fe8a40da676866c123a96408ed1f954fd5443781f03ff378fd6c083cc6903253\" returns successfully"
Dec 12 20:05:49.265092 containerd[1555]: time="2025-12-12T20:05:49.264959418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-j2tpx,Uid:54ae5826-8e53-4e0e-8ea2-f0374174fe08,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6\""
Dec 12 20:05:49.268845 containerd[1555]: time="2025-12-12T20:05:49.268812659Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\""
Dec 12 20:05:49.690638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount570743186.mount: Deactivated successfully.
Dec 12 20:05:52.223494 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount740075959.mount: Deactivated successfully.
Dec 12 20:05:52.773524 kubelet[2887]: I1212 20:05:52.772763 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-ddwwn" podStartSLOduration=4.77273061 podStartE2EDuration="4.77273061s" podCreationTimestamp="2025-12-12 20:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 20:05:49.951498238 +0000 UTC m=+8.380397713" watchObservedRunningTime="2025-12-12 20:05:52.77273061 +0000 UTC m=+11.201630056"
Dec 12 20:05:53.290735 containerd[1555]: time="2025-12-12T20:05:53.290636632Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:53.292542 containerd[1555]: time="2025-12-12T20:05:53.292401635Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691"
Dec 12 20:05:53.293425 containerd[1555]: time="2025-12-12T20:05:53.293385222Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:53.296555 containerd[1555]: time="2025-12-12T20:05:53.296520017Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:05:53.298022 containerd[1555]: time="2025-12-12T20:05:53.297888238Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 4.029021244s"
Dec 12 20:05:53.298022 containerd[1555]: time="2025-12-12T20:05:53.297930520Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\""
Dec 12 20:05:53.307437 containerd[1555]: time="2025-12-12T20:05:53.305258678Z" level=info msg="CreateContainer within sandbox \"30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Dec 12 20:05:53.324359 containerd[1555]: time="2025-12-12T20:05:53.323989118Z" level=info msg="Container 72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:05:53.329536 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1436530812.mount: Deactivated successfully.
Dec 12 20:05:53.337722 containerd[1555]: time="2025-12-12T20:05:53.337594391Z" level=info msg="CreateContainer within sandbox \"30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de\""
Dec 12 20:05:53.338843 containerd[1555]: time="2025-12-12T20:05:53.338412695Z" level=info msg="StartContainer for \"72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de\""
Dec 12 20:05:53.340619 containerd[1555]: time="2025-12-12T20:05:53.340586719Z" level=info msg="connecting to shim 72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de" address="unix:///run/containerd/s/1ed19c8502e400c62d725003f3d440c67edf0c1a02304064deb644f343d33e43" protocol=ttrpc version=3
Dec 12 20:05:53.375538 systemd[1]: Started cri-containerd-72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de.scope - libcontainer container 72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de.
Dec 12 20:05:53.450163 containerd[1555]: time="2025-12-12T20:05:53.449874011Z" level=info msg="StartContainer for \"72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de\" returns successfully"
Dec 12 20:05:57.136309 systemd[1]: cri-containerd-72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de.scope: Deactivated successfully.
Dec 12 20:05:57.228340 containerd[1555]: time="2025-12-12T20:05:57.227984479Z" level=info msg="received container exit event container_id:\"72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de\" id:\"72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de\" pid:3219 exit_status:1 exited_at:{seconds:1765569957 nanos:145737483}"
Dec 12 20:05:57.312433 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de-rootfs.mount: Deactivated successfully.
Dec 12 20:05:57.999701 kubelet[2887]: I1212 20:05:57.999380 2887 scope.go:117] "RemoveContainer" containerID="72e524d83392b5399a55158727f7db0e69c1ed3ad59a9ec767ca2a19472b18de"
Dec 12 20:05:58.009456 containerd[1555]: time="2025-12-12T20:05:58.008547489Z" level=info msg="CreateContainer within sandbox \"30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 12 20:05:58.026943 containerd[1555]: time="2025-12-12T20:05:58.026887079Z" level=info msg="Container 81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:05:58.035339 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2061522812.mount: Deactivated successfully.
Dec 12 20:05:58.043072 containerd[1555]: time="2025-12-12T20:05:58.042959365Z" level=info msg="CreateContainer within sandbox \"30fcaebd77e717afb28d06261f21f100be561a662e773a6862be4a4ccea7bbc6\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43\""
Dec 12 20:05:58.044688 containerd[1555]: time="2025-12-12T20:05:58.044646839Z" level=info msg="StartContainer for \"81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43\""
Dec 12 20:05:58.047000 containerd[1555]: time="2025-12-12T20:05:58.046961672Z" level=info msg="connecting to shim 81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43" address="unix:///run/containerd/s/1ed19c8502e400c62d725003f3d440c67edf0c1a02304064deb644f343d33e43" protocol=ttrpc version=3
Dec 12 20:05:58.082744 systemd[1]: Started cri-containerd-81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43.scope - libcontainer container 81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43.
Dec 12 20:05:58.175816 containerd[1555]: time="2025-12-12T20:05:58.175752390Z" level=info msg="StartContainer for \"81cdd337928e68d072b084adb2050810ca318a5de52eb42e76d508cd8ab90f43\" returns successfully"
Dec 12 20:05:59.009012 kubelet[2887]: I1212 20:05:59.008621 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-j2tpx" podStartSLOduration=6.976175524 podStartE2EDuration="11.00859822s" podCreationTimestamp="2025-12-12 20:05:48 +0000 UTC" firstStartedPulling="2025-12-12 20:05:49.267206409 +0000 UTC m=+7.696105853" lastFinishedPulling="2025-12-12 20:05:53.299629111 +0000 UTC m=+11.728528549" observedRunningTime="2025-12-12 20:05:53.973925472 +0000 UTC m=+12.402824947" watchObservedRunningTime="2025-12-12 20:05:59.00859822 +0000 UTC m=+17.437497672"
Dec 12 20:06:01.154315 sudo[1884]: pam_unix(sudo:session): session closed for user root
Dec 12 20:06:01.301851 sshd[1883]: Connection closed by 147.75.109.163 port 43090
Dec 12 20:06:01.304988 sshd-session[1880]: pam_unix(sshd:session): session closed for user core
Dec 12 20:06:01.314101 systemd[1]: sshd@8-10.244.19.234:22-147.75.109.163:43090.service: Deactivated successfully.
Dec 12 20:06:01.320764 systemd[1]: session-11.scope: Deactivated successfully.
Dec 12 20:06:01.321824 systemd[1]: session-11.scope: Consumed 9.130s CPU time, 154.6M memory peak.
Dec 12 20:06:01.327437 systemd-logind[1528]: Session 11 logged out. Waiting for processes to exit.
Dec 12 20:06:01.331225 systemd-logind[1528]: Removed session 11.
Dec 12 20:06:11.609234 kubelet[2887]: I1212 20:06:11.607140 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6149ee57-78bc-465d-836b-29a1539259ca-tigera-ca-bundle\") pod \"calico-typha-65cf6f8f57-q79gd\" (UID: \"6149ee57-78bc-465d-836b-29a1539259ca\") " pod="calico-system/calico-typha-65cf6f8f57-q79gd"
Dec 12 20:06:11.609234 kubelet[2887]: I1212 20:06:11.607400 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwlm\" (UniqueName: \"kubernetes.io/projected/6149ee57-78bc-465d-836b-29a1539259ca-kube-api-access-nbwlm\") pod \"calico-typha-65cf6f8f57-q79gd\" (UID: \"6149ee57-78bc-465d-836b-29a1539259ca\") " pod="calico-system/calico-typha-65cf6f8f57-q79gd"
Dec 12 20:06:11.609234 kubelet[2887]: I1212 20:06:11.607443 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6149ee57-78bc-465d-836b-29a1539259ca-typha-certs\") pod \"calico-typha-65cf6f8f57-q79gd\" (UID: \"6149ee57-78bc-465d-836b-29a1539259ca\") " pod="calico-system/calico-typha-65cf6f8f57-q79gd"
Dec 12 20:06:11.608958 systemd[1]: Created slice kubepods-besteffort-pod6149ee57_78bc_465d_836b_29a1539259ca.slice - libcontainer container kubepods-besteffort-pod6149ee57_78bc_465d_836b_29a1539259ca.slice.
Dec 12 20:06:11.832218 systemd[1]: Created slice kubepods-besteffort-pod78c07b64_d1b7_4a7d_a27a_135ea92b7705.slice - libcontainer container kubepods-besteffort-pod78c07b64_d1b7_4a7d_a27a_135ea92b7705.slice.
Dec 12 20:06:11.916659 containerd[1555]: time="2025-12-12T20:06:11.915420054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65cf6f8f57-q79gd,Uid:6149ee57-78bc-465d-836b-29a1539259ca,Namespace:calico-system,Attempt:0,}"
Dec 12 20:06:11.966235 containerd[1555]: time="2025-12-12T20:06:11.965892162Z" level=info msg="connecting to shim 4dfbc4f0871248e58877b78e84330ff8b49cb34348efe98470535fd1ec280755" address="unix:///run/containerd/s/872052cf9a120ee7b83e8e7a781249ffbb63c41e8b57803e57331b167c1a10aa" namespace=k8s.io protocol=ttrpc version=3
Dec 12 20:06:12.007170 kubelet[2887]: E1212 20:06:12.003542 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:12.017486 kubelet[2887]: I1212 20:06:12.017429 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-var-run-calico\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017686 kubelet[2887]: I1212 20:06:12.017505 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-cni-net-dir\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017686 kubelet[2887]: I1212 20:06:12.017547 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-var-lib-calico\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017686 kubelet[2887]: I1212 20:06:12.017591 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/78c07b64-d1b7-4a7d-a27a-135ea92b7705-node-certs\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017686 kubelet[2887]: I1212 20:06:12.017628 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2a05762a-3b72-4df9-aa17-753debf16cab-registration-dir\") pod \"csi-node-driver-fwp5m\" (UID: \"2a05762a-3b72-4df9-aa17-753debf16cab\") " pod="calico-system/csi-node-driver-fwp5m"
Dec 12 20:06:12.017686 kubelet[2887]: I1212 20:06:12.017660 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-cni-bin-dir\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017941 kubelet[2887]: I1212 20:06:12.017693 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-cni-log-dir\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017941 kubelet[2887]: I1212 20:06:12.017778 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-flexvol-driver-host\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017941 kubelet[2887]: I1212 20:06:12.017816 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-lib-modules\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017941 kubelet[2887]: I1212 20:06:12.017887 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-policysync\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.017941 kubelet[2887]: I1212 20:06:12.017921 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78c07b64-d1b7-4a7d-a27a-135ea92b7705-tigera-ca-bundle\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.018161 kubelet[2887]: I1212 20:06:12.017955 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/78c07b64-d1b7-4a7d-a27a-135ea92b7705-xtables-lock\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.018161 kubelet[2887]: I1212 20:06:12.017992 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98d4\" (UniqueName: \"kubernetes.io/projected/78c07b64-d1b7-4a7d-a27a-135ea92b7705-kube-api-access-p98d4\") pod \"calico-node-24v5p\" (UID: \"78c07b64-d1b7-4a7d-a27a-135ea92b7705\") " pod="calico-system/calico-node-24v5p"
Dec 12 20:06:12.018161 kubelet[2887]: I1212 20:06:12.018028 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a05762a-3b72-4df9-aa17-753debf16cab-kubelet-dir\") pod \"csi-node-driver-fwp5m\" (UID: \"2a05762a-3b72-4df9-aa17-753debf16cab\") " pod="calico-system/csi-node-driver-fwp5m"
Dec 12 20:06:12.018161 kubelet[2887]: I1212 20:06:12.018063 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2a05762a-3b72-4df9-aa17-753debf16cab-socket-dir\") pod \"csi-node-driver-fwp5m\" (UID: \"2a05762a-3b72-4df9-aa17-753debf16cab\") " pod="calico-system/csi-node-driver-fwp5m"
Dec 12 20:06:12.058664 systemd[1]: Started cri-containerd-4dfbc4f0871248e58877b78e84330ff8b49cb34348efe98470535fd1ec280755.scope - libcontainer container 4dfbc4f0871248e58877b78e84330ff8b49cb34348efe98470535fd1ec280755.
Dec 12 20:06:12.119074 kubelet[2887]: I1212 20:06:12.118997 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2a05762a-3b72-4df9-aa17-753debf16cab-varrun\") pod \"csi-node-driver-fwp5m\" (UID: \"2a05762a-3b72-4df9-aa17-753debf16cab\") " pod="calico-system/csi-node-driver-fwp5m"
Dec 12 20:06:12.120807 kubelet[2887]: I1212 20:06:12.120748 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzr2l\" (UniqueName: \"kubernetes.io/projected/2a05762a-3b72-4df9-aa17-753debf16cab-kube-api-access-hzr2l\") pod \"csi-node-driver-fwp5m\" (UID: \"2a05762a-3b72-4df9-aa17-753debf16cab\") " pod="calico-system/csi-node-driver-fwp5m"
Dec 12 20:06:12.132096 kubelet[2887]: E1212 20:06:12.132048 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.132096 kubelet[2887]: W1212 20:06:12.132098 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.132307 kubelet[2887]: E1212 20:06:12.132161 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.154701 kubelet[2887]: E1212 20:06:12.154645 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.154701 kubelet[2887]: W1212 20:06:12.154679 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.154701 kubelet[2887]: E1212 20:06:12.154706 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.222662 kubelet[2887]: E1212 20:06:12.221678 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.222662 kubelet[2887]: W1212 20:06:12.221708 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.222662 kubelet[2887]: E1212 20:06:12.221739 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.222662 kubelet[2887]: E1212 20:06:12.222019 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.222662 kubelet[2887]: W1212 20:06:12.222041 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.222662 kubelet[2887]: E1212 20:06:12.222055 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.222662 kubelet[2887]: E1212 20:06:12.222405 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.222662 kubelet[2887]: W1212 20:06:12.222420 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.222662 kubelet[2887]: E1212 20:06:12.222438 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.224440 kubelet[2887]: E1212 20:06:12.222740 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.224440 kubelet[2887]: W1212 20:06:12.222754 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.224440 kubelet[2887]: E1212 20:06:12.222769 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.224440 kubelet[2887]: E1212 20:06:12.223058 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.224440 kubelet[2887]: W1212 20:06:12.223071 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.224440 kubelet[2887]: E1212 20:06:12.223097 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.224440 kubelet[2887]: E1212 20:06:12.223367 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.224440 kubelet[2887]: W1212 20:06:12.223385 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.224440 kubelet[2887]: E1212 20:06:12.223400 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.225237 kubelet[2887]: E1212 20:06:12.225211 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.225237 kubelet[2887]: W1212 20:06:12.225232 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.225393 kubelet[2887]: E1212 20:06:12.225249 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.225626 kubelet[2887]: E1212 20:06:12.225539 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.225626 kubelet[2887]: W1212 20:06:12.225559 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.225626 kubelet[2887]: E1212 20:06:12.225575 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.226836 kubelet[2887]: E1212 20:06:12.226728 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.226836 kubelet[2887]: W1212 20:06:12.226753 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.226836 kubelet[2887]: E1212 20:06:12.226771 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.228435 kubelet[2887]: E1212 20:06:12.228407 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.228435 kubelet[2887]: W1212 20:06:12.228430 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.228574 kubelet[2887]: E1212 20:06:12.228447 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.239373 kubelet[2887]: E1212 20:06:12.238187 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:12.239373 kubelet[2887]: W1212 20:06:12.239070 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:12.239373 kubelet[2887]: E1212 20:06:12.239102 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:12.291896 containerd[1555]: time="2025-12-12T20:06:12.291720927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65cf6f8f57-q79gd,Uid:6149ee57-78bc-465d-836b-29a1539259ca,Namespace:calico-system,Attempt:0,} returns sandbox id \"4dfbc4f0871248e58877b78e84330ff8b49cb34348efe98470535fd1ec280755\""
Dec 12 20:06:12.298581 containerd[1555]: time="2025-12-12T20:06:12.298432879Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 12 20:06:12.438921 containerd[1555]: time="2025-12-12T20:06:12.438773210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-24v5p,Uid:78c07b64-d1b7-4a7d-a27a-135ea92b7705,Namespace:calico-system,Attempt:0,}"
Dec 12 20:06:12.476565 containerd[1555]: time="2025-12-12T20:06:12.475987004Z" level=info msg="connecting to shim 8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b" address="unix:///run/containerd/s/8e2622351cdf82d97e50f1707c53e4e08973ad4bb250d140c4dcbf8b363ee805" namespace=k8s.io protocol=ttrpc version=3
Dec 12 20:06:12.517576 systemd[1]: Started cri-containerd-8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b.scope - libcontainer container 8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b.
Dec 12 20:06:12.585696 containerd[1555]: time="2025-12-12T20:06:12.585495377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-24v5p,Uid:78c07b64-d1b7-4a7d-a27a-135ea92b7705,Namespace:calico-system,Attempt:0,} returns sandbox id \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\""
Dec 12 20:06:13.876043 kubelet[2887]: E1212 20:06:13.875461 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:14.012864 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount731252209.mount: Deactivated successfully.
Dec 12 20:06:15.465868 containerd[1555]: time="2025-12-12T20:06:15.465751085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:15.467596 containerd[1555]: time="2025-12-12T20:06:15.467554076Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628"
Dec 12 20:06:15.469132 containerd[1555]: time="2025-12-12T20:06:15.469095546Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:15.472301 containerd[1555]: time="2025-12-12T20:06:15.472249870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:15.473309 containerd[1555]: time="2025-12-12T20:06:15.473255457Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 3.174123115s"
Dec 12 20:06:15.473457 containerd[1555]: time="2025-12-12T20:06:15.473430644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\""
Dec 12 20:06:15.485085 containerd[1555]: time="2025-12-12T20:06:15.484994134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\""
Dec 12 20:06:15.542976 containerd[1555]: time="2025-12-12T20:06:15.542918135Z" level=info msg="CreateContainer within sandbox \"4dfbc4f0871248e58877b78e84330ff8b49cb34348efe98470535fd1ec280755\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Dec 12 20:06:15.555307 containerd[1555]: time="2025-12-12T20:06:15.554498637Z" level=info msg="Container 16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:06:15.566095 containerd[1555]: time="2025-12-12T20:06:15.565957801Z" level=info msg="CreateContainer within sandbox \"4dfbc4f0871248e58877b78e84330ff8b49cb34348efe98470535fd1ec280755\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10\""
Dec 12 20:06:15.571676 containerd[1555]: time="2025-12-12T20:06:15.571461454Z" level=info msg="StartContainer for \"16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10\""
Dec 12 20:06:15.573235 containerd[1555]: time="2025-12-12T20:06:15.573187725Z" level=info msg="connecting to shim 16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10" address="unix:///run/containerd/s/872052cf9a120ee7b83e8e7a781249ffbb63c41e8b57803e57331b167c1a10aa" protocol=ttrpc version=3
Dec 12 20:06:15.612584 systemd[1]: Started cri-containerd-16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10.scope - libcontainer container 16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10.
Dec 12 20:06:15.710848 containerd[1555]: time="2025-12-12T20:06:15.710767324Z" level=info msg="StartContainer for \"16a74c848e0c2d28cb5ad25655bb74aca82432c157c78e5c7bcbf4d418176b10\" returns successfully"
Dec 12 20:06:15.875009 kubelet[2887]: E1212 20:06:15.874843 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:16.150451 kubelet[2887]: I1212 20:06:16.148067 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-65cf6f8f57-q79gd" podStartSLOduration=1.958864964 podStartE2EDuration="5.146911938s" podCreationTimestamp="2025-12-12 20:06:11 +0000 UTC" firstStartedPulling="2025-12-12 20:06:12.296732245 +0000 UTC m=+30.725631681" lastFinishedPulling="2025-12-12 20:06:15.484779212 +0000 UTC m=+33.913678655" observedRunningTime="2025-12-12 20:06:16.143027534 +0000 UTC m=+34.571926990" watchObservedRunningTime="2025-12-12 20:06:16.146911938 +0000 UTC m=+34.575811385"
Dec 12 20:06:16.158555 kubelet[2887]: E1212 20:06:16.158348 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:16.158555 kubelet[2887]: W1212 20:06:16.158382 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:16.159920 kubelet[2887]: E1212 20:06:16.159774 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:16.254694 kubelet[2887]: E1212 20:06:16.254579 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:16.254694 kubelet[2887]: W1212 20:06:16.254684 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:16.254951 kubelet[2887]: E1212 20:06:16.254716 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:17.176573 kubelet[2887]: E1212 20:06:17.176317 2887 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 12 20:06:17.176573 kubelet[2887]: W1212 20:06:17.176360 2887 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 12 20:06:17.176573 kubelet[2887]: E1212 20:06:17.176390 2887 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 12 20:06:17.435937 containerd[1555]: time="2025-12-12T20:06:17.435789317Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:17.439897 containerd[1555]: time="2025-12-12T20:06:17.438922138Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754"
Dec 12 20:06:17.439897 containerd[1555]: time="2025-12-12T20:06:17.439012668Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:17.441518 containerd[1555]: time="2025-12-12T20:06:17.441478322Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:17.442657 containerd[1555]: time="2025-12-12T20:06:17.442620203Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.957371282s"
Dec 12 20:06:17.442804 containerd[1555]: time="2025-12-12T20:06:17.442774320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\""
Dec 12 20:06:17.450088 containerd[1555]: time="2025-12-12T20:06:17.450052893Z" level=info msg="CreateContainer within sandbox \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Dec 12 20:06:17.462792 containerd[1555]: time="2025-12-12T20:06:17.462727217Z" level=info msg="Container 0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:06:17.469964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1833536936.mount: Deactivated successfully.
Dec 12 20:06:17.483668 containerd[1555]: time="2025-12-12T20:06:17.483540938Z" level=info msg="CreateContainer within sandbox \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe\""
Dec 12 20:06:17.484570 containerd[1555]: time="2025-12-12T20:06:17.484515610Z" level=info msg="StartContainer for \"0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe\""
Dec 12 20:06:17.488900 containerd[1555]: time="2025-12-12T20:06:17.488853166Z" level=info msg="connecting to shim 0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe" address="unix:///run/containerd/s/8e2622351cdf82d97e50f1707c53e4e08973ad4bb250d140c4dcbf8b363ee805" protocol=ttrpc version=3
Dec 12 20:06:17.528648 systemd[1]: Started cri-containerd-0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe.scope - libcontainer container 0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe.
Dec 12 20:06:17.638864 containerd[1555]: time="2025-12-12T20:06:17.638748740Z" level=info msg="StartContainer for \"0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe\" returns successfully"
Dec 12 20:06:17.663442 systemd[1]: cri-containerd-0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe.scope: Deactivated successfully.
Dec 12 20:06:17.666875 containerd[1555]: time="2025-12-12T20:06:17.666815996Z" level=info msg="received container exit event container_id:\"0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe\" id:\"0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe\" pid:3595 exited_at:{seconds:1765569977 nanos:666194750}"
Dec 12 20:06:17.708776 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0012516da02b74c7f6439b3d7405490515234b764f14d1f1b635d292d15ff7fe-rootfs.mount: Deactivated successfully.
Dec 12 20:06:17.872261 kubelet[2887]: E1212 20:06:17.872163 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:18.117744 containerd[1555]: time="2025-12-12T20:06:18.117685990Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\""
Dec 12 20:06:19.874381 kubelet[2887]: E1212 20:06:19.871656 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:21.872835 kubelet[2887]: E1212 20:06:21.871953 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:23.100496 containerd[1555]: time="2025-12-12T20:06:23.100405053Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:23.101762 containerd[1555]: time="2025-12-12T20:06:23.101727373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859"
Dec 12 20:06:23.103348 containerd[1555]: time="2025-12-12T20:06:23.102501629Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:23.106396 containerd[1555]: time="2025-12-12T20:06:23.106361101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 12 20:06:23.107342 containerd[1555]: time="2025-12-12T20:06:23.107302992Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.989537388s"
Dec 12 20:06:23.107444 containerd[1555]: time="2025-12-12T20:06:23.107345155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\""
Dec 12 20:06:23.112579 containerd[1555]: time="2025-12-12T20:06:23.112523378Z" level=info msg="CreateContainer within sandbox \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Dec 12 20:06:23.166556 containerd[1555]: time="2025-12-12T20:06:23.166482967Z" level=info msg="Container 5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6: CDI devices from CRI Config.CDIDevices: []"
Dec 12 20:06:23.177369 containerd[1555]: time="2025-12-12T20:06:23.177317594Z" level=info msg="CreateContainer within sandbox \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6\""
Dec 12 20:06:23.178104 containerd[1555]: time="2025-12-12T20:06:23.178065594Z" level=info msg="StartContainer for \"5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6\""
Dec 12 20:06:23.181064 containerd[1555]: time="2025-12-12T20:06:23.181019268Z" level=info msg="connecting to shim 5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6" address="unix:///run/containerd/s/8e2622351cdf82d97e50f1707c53e4e08973ad4bb250d140c4dcbf8b363ee805" protocol=ttrpc version=3
Dec 12 20:06:23.217519 systemd[1]: Started cri-containerd-5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6.scope - libcontainer container 5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6.
Dec 12 20:06:23.344400 containerd[1555]: time="2025-12-12T20:06:23.344346458Z" level=info msg="StartContainer for \"5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6\" returns successfully"
Dec 12 20:06:23.873033 kubelet[2887]: E1212 20:06:23.871243 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:06:24.349440 systemd[1]: cri-containerd-5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6.scope: Deactivated successfully.
Dec 12 20:06:24.350376 systemd[1]: cri-containerd-5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6.scope: Consumed 760ms CPU time, 160.5M memory peak, 5.8M read from disk, 171.3M written to disk.
Dec 12 20:06:24.398803 containerd[1555]: time="2025-12-12T20:06:24.398740912Z" level=info msg="received container exit event container_id:\"5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6\" id:\"5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6\" pid:3655 exited_at:{seconds:1765569984 nanos:398356503}"
Dec 12 20:06:24.461079 kubelet[2887]: I1212 20:06:24.460837 2887 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Dec 12 20:06:24.527771 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5adf488f9515cdbf07a7ec899a53d8bbcec0afe204a8071101fc403d5c5785e6-rootfs.mount: Deactivated successfully.
Dec 12 20:06:24.600497 systemd[1]: Created slice kubepods-burstable-podadefcce3_e2dc_4e91_aff3_a5f983fee5e6.slice - libcontainer container kubepods-burstable-podadefcce3_e2dc_4e91_aff3_a5f983fee5e6.slice. Dec 12 20:06:24.622316 kubelet[2887]: I1212 20:06:24.621494 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0319e4c1-786f-47b5-99ee-05577b0ae0cb-tigera-ca-bundle\") pod \"calico-kube-controllers-59fc8f5456-7jq54\" (UID: \"0319e4c1-786f-47b5-99ee-05577b0ae0cb\") " pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" Dec 12 20:06:24.622316 kubelet[2887]: I1212 20:06:24.621554 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/87f20b04-9afa-4c4b-849e-0456cd78dc55-calico-apiserver-certs\") pod \"calico-apiserver-558bb796c-w9rs4\" (UID: \"87f20b04-9afa-4c4b-849e-0456cd78dc55\") " pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" Dec 12 20:06:24.622316 kubelet[2887]: I1212 20:06:24.621601 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8ps\" (UniqueName: \"kubernetes.io/projected/0319e4c1-786f-47b5-99ee-05577b0ae0cb-kube-api-access-vw8ps\") pod \"calico-kube-controllers-59fc8f5456-7jq54\" (UID: \"0319e4c1-786f-47b5-99ee-05577b0ae0cb\") " pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" Dec 12 20:06:24.622316 kubelet[2887]: I1212 20:06:24.621637 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcf4ce02-dd77-415d-b9b4-03b7689df33d-config-volume\") pod \"coredns-674b8bbfcf-s5kk6\" (UID: \"bcf4ce02-dd77-415d-b9b4-03b7689df33d\") " pod="kube-system/coredns-674b8bbfcf-s5kk6" Dec 12 20:06:24.622316 kubelet[2887]: I1212 20:06:24.621667 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzx7m\" (UniqueName: \"kubernetes.io/projected/adefcce3-e2dc-4e91-aff3-a5f983fee5e6-kube-api-access-xzx7m\") pod \"coredns-674b8bbfcf-gfqph\" (UID: \"adefcce3-e2dc-4e91-aff3-a5f983fee5e6\") " pod="kube-system/coredns-674b8bbfcf-gfqph" Dec 12 20:06:24.622760 kubelet[2887]: I1212 20:06:24.621697 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctvmp\" (UniqueName: \"kubernetes.io/projected/87f20b04-9afa-4c4b-849e-0456cd78dc55-kube-api-access-ctvmp\") pod \"calico-apiserver-558bb796c-w9rs4\" (UID: \"87f20b04-9afa-4c4b-849e-0456cd78dc55\") " pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" Dec 12 20:06:24.622760 kubelet[2887]: I1212 20:06:24.621728 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpx95\" (UniqueName: \"kubernetes.io/projected/dcaf5416-6748-4e19-9a64-70511b93ac27-kube-api-access-cpx95\") pod \"calico-apiserver-7f7c49f9bf-2vbl6\" (UID: \"dcaf5416-6748-4e19-9a64-70511b93ac27\") " pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" Dec 12 20:06:24.622760 kubelet[2887]: I1212 20:06:24.621756 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7qkn\" (UniqueName: \"kubernetes.io/projected/62353628-af81-4f55-9563-ec2bb7f43849-kube-api-access-t7qkn\") pod \"calico-apiserver-558bb796c-666sk\" (UID: 
\"62353628-af81-4f55-9563-ec2bb7f43849\") " pod="calico-apiserver/calico-apiserver-558bb796c-666sk" Dec 12 20:06:24.622760 kubelet[2887]: I1212 20:06:24.621787 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/62353628-af81-4f55-9563-ec2bb7f43849-calico-apiserver-certs\") pod \"calico-apiserver-558bb796c-666sk\" (UID: \"62353628-af81-4f55-9563-ec2bb7f43849\") " pod="calico-apiserver/calico-apiserver-558bb796c-666sk" Dec 12 20:06:24.622760 kubelet[2887]: I1212 20:06:24.621826 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dcaf5416-6748-4e19-9a64-70511b93ac27-calico-apiserver-certs\") pod \"calico-apiserver-7f7c49f9bf-2vbl6\" (UID: \"dcaf5416-6748-4e19-9a64-70511b93ac27\") " pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" Dec 12 20:06:24.622972 kubelet[2887]: I1212 20:06:24.621856 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lnc7\" (UniqueName: \"kubernetes.io/projected/bcf4ce02-dd77-415d-b9b4-03b7689df33d-kube-api-access-2lnc7\") pod \"coredns-674b8bbfcf-s5kk6\" (UID: \"bcf4ce02-dd77-415d-b9b4-03b7689df33d\") " pod="kube-system/coredns-674b8bbfcf-s5kk6" Dec 12 20:06:24.622972 kubelet[2887]: I1212 20:06:24.621884 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/adefcce3-e2dc-4e91-aff3-a5f983fee5e6-config-volume\") pod \"coredns-674b8bbfcf-gfqph\" (UID: \"adefcce3-e2dc-4e91-aff3-a5f983fee5e6\") " pod="kube-system/coredns-674b8bbfcf-gfqph" Dec 12 20:06:24.633019 systemd[1]: Created slice kubepods-besteffort-pod87f20b04_9afa_4c4b_849e_0456cd78dc55.slice - libcontainer container kubepods-besteffort-pod87f20b04_9afa_4c4b_849e_0456cd78dc55.slice. Dec 12 20:06:24.645211 systemd[1]: Created slice kubepods-besteffort-pod0319e4c1_786f_47b5_99ee_05577b0ae0cb.slice - libcontainer container kubepods-besteffort-pod0319e4c1_786f_47b5_99ee_05577b0ae0cb.slice. Dec 12 20:06:24.654601 systemd[1]: Created slice kubepods-burstable-podbcf4ce02_dd77_415d_b9b4_03b7689df33d.slice - libcontainer container kubepods-burstable-podbcf4ce02_dd77_415d_b9b4_03b7689df33d.slice. Dec 12 20:06:24.668401 systemd[1]: Created slice kubepods-besteffort-pod08d2d7cd_4d89_4123_813f_09d5679d795c.slice - libcontainer container kubepods-besteffort-pod08d2d7cd_4d89_4123_813f_09d5679d795c.slice. Dec 12 20:06:24.684839 systemd[1]: Created slice kubepods-besteffort-poddcaf5416_6748_4e19_9a64_70511b93ac27.slice - libcontainer container kubepods-besteffort-poddcaf5416_6748_4e19_9a64_70511b93ac27.slice. Dec 12 20:06:24.696232 systemd[1]: Created slice kubepods-besteffort-podb08daecd_0b61_450c_a526_5e7b591cde3e.slice - libcontainer container kubepods-besteffort-podb08daecd_0b61_450c_a526_5e7b591cde3e.slice. Dec 12 20:06:24.705527 systemd[1]: Created slice kubepods-besteffort-pod62353628_af81_4f55_9563_ec2bb7f43849.slice - libcontainer container kubepods-besteffort-pod62353628_af81_4f55_9563_ec2bb7f43849.slice. 
Dec 12 20:06:24.723218 kubelet[2887]: I1212 20:06:24.722461 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gfql\" (UniqueName: \"kubernetes.io/projected/08d2d7cd-4d89-4123-813f-09d5679d795c-kube-api-access-2gfql\") pod \"whisker-7cd9d85f44-9fpdv\" (UID: \"08d2d7cd-4d89-4123-813f-09d5679d795c\") " pod="calico-system/whisker-7cd9d85f44-9fpdv" Dec 12 20:06:24.723218 kubelet[2887]: I1212 20:06:24.722568 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b08daecd-0b61-450c-a526-5e7b591cde3e-goldmane-ca-bundle\") pod \"goldmane-666569f655-22cbp\" (UID: \"b08daecd-0b61-450c-a526-5e7b591cde3e\") " pod="calico-system/goldmane-666569f655-22cbp" Dec 12 20:06:24.723218 kubelet[2887]: I1212 20:06:24.722659 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkfwk\" (UniqueName: \"kubernetes.io/projected/b08daecd-0b61-450c-a526-5e7b591cde3e-kube-api-access-wkfwk\") pod \"goldmane-666569f655-22cbp\" (UID: \"b08daecd-0b61-450c-a526-5e7b591cde3e\") " pod="calico-system/goldmane-666569f655-22cbp" Dec 12 20:06:24.723218 kubelet[2887]: I1212 20:06:24.722693 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-backend-key-pair\") pod \"whisker-7cd9d85f44-9fpdv\" (UID: \"08d2d7cd-4d89-4123-813f-09d5679d795c\") " pod="calico-system/whisker-7cd9d85f44-9fpdv" Dec 12 20:06:24.723218 kubelet[2887]: I1212 20:06:24.722736 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-ca-bundle\") pod \"whisker-7cd9d85f44-9fpdv\" (UID: \"08d2d7cd-4d89-4123-813f-09d5679d795c\") " pod="calico-system/whisker-7cd9d85f44-9fpdv" Dec 12 20:06:24.725921 kubelet[2887]: I1212 20:06:24.722790 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b08daecd-0b61-450c-a526-5e7b591cde3e-config\") pod \"goldmane-666569f655-22cbp\" (UID: \"b08daecd-0b61-450c-a526-5e7b591cde3e\") " pod="calico-system/goldmane-666569f655-22cbp" Dec 12 20:06:24.725921 kubelet[2887]: I1212 20:06:24.722856 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b08daecd-0b61-450c-a526-5e7b591cde3e-goldmane-key-pair\") pod \"goldmane-666569f655-22cbp\" (UID: \"b08daecd-0b61-450c-a526-5e7b591cde3e\") " pod="calico-system/goldmane-666569f655-22cbp" Dec 12 20:06:24.915401 containerd[1555]: time="2025-12-12T20:06:24.914146086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gfqph,Uid:adefcce3-e2dc-4e91-aff3-a5f983fee5e6,Namespace:kube-system,Attempt:0,}" Dec 12 20:06:24.940437 containerd[1555]: time="2025-12-12T20:06:24.940380385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-w9rs4,Uid:87f20b04-9afa-4c4b-849e-0456cd78dc55,Namespace:calico-apiserver,Attempt:0,}" Dec 12 20:06:24.969337 containerd[1555]: time="2025-12-12T20:06:24.967560230Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-s5kk6,Uid:bcf4ce02-dd77-415d-b9b4-03b7689df33d,Namespace:kube-system,Attempt:0,}" Dec 12 20:06:24.970487 containerd[1555]: time="2025-12-12T20:06:24.970435306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59fc8f5456-7jq54,Uid:0319e4c1-786f-47b5-99ee-05577b0ae0cb,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:24.979892 containerd[1555]: time="2025-12-12T20:06:24.979844596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cd9d85f44-9fpdv,Uid:08d2d7cd-4d89-4123-813f-09d5679d795c,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:25.010887 containerd[1555]: time="2025-12-12T20:06:25.010687671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-22cbp,Uid:b08daecd-0b61-450c-a526-5e7b591cde3e,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:25.026986 containerd[1555]: time="2025-12-12T20:06:25.026931180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-666sk,Uid:62353628-af81-4f55-9563-ec2bb7f43849,Namespace:calico-apiserver,Attempt:0,}" Dec 12 20:06:25.037835 containerd[1555]: time="2025-12-12T20:06:25.037481166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f7c49f9bf-2vbl6,Uid:dcaf5416-6748-4e19-9a64-70511b93ac27,Namespace:calico-apiserver,Attempt:0,}" Dec 12 20:06:25.193569 containerd[1555]: time="2025-12-12T20:06:25.193326286Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 12 20:06:25.454041 containerd[1555]: time="2025-12-12T20:06:25.453874249Z" level=error msg="Failed to destroy network for sandbox \"c4471c77641d6cc6cdcd9581f4138ed4fea757551a8a05a634618cb671e7814d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.485109 containerd[1555]: time="2025-12-12T20:06:25.466619271Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cd9d85f44-9fpdv,Uid:08d2d7cd-4d89-4123-813f-09d5679d795c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4471c77641d6cc6cdcd9581f4138ed4fea757551a8a05a634618cb671e7814d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.485109 containerd[1555]: time="2025-12-12T20:06:25.470566703Z" level=error msg="Failed to destroy network for sandbox \"06570ad6e9803bdcf00802bbb780cc60a8c1b17d322dce339732e2111882d950\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.491113 containerd[1555]: time="2025-12-12T20:06:25.490977209Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59fc8f5456-7jq54,Uid:0319e4c1-786f-47b5-99ee-05577b0ae0cb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"06570ad6e9803bdcf00802bbb780cc60a8c1b17d322dce339732e2111882d950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.510312 containerd[1555]: time="2025-12-12T20:06:25.510134467Z" 
level=error msg="Failed to destroy network for sandbox \"33dbd416326cab2cfde6b086b49b790af103cedf784daf0ee34f524b07fb2654\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.512960 kubelet[2887]: E1212 20:06:25.512757 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06570ad6e9803bdcf00802bbb780cc60a8c1b17d322dce339732e2111882d950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.512960 kubelet[2887]: E1212 20:06:25.512758 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4471c77641d6cc6cdcd9581f4138ed4fea757551a8a05a634618cb671e7814d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.512960 kubelet[2887]: E1212 20:06:25.512896 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4471c77641d6cc6cdcd9581f4138ed4fea757551a8a05a634618cb671e7814d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7cd9d85f44-9fpdv" Dec 12 20:06:25.513741 kubelet[2887]: E1212 20:06:25.513072 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4471c77641d6cc6cdcd9581f4138ed4fea757551a8a05a634618cb671e7814d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7cd9d85f44-9fpdv" Dec 12 20:06:25.513741 kubelet[2887]: E1212 20:06:25.513204 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7cd9d85f44-9fpdv_calico-system(08d2d7cd-4d89-4123-813f-09d5679d795c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7cd9d85f44-9fpdv_calico-system(08d2d7cd-4d89-4123-813f-09d5679d795c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4471c77641d6cc6cdcd9581f4138ed4fea757551a8a05a634618cb671e7814d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7cd9d85f44-9fpdv" podUID="08d2d7cd-4d89-4123-813f-09d5679d795c" Dec 12 20:06:25.516972 kubelet[2887]: E1212 20:06:25.512896 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06570ad6e9803bdcf00802bbb780cc60a8c1b17d322dce339732e2111882d950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" Dec 12 20:06:25.516972 kubelet[2887]: E1212 20:06:25.514333 
2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06570ad6e9803bdcf00802bbb780cc60a8c1b17d322dce339732e2111882d950\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" Dec 12 20:06:25.516972 kubelet[2887]: E1212 20:06:25.514435 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59fc8f5456-7jq54_calico-system(0319e4c1-786f-47b5-99ee-05577b0ae0cb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59fc8f5456-7jq54_calico-system(0319e4c1-786f-47b5-99ee-05577b0ae0cb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"06570ad6e9803bdcf00802bbb780cc60a8c1b17d322dce339732e2111882d950\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:06:25.517280 containerd[1555]: time="2025-12-12T20:06:25.515172481Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-w9rs4,Uid:87f20b04-9afa-4c4b-849e-0456cd78dc55,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"33dbd416326cab2cfde6b086b49b790af103cedf784daf0ee34f524b07fb2654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.529488 kubelet[2887]: E1212 20:06:25.515535 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33dbd416326cab2cfde6b086b49b790af103cedf784daf0ee34f524b07fb2654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.529488 kubelet[2887]: E1212 20:06:25.515579 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33dbd416326cab2cfde6b086b49b790af103cedf784daf0ee34f524b07fb2654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" Dec 12 20:06:25.529488 kubelet[2887]: E1212 20:06:25.515618 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33dbd416326cab2cfde6b086b49b790af103cedf784daf0ee34f524b07fb2654\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" Dec 12 20:06:25.536179 kubelet[2887]: E1212 20:06:25.515667 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-558bb796c-w9rs4_calico-apiserver(87f20b04-9afa-4c4b-849e-0456cd78dc55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-558bb796c-w9rs4_calico-apiserver(87f20b04-9afa-4c4b-849e-0456cd78dc55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"33dbd416326cab2cfde6b086b49b790af103cedf784daf0ee34f524b07fb2654\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:06:25.545540 containerd[1555]: time="2025-12-12T20:06:25.545373585Z" level=error msg="Failed to destroy network for sandbox \"855b31c828146f09512b6b3facc29108e9dd57c20495de6b1ab625d2ef8ce87c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.574941 systemd[1]: run-netns-cni\x2d1245219a\x2d4938\x2db56b\x2d7455\x2d21bebf0cc02e.mount: Deactivated successfully. Dec 12 20:06:25.599530 containerd[1555]: time="2025-12-12T20:06:25.599473508Z" level=error msg="Failed to destroy network for sandbox \"6ce6a1268933f16c9777b3a65dc9c931a45e84609d827b2f1d3c5404d07c7e64\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.600768 containerd[1555]: time="2025-12-12T20:06:25.600630150Z" level=error msg="Failed to destroy network for sandbox \"04e56e2ba56eeb24125b435d8d037690cf417b1e3b331f1d241c22e4da36519d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.602485 containerd[1555]: time="2025-12-12T20:06:25.599775390Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-22cbp,Uid:b08daecd-0b61-450c-a526-5e7b591cde3e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"855b31c828146f09512b6b3facc29108e9dd57c20495de6b1ab625d2ef8ce87c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.603113 kubelet[2887]: E1212 20:06:25.603056 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"855b31c828146f09512b6b3facc29108e9dd57c20495de6b1ab625d2ef8ce87c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.604338 kubelet[2887]: E1212 20:06:25.603408 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"855b31c828146f09512b6b3facc29108e9dd57c20495de6b1ab625d2ef8ce87c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-22cbp" Dec 12 20:06:25.605235 systemd[1]: 
run-netns-cni\x2d4db957d3\x2def7f\x2de245\x2dfc2a\x2dbd2623baf02a.mount: Deactivated successfully. Dec 12 20:06:25.605452 systemd[1]: run-netns-cni\x2d2c6a5a44\x2d94bb\x2d38d2\x2de4af\x2dc39b45ff0c96.mount: Deactivated successfully. Dec 12 20:06:25.610251 kubelet[2887]: E1212 20:06:25.606017 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"855b31c828146f09512b6b3facc29108e9dd57c20495de6b1ab625d2ef8ce87c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-22cbp" Dec 12 20:06:25.610251 kubelet[2887]: E1212 20:06:25.606135 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-22cbp_calico-system(b08daecd-0b61-450c-a526-5e7b591cde3e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-22cbp_calico-system(b08daecd-0b61-450c-a526-5e7b591cde3e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"855b31c828146f09512b6b3facc29108e9dd57c20495de6b1ab625d2ef8ce87c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:06:25.613066 containerd[1555]: time="2025-12-12T20:06:25.613004590Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s5kk6,Uid:bcf4ce02-dd77-415d-b9b4-03b7689df33d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce6a1268933f16c9777b3a65dc9c931a45e84609d827b2f1d3c5404d07c7e64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.613728 kubelet[2887]: E1212 20:06:25.613680 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce6a1268933f16c9777b3a65dc9c931a45e84609d827b2f1d3c5404d07c7e64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.614063 kubelet[2887]: E1212 20:06:25.613806 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce6a1268933f16c9777b3a65dc9c931a45e84609d827b2f1d3c5404d07c7e64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-s5kk6" Dec 12 20:06:25.614138 kubelet[2887]: E1212 20:06:25.613969 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ce6a1268933f16c9777b3a65dc9c931a45e84609d827b2f1d3c5404d07c7e64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-s5kk6" Dec 12 
20:06:25.614608 kubelet[2887]: E1212 20:06:25.614376 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-s5kk6_kube-system(bcf4ce02-dd77-415d-b9b4-03b7689df33d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-s5kk6_kube-system(bcf4ce02-dd77-415d-b9b4-03b7689df33d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ce6a1268933f16c9777b3a65dc9c931a45e84609d827b2f1d3c5404d07c7e64\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-s5kk6" podUID="bcf4ce02-dd77-415d-b9b4-03b7689df33d" Dec 12 20:06:25.616377 containerd[1555]: time="2025-12-12T20:06:25.614562325Z" level=error msg="Failed to destroy network for sandbox \"b93a8b077ff960880d174b02fe4336e54448c8dd0c93ff7593e0e961367f1141\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.618024 containerd[1555]: time="2025-12-12T20:06:25.617704606Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-666sk,Uid:62353628-af81-4f55-9563-ec2bb7f43849,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"04e56e2ba56eeb24125b435d8d037690cf417b1e3b331f1d241c22e4da36519d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.620326 kubelet[2887]: E1212 20:06:25.618991 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04e56e2ba56eeb24125b435d8d037690cf417b1e3b331f1d241c22e4da36519d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.620712 kubelet[2887]: E1212 20:06:25.620498 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04e56e2ba56eeb24125b435d8d037690cf417b1e3b331f1d241c22e4da36519d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" Dec 12 20:06:25.620712 kubelet[2887]: E1212 20:06:25.620539 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04e56e2ba56eeb24125b435d8d037690cf417b1e3b331f1d241c22e4da36519d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" Dec 12 20:06:25.620712 kubelet[2887]: E1212 20:06:25.620643 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-558bb796c-666sk_calico-apiserver(62353628-af81-4f55-9563-ec2bb7f43849)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-558bb796c-666sk_calico-apiserver(62353628-af81-4f55-9563-ec2bb7f43849)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"04e56e2ba56eeb24125b435d8d037690cf417b1e3b331f1d241c22e4da36519d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:06:25.620546 systemd[1]: run-netns-cni\x2dd6f59442\x2d3160\x2d0df8\x2ddc19\x2dc8420e06efa3.mount: Deactivated successfully. Dec 12 20:06:25.624104 containerd[1555]: time="2025-12-12T20:06:25.623993791Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gfqph,Uid:adefcce3-e2dc-4e91-aff3-a5f983fee5e6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b93a8b077ff960880d174b02fe4336e54448c8dd0c93ff7593e0e961367f1141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.625426 kubelet[2887]: E1212 20:06:25.625067 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b93a8b077ff960880d174b02fe4336e54448c8dd0c93ff7593e0e961367f1141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.625426 kubelet[2887]: E1212 20:06:25.625232 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b93a8b077ff960880d174b02fe4336e54448c8dd0c93ff7593e0e961367f1141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gfqph" Dec 12 20:06:25.625426 kubelet[2887]: E1212 20:06:25.625267 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b93a8b077ff960880d174b02fe4336e54448c8dd0c93ff7593e0e961367f1141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gfqph" Dec 12 20:06:25.626541 kubelet[2887]: E1212 20:06:25.626498 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-gfqph_kube-system(adefcce3-e2dc-4e91-aff3-a5f983fee5e6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-gfqph_kube-system(adefcce3-e2dc-4e91-aff3-a5f983fee5e6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b93a8b077ff960880d174b02fe4336e54448c8dd0c93ff7593e0e961367f1141\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-gfqph" podUID="adefcce3-e2dc-4e91-aff3-a5f983fee5e6" Dec 12 20:06:25.631177 containerd[1555]: time="2025-12-12T20:06:25.630870460Z" level=error 
msg="Failed to destroy network for sandbox \"f744538cd7d2addc16d8a27d8ed0f93e1173b7736350cd92ef959be01b6a43ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.635213 containerd[1555]: time="2025-12-12T20:06:25.635155724Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f7c49f9bf-2vbl6,Uid:dcaf5416-6748-4e19-9a64-70511b93ac27,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744538cd7d2addc16d8a27d8ed0f93e1173b7736350cd92ef959be01b6a43ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.635801 kubelet[2887]: E1212 20:06:25.635578 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744538cd7d2addc16d8a27d8ed0f93e1173b7736350cd92ef959be01b6a43ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:25.636212 kubelet[2887]: E1212 20:06:25.636020 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744538cd7d2addc16d8a27d8ed0f93e1173b7736350cd92ef959be01b6a43ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" Dec 12 20:06:25.636212 kubelet[2887]: E1212 20:06:25.636061 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f744538cd7d2addc16d8a27d8ed0f93e1173b7736350cd92ef959be01b6a43ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" Dec 12 20:06:25.636212 kubelet[2887]: E1212 20:06:25.636156 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f7c49f9bf-2vbl6_calico-apiserver(dcaf5416-6748-4e19-9a64-70511b93ac27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f7c49f9bf-2vbl6_calico-apiserver(dcaf5416-6748-4e19-9a64-70511b93ac27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f744538cd7d2addc16d8a27d8ed0f93e1173b7736350cd92ef959be01b6a43ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:06:25.881981 systemd[1]: Created slice kubepods-besteffort-pod2a05762a_3b72_4df9_aa17_753debf16cab.slice - libcontainer container kubepods-besteffort-pod2a05762a_3b72_4df9_aa17_753debf16cab.slice. 
Dec 12 20:06:25.886036 containerd[1555]: time="2025-12-12T20:06:25.885915705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fwp5m,Uid:2a05762a-3b72-4df9-aa17-753debf16cab,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:25.998312 containerd[1555]: time="2025-12-12T20:06:25.998211169Z" level=error msg="Failed to destroy network for sandbox \"67fe8a9e222de2e5475f0713f64acd91b69da7b1384849eda1752c913df20bab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:26.000994 containerd[1555]: time="2025-12-12T20:06:26.000752220Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fwp5m,Uid:2a05762a-3b72-4df9-aa17-753debf16cab,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"67fe8a9e222de2e5475f0713f64acd91b69da7b1384849eda1752c913df20bab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:26.004082 kubelet[2887]: E1212 20:06:26.001226 2887 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67fe8a9e222de2e5475f0713f64acd91b69da7b1384849eda1752c913df20bab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 12 20:06:26.004082 kubelet[2887]: E1212 20:06:26.001454 2887 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67fe8a9e222de2e5475f0713f64acd91b69da7b1384849eda1752c913df20bab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fwp5m" Dec 12 20:06:26.004082 kubelet[2887]: E1212 20:06:26.001902 2887 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67fe8a9e222de2e5475f0713f64acd91b69da7b1384849eda1752c913df20bab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fwp5m" Dec 12 20:06:26.004255 kubelet[2887]: E1212 20:06:26.002345 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67fe8a9e222de2e5475f0713f64acd91b69da7b1384849eda1752c913df20bab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab" Dec 12 20:06:26.520520 systemd[1]: run-netns-cni\x2d92f2be48\x2d57b0\x2dece1\x2d2fad\x2d61a1029d0080.mount: Deactivated successfully. 
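The run-netns mount units being cleaned up here (run-netns-cni\x2d92f2be48... and its earlier siblings) use systemd's unit-name escaping, in which a literal "-" inside a path component is written as \x2d so it cannot be confused with "/" separators, which systemd itself encodes as "-". A sketch that undoes just the \x2d escapes:

```go
package main

import (
	"fmt"
	"strings"
)

// unescape reverses the \x2d escaping seen in the mount unit names; a
// complete decoder would handle arbitrary \xNN escapes and map the
// remaining unescaped dashes back to "/" path separators.
func unescape(unit string) string {
	return strings.ReplaceAll(unit, `\x2d`, "-")
}

func main() {
	// The unit from the entry above; the decoded name corresponds to
	// the bind mount for network namespace /run/netns/cni-92f2be48-....
	fmt.Println(unescape(`run-netns-cni\x2d92f2be48\x2d57b0\x2dece1\x2d2fad\x2d61a1029d0080.mount`))
}
```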
Dec 12 20:06:35.291052 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3214541370.mount: Deactivated successfully. Dec 12 20:06:35.404184 containerd[1555]: time="2025-12-12T20:06:35.390199347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:06:35.404184 containerd[1555]: time="2025-12-12T20:06:35.404126940Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 12 20:06:35.419649 containerd[1555]: time="2025-12-12T20:06:35.418474868Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:06:35.421701 containerd[1555]: time="2025-12-12T20:06:35.420781575Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 12 20:06:35.424663 containerd[1555]: time="2025-12-12T20:06:35.424553675Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 10.223545477s" Dec 12 20:06:35.424924 containerd[1555]: time="2025-12-12T20:06:35.424894781Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 12 20:06:35.489661 containerd[1555]: time="2025-12-12T20:06:35.489583321Z" level=info msg="CreateContainer within sandbox \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 12 20:06:35.603357 containerd[1555]: time="2025-12-12T20:06:35.601217887Z" level=info msg="Container 4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb: CDI devices from CRI Config.CDIDevices: []" Dec 12 20:06:35.603863 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1047342832.mount: Deactivated successfully. Dec 12 20:06:35.650837 containerd[1555]: time="2025-12-12T20:06:35.650646376Z" level=info msg="CreateContainer within sandbox \"8ebcd613dd0402697dafa960abacd695f39b608871daddc98dabf99b1931d72b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb\"" Dec 12 20:06:35.652013 containerd[1555]: time="2025-12-12T20:06:35.651765165Z" level=info msg="StartContainer for \"4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb\"" Dec 12 20:06:35.660530 containerd[1555]: time="2025-12-12T20:06:35.660419490Z" level=info msg="connecting to shim 4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb" address="unix:///run/containerd/s/8e2622351cdf82d97e50f1707c53e4e08973ad4bb250d140c4dcbf8b363ee805" protocol=ttrpc version=3 Dec 12 20:06:35.854346 systemd[1]: Started cri-containerd-4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb.scope - libcontainer container 4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb. 
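Both bookends of the calico/node image pull are in the journal: the PullImage request logged at 20:06:25.193 and the "Pulled ... in 10.223545477s" completion at 20:06:35.424. A quick arithmetic check against those two log timestamps; containerd times the pull slightly inside the two log statements, hence the small gap:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the PullImage and Pulled entries above.
	start, _ := time.Parse(time.RFC3339Nano, "2025-12-12T20:06:25.193326286Z")
	end, _ := time.Parse(time.RFC3339Nano, "2025-12-12T20:06:35.424553675Z")
	// Prints 10.231227389s, about 8ms over the 10.223545477s that
	// containerd reports for the pull itself.
	fmt.Println(end.Sub(start))
}
```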
Dec 12 20:06:36.078482 containerd[1555]: time="2025-12-12T20:06:36.078353891Z" level=info msg="StartContainer for \"4e7c1e30e94d3f9b718dcda88798ba3c21dbb30c0be4259c8ba8f56084c9f9bb\" returns successfully" Dec 12 20:06:36.454916 kubelet[2887]: I1212 20:06:36.454201 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-24v5p" podStartSLOduration=2.616006938 podStartE2EDuration="25.454140513s" podCreationTimestamp="2025-12-12 20:06:11 +0000 UTC" firstStartedPulling="2025-12-12 20:06:12.588588194 +0000 UTC m=+31.017487639" lastFinishedPulling="2025-12-12 20:06:35.426721771 +0000 UTC m=+53.855621214" observedRunningTime="2025-12-12 20:06:36.453184638 +0000 UTC m=+54.882084093" watchObservedRunningTime="2025-12-12 20:06:36.454140513 +0000 UTC m=+54.883039969" Dec 12 20:06:36.488666 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 12 20:06:36.490151 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Dec 12 20:06:36.877331 containerd[1555]: time="2025-12-12T20:06:36.877190583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gfqph,Uid:adefcce3-e2dc-4e91-aff3-a5f983fee5e6,Namespace:kube-system,Attempt:0,}" Dec 12 20:06:36.879925 containerd[1555]: time="2025-12-12T20:06:36.877743592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fwp5m,Uid:2a05762a-3b72-4df9-aa17-753debf16cab,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:36.885201 kubelet[2887]: I1212 20:06:36.884725 2887 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-backend-key-pair\") pod \"08d2d7cd-4d89-4123-813f-09d5679d795c\" (UID: \"08d2d7cd-4d89-4123-813f-09d5679d795c\") " Dec 12 20:06:36.885201 kubelet[2887]: I1212 20:06:36.884821 2887 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-ca-bundle\") pod \"08d2d7cd-4d89-4123-813f-09d5679d795c\" (UID: \"08d2d7cd-4d89-4123-813f-09d5679d795c\") " Dec 12 20:06:36.890338 containerd[1555]: time="2025-12-12T20:06:36.887439998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59fc8f5456-7jq54,Uid:0319e4c1-786f-47b5-99ee-05577b0ae0cb,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:36.903877 kubelet[2887]: I1212 20:06:36.903135 2887 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gfql\" (UniqueName: \"kubernetes.io/projected/08d2d7cd-4d89-4123-813f-09d5679d795c-kube-api-access-2gfql\") pod \"08d2d7cd-4d89-4123-813f-09d5679d795c\" (UID: \"08d2d7cd-4d89-4123-813f-09d5679d795c\") " Dec 12 20:06:36.932252 systemd[1]: var-lib-kubelet-pods-08d2d7cd\x2d4d89\x2d4123\x2d813f\x2d09d5679d795c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 12 20:06:36.933909 kubelet[2887]: I1212 20:06:36.908643 2887 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "08d2d7cd-4d89-4123-813f-09d5679d795c" (UID: "08d2d7cd-4d89-4123-813f-09d5679d795c"). InnerVolumeSpecName "whisker-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 12 20:06:36.943359 kubelet[2887]: I1212 20:06:36.941455 2887 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "08d2d7cd-4d89-4123-813f-09d5679d795c" (UID: "08d2d7cd-4d89-4123-813f-09d5679d795c"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 12 20:06:36.950838 systemd[1]: var-lib-kubelet-pods-08d2d7cd\x2d4d89\x2d4123\x2d813f\x2d09d5679d795c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2gfql.mount: Deactivated successfully. Dec 12 20:06:36.954935 kubelet[2887]: I1212 20:06:36.951278 2887 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d2d7cd-4d89-4123-813f-09d5679d795c-kube-api-access-2gfql" (OuterVolumeSpecName: "kube-api-access-2gfql") pod "08d2d7cd-4d89-4123-813f-09d5679d795c" (UID: "08d2d7cd-4d89-4123-813f-09d5679d795c"). InnerVolumeSpecName "kube-api-access-2gfql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 12 20:06:37.005490 kubelet[2887]: I1212 20:06:37.005409 2887 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-backend-key-pair\") on node \"srv-n0ssy.gb1.brightbox.com\" DevicePath \"\"" Dec 12 20:06:37.005490 kubelet[2887]: I1212 20:06:37.005488 2887 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08d2d7cd-4d89-4123-813f-09d5679d795c-whisker-ca-bundle\") on node \"srv-n0ssy.gb1.brightbox.com\" DevicePath \"\"" Dec 12 20:06:37.005741 kubelet[2887]: I1212 20:06:37.005510 2887 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gfql\" (UniqueName: \"kubernetes.io/projected/08d2d7cd-4d89-4123-813f-09d5679d795c-kube-api-access-2gfql\") on node \"srv-n0ssy.gb1.brightbox.com\" DevicePath \"\"" Dec 12 20:06:37.309341 systemd[1]: Removed slice kubepods-besteffort-pod08d2d7cd_4d89_4123_813f_09d5679d795c.slice - libcontainer container kubepods-besteffort-pod08d2d7cd_4d89_4123_813f_09d5679d795c.slice. Dec 12 20:06:37.550535 systemd[1]: Created slice kubepods-besteffort-pod2fcf8e8b_a7d2_49df_8e20_83c1153edafc.slice - libcontainer container kubepods-besteffort-pod2fcf8e8b_a7d2_49df_8e20_83c1153edafc.slice. 
Dec 12 20:06:37.623208 kubelet[2887]: I1212 20:06:37.621476 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fcf8e8b-a7d2-49df-8e20-83c1153edafc-whisker-backend-key-pair\") pod \"whisker-779d6bdfc4-bv9cs\" (UID: \"2fcf8e8b-a7d2-49df-8e20-83c1153edafc\") " pod="calico-system/whisker-779d6bdfc4-bv9cs" Dec 12 20:06:37.623208 kubelet[2887]: I1212 20:06:37.621591 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fcf8e8b-a7d2-49df-8e20-83c1153edafc-whisker-ca-bundle\") pod \"whisker-779d6bdfc4-bv9cs\" (UID: \"2fcf8e8b-a7d2-49df-8e20-83c1153edafc\") " pod="calico-system/whisker-779d6bdfc4-bv9cs" Dec 12 20:06:37.623208 kubelet[2887]: I1212 20:06:37.621626 2887 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txd5s\" (UniqueName: \"kubernetes.io/projected/2fcf8e8b-a7d2-49df-8e20-83c1153edafc-kube-api-access-txd5s\") pod \"whisker-779d6bdfc4-bv9cs\" (UID: \"2fcf8e8b-a7d2-49df-8e20-83c1153edafc\") " pod="calico-system/whisker-779d6bdfc4-bv9cs" Dec 12 20:06:37.849453 systemd-networkd[1490]: cali28112953a5c: Link UP Dec 12 20:06:37.850688 systemd-networkd[1490]: cali28112953a5c: Gained carrier Dec 12 20:06:37.886207 containerd[1555]: 2025-12-12 20:06:37.121 [INFO][4034] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:37.886207 containerd[1555]: 2025-12-12 20:06:37.168 [INFO][4034] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0 csi-node-driver- calico-system 2a05762a-3b72-4df9-aa17-753debf16cab 788 0 2025-12-12 20:06:11 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com csi-node-driver-fwp5m eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali28112953a5c [] [] }} ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-" Dec 12 20:06:37.886207 containerd[1555]: 2025-12-12 20:06:37.168 [INFO][4034] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.886207 containerd[1555]: 2025-12-12 20:06:37.472 [INFO][4063] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" HandleID="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Workload="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.476 [INFO][4063] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" 
HandleID="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Workload="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032a470), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"csi-node-driver-fwp5m", "timestamp":"2025-12-12 20:06:37.472761543 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.476 [INFO][4063] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.476 [INFO][4063] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.478 [INFO][4063] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.518 [INFO][4063] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.564 [INFO][4063] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.610 [INFO][4063] ipam/ipam.go 543: Ran out of existing affine blocks for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.625 [INFO][4063] ipam/ipam.go 560: Tried all affine blocks. 
Looking for an affine block with space, or a new unclaimed block host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.636 [INFO][4063] ipam/ipam_block_reader_writer.go 158: Found free block: 192.168.78.0/26 Dec 12 20:06:37.888347 containerd[1555]: 2025-12-12 20:06:37.637 [INFO][4063] ipam/ipam.go 572: Found unclaimed block host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.637 [INFO][4063] ipam/ipam_block_reader_writer.go 175: Trying to create affinity in pending state host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.647 [INFO][4063] ipam/ipam_block_reader_writer.go 205: Successfully created pending affinity for block host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.647 [INFO][4063] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.651 [INFO][4063] ipam/ipam.go 163: The referenced block doesn't exist, trying to create it cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.660 [INFO][4063] ipam/ipam.go 170: Wrote affinity as pending cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.744 [INFO][4063] ipam/ipam.go 179: Attempting to claim the block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.744 [INFO][4063] ipam/ipam_block_reader_writer.go 226: Attempting to create a new block affinityType="host" host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.760 [INFO][4063] ipam/ipam_block_reader_writer.go 267: Successfully created block Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.760 [INFO][4063] ipam/ipam_block_reader_writer.go 283: Confirming affinity host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.778 [INFO][4063] ipam/ipam_block_reader_writer.go 298: Successfully confirmed affinity host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.778 [INFO][4063] ipam/ipam.go 607: Block '192.168.78.0/26' has 64 free ips which is more than 1 ips required. 
host="srv-n0ssy.gb1.brightbox.com" subnet=192.168.78.0/26 Dec 12 20:06:37.891306 containerd[1555]: 2025-12-12 20:06:37.778 [INFO][4063] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891754 containerd[1555]: 2025-12-12 20:06:37.790 [INFO][4063] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8 Dec 12 20:06:37.891754 containerd[1555]: 2025-12-12 20:06:37.801 [INFO][4063] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891754 containerd[1555]: 2025-12-12 20:06:37.813 [INFO][4063] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.0/26] block=192.168.78.0/26 handle="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891754 containerd[1555]: 2025-12-12 20:06:37.814 [INFO][4063] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.0/26] handle="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:37.891754 containerd[1555]: 2025-12-12 20:06:37.814 [INFO][4063] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 12 20:06:37.891754 containerd[1555]: 2025-12-12 20:06:37.814 [INFO][4063] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.0/26] IPv6=[] ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" HandleID="k8s-pod-network.2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Workload="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.892944 containerd[1555]: 2025-12-12 20:06:37.818 [INFO][4034] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a05762a-3b72-4df9-aa17-753debf16cab", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-fwp5m", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.78.0/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali28112953a5c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:37.893078 containerd[1555]: 2025-12-12 20:06:37.819 [INFO][4034] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.0/32] ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.893078 containerd[1555]: 2025-12-12 20:06:37.819 [INFO][4034] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali28112953a5c ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.893078 containerd[1555]: 2025-12-12 20:06:37.854 [INFO][4034] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.893979 containerd[1555]: 2025-12-12 20:06:37.855 [INFO][4034] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a05762a-3b72-4df9-aa17-753debf16cab", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8", Pod:"csi-node-driver-fwp5m", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.78.0/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali28112953a5c", MAC:"e2:30:16:77:62:4f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:37.894135 containerd[1555]: 2025-12-12 20:06:37.869 [INFO][4034] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" Namespace="calico-system" 
Pod="csi-node-driver-fwp5m" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-csi--node--driver--fwp5m-eth0" Dec 12 20:06:37.896923 kubelet[2887]: I1212 20:06:37.896827 2887 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d2d7cd-4d89-4123-813f-09d5679d795c" path="/var/lib/kubelet/pods/08d2d7cd-4d89-4123-813f-09d5679d795c/volumes" Dec 12 20:06:37.909593 containerd[1555]: time="2025-12-12T20:06:37.909467159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-779d6bdfc4-bv9cs,Uid:2fcf8e8b-a7d2-49df-8e20-83c1153edafc,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:38.178800 systemd-networkd[1490]: cali31e11e89e5e: Link UP Dec 12 20:06:38.179469 systemd-networkd[1490]: cali31e11e89e5e: Gained carrier Dec 12 20:06:38.295547 containerd[1555]: 2025-12-12 20:06:37.097 [INFO][4020] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:38.295547 containerd[1555]: 2025-12-12 20:06:37.170 [INFO][4020] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0 coredns-674b8bbfcf- kube-system adefcce3-e2dc-4e91-aff3-a5f983fee5e6 892 0 2025-12-12 20:05:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com coredns-674b8bbfcf-gfqph eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali31e11e89e5e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-" Dec 12 20:06:38.295547 containerd[1555]: 2025-12-12 20:06:37.171 [INFO][4020] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.295547 containerd[1555]: 2025-12-12 20:06:37.476 [INFO][4061] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" HandleID="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.478 [INFO][4061] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" HandleID="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f7e0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-gfqph", "timestamp":"2025-12-12 20:06:37.476112522 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.478 [INFO][4061] 
ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.814 [INFO][4061] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.814 [INFO][4061] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.838 [INFO][4061] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.902 [INFO][4061] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.937 [INFO][4061] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.950 [INFO][4061] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.295984 containerd[1555]: 2025-12-12 20:06:37.964 [INFO][4061] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.296895 containerd[1555]: 2025-12-12 20:06:37.964 [INFO][4061] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.296895 containerd[1555]: 2025-12-12 20:06:37.976 [INFO][4061] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e Dec 12 20:06:38.296895 containerd[1555]: 2025-12-12 20:06:37.989 [INFO][4061] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.010 [ERROR][4061] ipam/customresource.go 184: Error updating resource Key=IPAMBlock(192-168-78-0-26) Name="192-168-78-0-26" Resource="IPAMBlocks" Value=&v3.IPAMBlock{TypeMeta:v1.TypeMeta{Kind:"IPAMBlock", APIVersion:"crd.projectcalico.org/v1"}, ObjectMeta:v1.ObjectMeta{Name:"192-168-78-0-26", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.IPAMBlockSpec{CIDR:"192.168.78.0/26", Affinity:(*string)(0xc0003af400), Allocations:[]*int{(*int)(0xc000461198), (*int)(0xc000461360), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), 
(*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil)}, Unallocated:[]int{2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63}, Attributes:[]v3.AllocationAttribute{v3.AllocationAttribute{AttrPrimary:(*string)(0xc0003af430), AttrSecondary:map[string]string{"namespace":"calico-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"csi-node-driver-fwp5m", "timestamp":"2025-12-12 20:06:37.472761543 +0000 UTC"}}, v3.AllocationAttribute{AttrPrimary:(*string)(0xc00004f7e0), AttrSecondary:map[string]string{"namespace":"kube-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-gfqph", "timestamp":"2025-12-12 20:06:37.476112522 +0000 UTC"}}}, SequenceNumber:0x1880908efda8da24, SequenceNumberForAllocation:map[string]uint64{"0":0x1880908efda8da22, "1":0x1880908efda8da23}, Deleted:false, DeprecatedStrictAffinity:false}} error=Operation cannot be fulfilled on ipamblocks.crd.projectcalico.org "192-168-78-0-26": the object has been modified; please apply your changes to the latest version and try again Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.012 [INFO][4061] ipam/ipam.go 1250: Failed to update block block=192.168.78.0/26 error=update conflict: IPAMBlock(192-168-78-0-26) handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.078 [INFO][4061] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.086 [INFO][4061] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.101 [INFO][4061] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.133 [INFO][4061] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.2/26] block=192.168.78.0/26 handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.134 [INFO][4061] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.2/26] handle="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.134 [INFO][4061] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
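[Editor's note] The ERROR at 20:06:38.010 above ("the object has been modified; please apply your changes to the latest version and try again") is Kubernetes optimistic concurrency at work: two concurrent CNI invocations raced to write the same IPAMBlock custom resource, the loser's resourceVersion was stale, and Calico simply re-read the block and retried, succeeding three lines later with 192.168.78.2. A minimal sketch of that read-modify-retry pattern using client-go's retry helper; the Block type and store interface here are stand-ins for illustration, not Calico's actual types:

```go
package ipam

import (
	"context"
	"fmt"

	"k8s.io/client-go/util/retry"
)

// Block is a stand-in for the IPAMBlock custom resource.
type Block struct {
	ResourceVersion string
	Unallocated     []int // free ordinals within the block
}

// store is a stand-in for the datastore client. Update must return a
// Kubernetes Conflict (HTTP 409) error when ResourceVersion is stale.
type store interface {
	Get(ctx context.Context, name string) (*Block, error)
	Update(ctx context.Context, b *Block) error
}

// claimOrdinal reserves the first free ordinal in the named block,
// re-reading and retrying whenever the write loses an update race --
// the same loop visible in the log above.
func claimOrdinal(ctx context.Context, s store, name string) (int, error) {
	var claimed int
	err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
		b, err := s.Get(ctx, name) // fresh read on every attempt
		if err != nil {
			return err
		}
		if len(b.Unallocated) == 0 {
			return fmt.Errorf("block %s is full", name)
		}
		claimed = b.Unallocated[0]
		b.Unallocated = b.Unallocated[1:]
		// RetryOnConflict re-runs this closure only for errors that
		// apierrors.IsConflict reports as HTTP 409 conflicts.
		return s.Update(ctx, b)
	})
	return claimed, err
}
```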
Dec 12 20:06:38.298113 containerd[1555]: 2025-12-12 20:06:38.134 [INFO][4061] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.2/26] IPv6=[] ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" HandleID="k8s-pod-network.499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.299105 containerd[1555]: 2025-12-12 20:06:38.163 [INFO][4020] cni-plugin/k8s.go 418: Populated endpoint ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"adefcce3-e2dc-4e91-aff3-a5f983fee5e6", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 5, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-gfqph", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.78.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali31e11e89e5e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:38.299105 containerd[1555]: 2025-12-12 20:06:38.163 [INFO][4020] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.2/32] ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.299105 containerd[1555]: 2025-12-12 20:06:38.164 [INFO][4020] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31e11e89e5e ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.299105 containerd[1555]: 2025-12-12 20:06:38.175 [INFO][4020] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.299105 containerd[1555]: 2025-12-12 20:06:38.191 [INFO][4020] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"adefcce3-e2dc-4e91-aff3-a5f983fee5e6", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 5, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e", Pod:"coredns-674b8bbfcf-gfqph", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.78.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali31e11e89e5e", MAC:"82:db:8f:c3:b4:13", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:38.299105 containerd[1555]: 2025-12-12 20:06:38.283 [INFO][4020] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" Namespace="kube-system" Pod="coredns-674b8bbfcf-gfqph" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--gfqph-eth0" Dec 12 20:06:38.311018 containerd[1555]: time="2025-12-12T20:06:38.310270075Z" level=info msg="connecting to shim 2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8" address="unix:///run/containerd/s/ce21fdbf056abf15161da5048733ecec186eb3a474899b3b3500d084e12e450d" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:38.472336 systemd-networkd[1490]: cali2a525bf5724: Link UP Dec 12 20:06:38.475452 systemd-networkd[1490]: cali2a525bf5724: Gained carrier Dec 12 20:06:38.523920 containerd[1555]: time="2025-12-12T20:06:38.523628773Z" level=info msg="connecting to shim 499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e" address="unix:///run/containerd/s/10630437471ccdaa8ee6892da61a2133a4b66018963bcc2734b28750f46cf269" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:38.540886 
containerd[1555]: 2025-12-12 20:06:37.119 [INFO][4021] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:37.167 [INFO][4021] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0 calico-kube-controllers-59fc8f5456- calico-system 0319e4c1-786f-47b5-99ee-05577b0ae0cb 899 0 2025-12-12 20:06:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59fc8f5456 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com calico-kube-controllers-59fc8f5456-7jq54 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2a525bf5724 [] [] }} ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:37.168 [INFO][4021] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:37.474 [INFO][4059] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" HandleID="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:37.479 [INFO][4059] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" HandleID="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00047fd30), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"calico-kube-controllers-59fc8f5456-7jq54", "timestamp":"2025-12-12 20:06:37.474632448 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:37.479 [INFO][4059] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.137 [INFO][4059] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
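[Editor's note] Compare the timestamps: request [4059] logged "About to acquire host-wide IPAM lock" at 20:06:37.479 but only acquired it at 20:06:38.137, moments after [4061] released it at 20:06:38.134. The host-wide lock serializes IPAM writes from concurrent CNI ADDs on one node, so the pods being networked here allocate strictly one at a time. A toy illustration of that serialization, with a plain mutex standing in for whatever locking primitive Calico actually uses:

```go
package ipam

import "sync"

// hostLock serializes allocations the way the host-wide IPAM lock in
// the log does: concurrent CNI ADDs queue up and proceed one at a time.
var (
	hostLock sync.Mutex
	nextFree int
)

// allocate hands out the next ordinal under the lock. With several
// pods racing (as above), each waits for the previous holder's release.
func allocate() int {
	hostLock.Lock()
	defer hostLock.Unlock()
	ord := nextFree
	nextFree++
	return ord
}
```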
Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.138 [INFO][4059] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.185 [INFO][4059] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.247 [INFO][4059] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.301 [INFO][4059] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.311 [INFO][4059] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.335 [INFO][4059] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.341 [INFO][4059] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.348 [INFO][4059] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078 Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.370 [INFO][4059] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.415 [INFO][4059] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.3/26] block=192.168.78.0/26 handle="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.415 [INFO][4059] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.3/26] handle="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.415 [INFO][4059] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
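[Editor's note] Every allocation in this trace comes out of the single affine block 192.168.78.0/26. A /26 leaves 32 - 26 = 6 host bits, i.e. 2^6 = 64 addresses (matching the earlier "has 64 free ips" message), spanning 192.168.78.0 through 192.168.78.63. A quick check with the standard library:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	p := netip.MustParsePrefix("192.168.78.0/26")
	hostBits := 32 - p.Bits()
	fmt.Println("addresses in block:", 1<<hostBits) // 64

	// Walk the block to find its last address: 192.168.78.63.
	last := p.Addr()
	for a := p.Addr(); p.Contains(a); a = a.Next() {
		last = a
	}
	fmt.Println("range:", p.Addr(), "-", last)
}
```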
Dec 12 20:06:38.540886 containerd[1555]: 2025-12-12 20:06:38.415 [INFO][4059] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.3/26] IPv6=[] ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" HandleID="k8s-pod-network.b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.544262 containerd[1555]: 2025-12-12 20:06:38.434 [INFO][4021] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0", GenerateName:"calico-kube-controllers-59fc8f5456-", Namespace:"calico-system", SelfLink:"", UID:"0319e4c1-786f-47b5-99ee-05577b0ae0cb", ResourceVersion:"899", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59fc8f5456", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-59fc8f5456-7jq54", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.78.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2a525bf5724", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:38.544262 containerd[1555]: 2025-12-12 20:06:38.457 [INFO][4021] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.3/32] ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.544262 containerd[1555]: 2025-12-12 20:06:38.459 [INFO][4021] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2a525bf5724 ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.544262 containerd[1555]: 2025-12-12 20:06:38.477 [INFO][4021] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" 
WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.544262 containerd[1555]: 2025-12-12 20:06:38.478 [INFO][4021] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0", GenerateName:"calico-kube-controllers-59fc8f5456-", Namespace:"calico-system", SelfLink:"", UID:"0319e4c1-786f-47b5-99ee-05577b0ae0cb", ResourceVersion:"899", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59fc8f5456", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078", Pod:"calico-kube-controllers-59fc8f5456-7jq54", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.78.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2a525bf5724", MAC:"ba:71:ab:97:c1:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:38.544262 containerd[1555]: 2025-12-12 20:06:38.515 [INFO][4021] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" Namespace="calico-system" Pod="calico-kube-controllers-59fc8f5456-7jq54" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--kube--controllers--59fc8f5456--7jq54-eth0" Dec 12 20:06:38.572425 systemd[1]: Started cri-containerd-2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8.scope - libcontainer container 2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8. Dec 12 20:06:38.647145 systemd-networkd[1490]: cali1a21b32380d: Link UP Dec 12 20:06:38.655107 systemd-networkd[1490]: cali1a21b32380d: Gained carrier Dec 12 20:06:38.660775 systemd[1]: Started cri-containerd-499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e.scope - libcontainer container 499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e. 
Dec 12 20:06:38.685211 containerd[1555]: time="2025-12-12T20:06:38.685125636Z" level=info msg="connecting to shim b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078" address="unix:///run/containerd/s/867c9e55a51c3b4b28bb470ab55844c15e1415ad4f664b7fd21506a7c9467d8c" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.053 [INFO][4114] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.104 [INFO][4114] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0 whisker-779d6bdfc4- calico-system 2fcf8e8b-a7d2-49df-8e20-83c1153edafc 970 0 2025-12-12 20:06:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:779d6bdfc4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com whisker-779d6bdfc4-bv9cs eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1a21b32380d [] [] }} ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.106 [INFO][4114] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.275 [INFO][4130] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" HandleID="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Workload="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.276 [INFO][4130] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" HandleID="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Workload="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103660), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"whisker-779d6bdfc4-bv9cs", "timestamp":"2025-12-12 20:06:38.275486108 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.276 [INFO][4130] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.423 [INFO][4130] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.423 [INFO][4130] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.456 [INFO][4130] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.482 [INFO][4130] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.516 [INFO][4130] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.521 [INFO][4130] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.528 [INFO][4130] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.532 [INFO][4130] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.542 [INFO][4130] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302 Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.568 [INFO][4130] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.589 [INFO][4130] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.4/26] block=192.168.78.0/26 handle="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.589 [INFO][4130] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.4/26] handle="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.589 [INFO][4130] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
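[Editor's note] By this point the block has handed out .0 (csi-node-driver), .2 (coredns), .3 (calico-kube-controllers), and .4 (whisker). When auditing a journal like this one, the "Successfully claimed IPs" lines are the authoritative record; below is a small scanner that pulls address/handle pairs out of them. The regular expression is ad hoc, written for the exact line shape above, and would need adjusting for other Calico versions:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// claimRE matches the Calico IPAM success line seen in this journal,
// capturing the claimed address and the allocation handle.
var claimRE = regexp.MustCompile(
	`Successfully claimed IPs: \[([0-9.]+)/\d+\] .*handle="([^"]+)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines are long
	for sc.Scan() {
		if m := claimRE.FindStringSubmatch(sc.Text()); m != nil {
			fmt.Printf("%s <- %s\n", m[1], m[2])
		}
	}
}
```

Fed this section on stdin (e.g. via journalctl piped to the program), it would print one line per allocation: 192.168.78.0, .2, .3, .4, each tagged with its k8s-pod-network handle.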
Dec 12 20:06:38.697467 containerd[1555]: 2025-12-12 20:06:38.589 [INFO][4130] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.4/26] IPv6=[] ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" HandleID="k8s-pod-network.03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Workload="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.701079 containerd[1555]: 2025-12-12 20:06:38.610 [INFO][4114] cni-plugin/k8s.go 418: Populated endpoint ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0", GenerateName:"whisker-779d6bdfc4-", Namespace:"calico-system", SelfLink:"", UID:"2fcf8e8b-a7d2-49df-8e20-83c1153edafc", ResourceVersion:"970", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"779d6bdfc4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"whisker-779d6bdfc4-bv9cs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.78.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1a21b32380d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:38.701079 containerd[1555]: 2025-12-12 20:06:38.617 [INFO][4114] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.4/32] ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.701079 containerd[1555]: 2025-12-12 20:06:38.618 [INFO][4114] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a21b32380d ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.701079 containerd[1555]: 2025-12-12 20:06:38.659 [INFO][4114] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.701079 containerd[1555]: 2025-12-12 20:06:38.662 [INFO][4114] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" 
Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0", GenerateName:"whisker-779d6bdfc4-", Namespace:"calico-system", SelfLink:"", UID:"2fcf8e8b-a7d2-49df-8e20-83c1153edafc", ResourceVersion:"970", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"779d6bdfc4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302", Pod:"whisker-779d6bdfc4-bv9cs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.78.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1a21b32380d", MAC:"2a:00:95:d4:99:5f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:38.701079 containerd[1555]: 2025-12-12 20:06:38.686 [INFO][4114] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" Namespace="calico-system" Pod="whisker-779d6bdfc4-bv9cs" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-whisker--779d6bdfc4--bv9cs-eth0" Dec 12 20:06:38.794740 systemd[1]: Started cri-containerd-b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078.scope - libcontainer container b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078. 
Dec 12 20:06:38.811275 containerd[1555]: time="2025-12-12T20:06:38.810860210Z" level=info msg="connecting to shim 03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302" address="unix:///run/containerd/s/1aaf6d4ff20837b025405756ba3ce013663a2417a81bb02998362c9d705d452d" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:38.875976 containerd[1555]: time="2025-12-12T20:06:38.875898458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-22cbp,Uid:b08daecd-0b61-450c-a526-5e7b591cde3e,Namespace:calico-system,Attempt:0,}" Dec 12 20:06:38.879987 containerd[1555]: time="2025-12-12T20:06:38.879605794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-w9rs4,Uid:87f20b04-9afa-4c4b-849e-0456cd78dc55,Namespace:calico-apiserver,Attempt:0,}" Dec 12 20:06:38.965864 containerd[1555]: time="2025-12-12T20:06:38.964744451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gfqph,Uid:adefcce3-e2dc-4e91-aff3-a5f983fee5e6,Namespace:kube-system,Attempt:0,} returns sandbox id \"499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e\"" Dec 12 20:06:38.968124 systemd[1]: Started cri-containerd-03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302.scope - libcontainer container 03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302. Dec 12 20:06:38.983333 containerd[1555]: time="2025-12-12T20:06:38.982862528Z" level=info msg="CreateContainer within sandbox \"499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 20:06:39.016983 containerd[1555]: time="2025-12-12T20:06:39.015369822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fwp5m,Uid:2a05762a-3b72-4df9-aa17-753debf16cab,Namespace:calico-system,Attempt:0,} returns sandbox id \"2e8cb23d6a4394717ed832da81dcc6f4304c7855f94ee9188434232f65fde8a8\"" Dec 12 20:06:39.033493 containerd[1555]: time="2025-12-12T20:06:39.033439172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 20:06:39.049400 containerd[1555]: time="2025-12-12T20:06:39.049324832Z" level=info msg="Container 1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601: CDI devices from CRI Config.CDIDevices: []" Dec 12 20:06:39.208622 containerd[1555]: time="2025-12-12T20:06:39.208106263Z" level=info msg="CreateContainer within sandbox \"499c4f5d564b3b1fb9cc2b51a9f4c28e82be13f8bc3c31250318e37f73012f4e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601\"" Dec 12 20:06:39.224056 containerd[1555]: time="2025-12-12T20:06:39.223889451Z" level=info msg="StartContainer for \"1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601\"" Dec 12 20:06:39.239195 containerd[1555]: time="2025-12-12T20:06:39.237872809Z" level=info msg="connecting to shim 1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601" address="unix:///run/containerd/s/10630437471ccdaa8ee6892da61a2133a4b66018963bcc2734b28750f46cf269" protocol=ttrpc version=3 Dec 12 20:06:39.267512 containerd[1555]: time="2025-12-12T20:06:39.267376631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59fc8f5456-7jq54,Uid:0319e4c1-786f-47b5-99ee-05577b0ae0cb,Namespace:calico-system,Attempt:0,} returns sandbox id \"b91f75c52fee43450f7cc3b8a7d1547bc8e8dc422b8b3fe8a909360dc836a078\"" Dec 12 20:06:39.331584 systemd-networkd[1490]: cali28112953a5c: Gained IPv6LL Dec 
12 20:06:39.422079 systemd[1]: Started cri-containerd-1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601.scope - libcontainer container 1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601. Dec 12 20:06:39.473807 containerd[1555]: time="2025-12-12T20:06:39.473734698Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:39.476760 containerd[1555]: time="2025-12-12T20:06:39.476701357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 20:06:39.496143 containerd[1555]: time="2025-12-12T20:06:39.495600785Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 20:06:39.497950 containerd[1555]: time="2025-12-12T20:06:39.497916103Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-779d6bdfc4-bv9cs,Uid:2fcf8e8b-a7d2-49df-8e20-83c1153edafc,Namespace:calico-system,Attempt:0,} returns sandbox id \"03e8b71032758db7f1c7c3f1e782367ea46ed66bca1e796fa2638d729a55c302\"" Dec 12 20:06:39.500631 kubelet[2887]: E1212 20:06:39.500334 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 20:06:39.501208 kubelet[2887]: E1212 20:06:39.500765 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 20:06:39.501756 containerd[1555]: time="2025-12-12T20:06:39.501637469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 20:06:39.524140 kubelet[2887]: E1212 20:06:39.523982 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:39.650574 containerd[1555]: time="2025-12-12T20:06:39.647937727Z" level=info msg="StartContainer for \"1aad607ed19fda24dc2fc93a4db3042cf3b471999aafbb887f728c9b62f9f601\" returns successfully" Dec 12 20:06:39.651729 systemd-networkd[1490]: cali2a525bf5724: Gained IPv6LL Dec 12 20:06:39.654554 systemd-networkd[1490]: cali3a5e877266f: Link UP Dec 12 20:06:39.656341 systemd-networkd[1490]: cali3a5e877266f: Gained carrier Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.073 [INFO][4349] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.122 [INFO][4349] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0 goldmane-666569f655- calico-system b08daecd-0b61-450c-a526-5e7b591cde3e 903 0 2025-12-12 20:06:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com goldmane-666569f655-22cbp eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali3a5e877266f [] [] }} ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" 
WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.122 [INFO][4349] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.472 [INFO][4383] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" HandleID="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Workload="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.473 [INFO][4383] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" HandleID="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Workload="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000310eb0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"goldmane-666569f655-22cbp", "timestamp":"2025-12-12 20:06:39.47249441 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.476 [INFO][4383] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.476 [INFO][4383] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.481 [INFO][4383] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.529 [INFO][4383] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.544 [INFO][4383] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.555 [INFO][4383] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.565 [INFO][4383] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.574 [INFO][4383] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.575 [INFO][4383] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.581 [INFO][4383] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.596 [INFO][4383] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.614 [INFO][4383] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.5/26] block=192.168.78.0/26 handle="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.614 [INFO][4383] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.5/26] handle="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.617 [INFO][4383] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 20:06:39.709451 containerd[1555]: 2025-12-12 20:06:39.617 [INFO][4383] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.5/26] IPv6=[] ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" HandleID="k8s-pod-network.0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Workload="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.711897 containerd[1555]: 2025-12-12 20:06:39.623 [INFO][4349] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"b08daecd-0b61-450c-a526-5e7b591cde3e", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-666569f655-22cbp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.78.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3a5e877266f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:39.711897 containerd[1555]: 2025-12-12 20:06:39.640 [INFO][4349] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.5/32] ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.711897 containerd[1555]: 2025-12-12 20:06:39.640 [INFO][4349] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a5e877266f ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.711897 containerd[1555]: 2025-12-12 20:06:39.668 [INFO][4349] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.711897 containerd[1555]: 2025-12-12 20:06:39.670 [INFO][4349] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" 
Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"b08daecd-0b61-450c-a526-5e7b591cde3e", ResourceVersion:"903", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f", Pod:"goldmane-666569f655-22cbp", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.78.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali3a5e877266f", MAC:"b6:9f:f0:90:92:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:39.711897 containerd[1555]: 2025-12-12 20:06:39.697 [INFO][4349] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" Namespace="calico-system" Pod="goldmane-666569f655-22cbp" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-goldmane--666569f655--22cbp-eth0" Dec 12 20:06:39.778523 systemd-networkd[1490]: cali31e11e89e5e: Gained IPv6LL Dec 12 20:06:39.791688 systemd-networkd[1490]: calid896654d7a2: Link UP Dec 12 20:06:39.800177 systemd-networkd[1490]: calid896654d7a2: Gained carrier Dec 12 20:06:39.801060 containerd[1555]: time="2025-12-12T20:06:39.800956288Z" level=info msg="connecting to shim 0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f" address="unix:///run/containerd/s/51d7eb4ad04dd989fcf763e93e3c5e8ab456b6dbbfca4995f512c635bc46257a" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:39.836783 containerd[1555]: time="2025-12-12T20:06:39.836699419Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:39.842592 containerd[1555]: time="2025-12-12T20:06:39.842538111Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 20:06:39.843502 containerd[1555]: time="2025-12-12T20:06:39.842587129Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 20:06:39.844570 kubelet[2887]: E1212 20:06:39.844192 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 20:06:39.844681 kubelet[2887]: E1212 20:06:39.844617 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.171 [INFO][4353] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.277 [INFO][4353] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0 calico-apiserver-558bb796c- calico-apiserver 87f20b04-9afa-4c4b-849e-0456cd78dc55 895 0 2025-12-12 20:06:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:558bb796c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com calico-apiserver-558bb796c-w9rs4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid896654d7a2 [] [] }} ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.278 [INFO][4353] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.507 [INFO][4402] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" HandleID="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.507 [INFO][4402] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" HandleID="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d3840), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"calico-apiserver-558bb796c-w9rs4", "timestamp":"2025-12-12 20:06:39.50759565 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 
20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.508 [INFO][4402] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.615 [INFO][4402] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.615 [INFO][4402] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.637 [INFO][4402] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.665 [INFO][4402] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.685 [INFO][4402] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.689 [INFO][4402] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.694 [INFO][4402] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.695 [INFO][4402] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.705 [INFO][4402] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2 Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.728 [INFO][4402] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.742 [INFO][4402] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.6/26] block=192.168.78.0/26 handle="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.744 [INFO][4402] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.6/26] handle="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.745 [INFO][4402] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 20:06:39.845712 containerd[1555]: 2025-12-12 20:06:39.745 [INFO][4402] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.6/26] IPv6=[] ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" HandleID="k8s-pod-network.581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.846915 containerd[1555]: 2025-12-12 20:06:39.765 [INFO][4353] cni-plugin/k8s.go 418: Populated endpoint ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0", GenerateName:"calico-apiserver-558bb796c-", Namespace:"calico-apiserver", SelfLink:"", UID:"87f20b04-9afa-4c4b-849e-0456cd78dc55", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"558bb796c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-558bb796c-w9rs4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.78.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid896654d7a2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:39.846915 containerd[1555]: 2025-12-12 20:06:39.766 [INFO][4353] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.6/32] ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.846915 containerd[1555]: 2025-12-12 20:06:39.766 [INFO][4353] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid896654d7a2 ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.846915 containerd[1555]: 2025-12-12 20:06:39.804 [INFO][4353] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.846915 containerd[1555]: 2025-12-12 20:06:39.805 [INFO][4353] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0", GenerateName:"calico-apiserver-558bb796c-", Namespace:"calico-apiserver", SelfLink:"", UID:"87f20b04-9afa-4c4b-849e-0456cd78dc55", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"558bb796c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2", Pod:"calico-apiserver-558bb796c-w9rs4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.78.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid896654d7a2", MAC:"c6:91:ff:56:20:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:39.846915 containerd[1555]: 2025-12-12 20:06:39.831 [INFO][4353] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-w9rs4" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--w9rs4-eth0" Dec 12 20:06:39.850074 kubelet[2887]: E1212 20:06:39.846972 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vw8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-59fc8f5456-7jq54_calico-system(0319e4c1-786f-47b5-99ee-05577b0ae0cb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:39.850651 kubelet[2887]: E1212 20:06:39.850277 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:06:39.862929 containerd[1555]: time="2025-12-12T20:06:39.862881032Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 20:06:39.889521 systemd[1]: Started cri-containerd-0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f.scope - libcontainer container 0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f. Dec 12 20:06:39.893311 containerd[1555]: time="2025-12-12T20:06:39.892802239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s5kk6,Uid:bcf4ce02-dd77-415d-b9b4-03b7689df33d,Namespace:kube-system,Attempt:0,}" Dec 12 20:06:39.893311 containerd[1555]: time="2025-12-12T20:06:39.893242651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-666sk,Uid:62353628-af81-4f55-9563-ec2bb7f43849,Namespace:calico-apiserver,Attempt:0,}" Dec 12 20:06:39.907679 systemd-networkd[1490]: cali1a21b32380d: Gained IPv6LL Dec 12 20:06:40.042029 containerd[1555]: time="2025-12-12T20:06:40.041962182Z" level=info msg="connecting to shim 581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2" address="unix:///run/containerd/s/73c8f5da53f3703cb8047d115067a058591315d0fe004cbb7c3370cfbc0ffe8e" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:40.145504 systemd[1]: Started cri-containerd-581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2.scope - libcontainer container 581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2. Dec 12 20:06:40.189762 containerd[1555]: time="2025-12-12T20:06:40.189503059Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:40.196371 containerd[1555]: time="2025-12-12T20:06:40.195443084Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 20:06:40.196371 containerd[1555]: time="2025-12-12T20:06:40.195603860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 20:06:40.196518 kubelet[2887]: E1212 20:06:40.195859 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 20:06:40.196518 kubelet[2887]: E1212 20:06:40.195935 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 20:06:40.197009 kubelet[2887]: E1212 20:06:40.196873 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1f409ebb68b94cf2859e2ccda5082245,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:40.200314 containerd[1555]: time="2025-12-12T20:06:40.198416799Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 20:06:40.414601 systemd-networkd[1490]: cali59e5fc2944c: Link UP Dec 12 20:06:40.416650 systemd-networkd[1490]: cali59e5fc2944c: Gained carrier Dec 12 20:06:40.430256 kubelet[2887]: E1212 20:06:40.430063 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.019 [INFO][4529] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.067 [INFO][4529] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0 coredns-674b8bbfcf- kube-system bcf4ce02-dd77-415d-b9b4-03b7689df33d 900 0 2025-12-12 20:05:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com coredns-674b8bbfcf-s5kk6 
eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali59e5fc2944c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.067 [INFO][4529] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.208 [INFO][4589] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" HandleID="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.209 [INFO][4589] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" HandleID="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000391ea0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"coredns-674b8bbfcf-s5kk6", "timestamp":"2025-12-12 20:06:40.20840776 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.209 [INFO][4589] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.209 [INFO][4589] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.209 [INFO][4589] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.242 [INFO][4589] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.259 [INFO][4589] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.276 [INFO][4589] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.288 [INFO][4589] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.295 [INFO][4589] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.295 [INFO][4589] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.300 [INFO][4589] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.317 [INFO][4589] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.357 [INFO][4589] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.7/26] block=192.168.78.0/26 handle="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.361 [INFO][4589] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.7/26] handle="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.361 [INFO][4589] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 20:06:40.475909 containerd[1555]: 2025-12-12 20:06:40.361 [INFO][4589] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.7/26] IPv6=[] ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" HandleID="k8s-pod-network.ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.483153 containerd[1555]: 2025-12-12 20:06:40.375 [INFO][4529] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"bcf4ce02-dd77-415d-b9b4-03b7689df33d", ResourceVersion:"900", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 5, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"coredns-674b8bbfcf-s5kk6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.78.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali59e5fc2944c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:40.483153 containerd[1555]: 2025-12-12 20:06:40.376 [INFO][4529] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.7/32] ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.483153 containerd[1555]: 2025-12-12 20:06:40.376 [INFO][4529] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali59e5fc2944c ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.483153 containerd[1555]: 2025-12-12 20:06:40.429 [INFO][4529] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.483153 containerd[1555]: 2025-12-12 20:06:40.431 [INFO][4529] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"bcf4ce02-dd77-415d-b9b4-03b7689df33d", ResourceVersion:"900", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 5, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e", Pod:"coredns-674b8bbfcf-s5kk6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.78.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali59e5fc2944c", MAC:"36:55:da:f7:af:7b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:40.483153 containerd[1555]: 2025-12-12 20:06:40.460 [INFO][4529] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" Namespace="kube-system" Pod="coredns-674b8bbfcf-s5kk6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-coredns--674b8bbfcf--s5kk6-eth0" Dec 12 20:06:40.497326 kubelet[2887]: I1212 20:06:40.460925 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-gfqph" podStartSLOduration=52.460890673 podStartE2EDuration="52.460890673s" podCreationTimestamp="2025-12-12 20:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 20:06:40.447580237 +0000 UTC m=+58.876479702" watchObservedRunningTime="2025-12-12 20:06:40.460890673 +0000 UTC m=+58.889790228" Dec 12 20:06:40.526330 containerd[1555]: time="2025-12-12T20:06:40.525242113Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:40.527632 containerd[1555]: time="2025-12-12T20:06:40.527586285Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 20:06:40.528067 containerd[1555]: time="2025-12-12T20:06:40.527758205Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 20:06:40.528591 kubelet[2887]: E1212 20:06:40.528484 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 20:06:40.531680 kubelet[2887]: E1212 20:06:40.529229 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 20:06:40.531818 containerd[1555]: time="2025-12-12T20:06:40.531752133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 20:06:40.532837 kubelet[2887]: E1212 20:06:40.532046 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:40.534322 kubelet[2887]: E1212 20:06:40.534217 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab" Dec 12 20:06:40.604422 systemd-networkd[1490]: cali2e271cba362: Link UP Dec 12 20:06:40.606777 systemd-networkd[1490]: cali2e271cba362: Gained carrier Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.177 [INFO][4543] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.256 [INFO][4543] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0 
calico-apiserver-558bb796c- calico-apiserver 62353628-af81-4f55-9563-ec2bb7f43849 905 0 2025-12-12 20:06:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:558bb796c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com calico-apiserver-558bb796c-666sk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2e271cba362 [] [] }} ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.256 [INFO][4543] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.357 [INFO][4620] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" HandleID="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.358 [INFO][4620] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" HandleID="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033d740), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"calico-apiserver-558bb796c-666sk", "timestamp":"2025-12-12 20:06:40.357931881 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.358 [INFO][4620] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.363 [INFO][4620] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.363 [INFO][4620] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.426 [INFO][4620] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.482 [INFO][4620] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.507 [INFO][4620] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.514 [INFO][4620] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.545 [INFO][4620] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.547 [INFO][4620] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.562 [INFO][4620] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57 Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.572 [INFO][4620] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.589 [INFO][4620] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.8/26] block=192.168.78.0/26 handle="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.590 [INFO][4620] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.8/26] handle="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.590 [INFO][4620] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 12 20:06:40.633074 containerd[1555]: 2025-12-12 20:06:40.590 [INFO][4620] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.8/26] IPv6=[] ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" HandleID="k8s-pod-network.5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.636892 containerd[1555]: 2025-12-12 20:06:40.595 [INFO][4543] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0", GenerateName:"calico-apiserver-558bb796c-", Namespace:"calico-apiserver", SelfLink:"", UID:"62353628-af81-4f55-9563-ec2bb7f43849", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"558bb796c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-558bb796c-666sk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.78.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2e271cba362", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:40.636892 containerd[1555]: 2025-12-12 20:06:40.595 [INFO][4543] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.8/32] ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.636892 containerd[1555]: 2025-12-12 20:06:40.596 [INFO][4543] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2e271cba362 ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.636892 containerd[1555]: 2025-12-12 20:06:40.607 [INFO][4543] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.636892 containerd[1555]: 2025-12-12 20:06:40.607 [INFO][4543] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0", GenerateName:"calico-apiserver-558bb796c-", Namespace:"calico-apiserver", SelfLink:"", UID:"62353628-af81-4f55-9563-ec2bb7f43849", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"558bb796c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57", Pod:"calico-apiserver-558bb796c-666sk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.78.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2e271cba362", MAC:"ca:2c:9e:3f:89:55", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:40.636892 containerd[1555]: 2025-12-12 20:06:40.628 [INFO][4543] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" Namespace="calico-apiserver" Pod="calico-apiserver-558bb796c-666sk" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--558bb796c--666sk-eth0" Dec 12 20:06:40.708923 containerd[1555]: time="2025-12-12T20:06:40.708550883Z" level=info msg="connecting to shim ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e" address="unix:///run/containerd/s/2cb3394a8044db642e9dac29e59a96dadfb74adb12e52a6aa1ddd8e22b2b5278" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:40.733789 containerd[1555]: time="2025-12-12T20:06:40.730086350Z" level=info msg="connecting to shim 5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57" address="unix:///run/containerd/s/dde08ec4019a48e063ec65e080fa1ad3e4568025d0a0ea41116d1d6cd921104f" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:40.824906 systemd[1]: Started cri-containerd-ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e.scope - libcontainer container ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e. Dec 12 20:06:40.852557 systemd[1]: Started cri-containerd-5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57.scope - libcontainer container 5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57. 
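The entries that follow show containerd receiving 404 Not Found from ghcr.io and the kubelet surfacing that as ErrImagePull for ghcr.io/flatcar/calico/whisker-backend:v3.30.4 (and, later, the apiserver and goldmane images). The same resolution failure can be reproduced off the node by querying the registry's HTTP API directly. The sketch below fetches an anonymous pull token and then HEADs the manifest; the /token endpoint and the /v2/<name>/manifests/<tag> route follow the OCI distribution conventions that GHCR implements, but treat the probe as an assumption-laden illustration rather than containerd's actual resolver.

```go
// Probe whether an image tag resolves on ghcr.io, mirroring the
// "failed to resolve reference ... not found" errors in this log.
// Assumptions: ghcr.io issues anonymous pull tokens from /token and
// serves manifests per the OCI distribution spec.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	repo, tag := "flatcar/calico/whisker-backend", "v3.30.4"

	// 1. Anonymous bearer token scoped to pulling this repository.
	tokURL := fmt.Sprintf(
		"https://ghcr.io/token?service=ghcr.io&scope=repository:%s:pull", repo)
	resp, err := http.Get(tokURL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		panic(err)
	}

	// 2. HEAD the manifest; a 404 here is exactly what containerd reports.
	req, _ := http.NewRequest(http.MethodHead,
		fmt.Sprintf("https://ghcr.io/v2/%s/manifests/%s", repo, tag), nil)
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
	res, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	res.Body.Close()
	fmt.Println(repo+":"+tag, "->", res.Status) // e.g. "404 Not Found"
}
```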
Dec 12 20:06:40.877097 containerd[1555]: time="2025-12-12T20:06:40.877005159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f7c49f9bf-2vbl6,Uid:dcaf5416-6748-4e19-9a64-70511b93ac27,Namespace:calico-apiserver,Attempt:0,}" Dec 12 20:06:40.882344 containerd[1555]: time="2025-12-12T20:06:40.882263536Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:40.888324 containerd[1555]: time="2025-12-12T20:06:40.886818403Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 20:06:40.888324 containerd[1555]: time="2025-12-12T20:06:40.886971343Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 20:06:40.888882 kubelet[2887]: E1212 20:06:40.888806 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 20:06:40.890237 kubelet[2887]: E1212 20:06:40.889142 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 20:06:40.892945 kubelet[2887]: E1212 20:06:40.892873 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:40.895522 kubelet[2887]: E1212 20:06:40.895421 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:06:41.035863 containerd[1555]: time="2025-12-12T20:06:41.034663587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-s5kk6,Uid:bcf4ce02-dd77-415d-b9b4-03b7689df33d,Namespace:kube-system,Attempt:0,} returns sandbox id \"ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e\"" Dec 12 20:06:41.046465 containerd[1555]: time="2025-12-12T20:06:41.046414818Z" level=info msg="CreateContainer within sandbox 
\"ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 12 20:06:41.075949 containerd[1555]: time="2025-12-12T20:06:41.075878800Z" level=info msg="Container 640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51: CDI devices from CRI Config.CDIDevices: []" Dec 12 20:06:41.103518 containerd[1555]: time="2025-12-12T20:06:41.097293221Z" level=info msg="CreateContainer within sandbox \"ec4f55a5752915ff731b80f8f511ed23173f51953ac8ae723530d51b097e336e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51\"" Dec 12 20:06:41.108675 containerd[1555]: time="2025-12-12T20:06:41.107467233Z" level=info msg="StartContainer for \"640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51\"" Dec 12 20:06:41.110682 containerd[1555]: time="2025-12-12T20:06:41.110645987Z" level=info msg="connecting to shim 640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51" address="unix:///run/containerd/s/2cb3394a8044db642e9dac29e59a96dadfb74adb12e52a6aa1ddd8e22b2b5278" protocol=ttrpc version=3 Dec 12 20:06:41.186768 systemd-networkd[1490]: calid896654d7a2: Gained IPv6LL Dec 12 20:06:41.197639 systemd[1]: Started cri-containerd-640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51.scope - libcontainer container 640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51. Dec 12 20:06:41.303654 containerd[1555]: time="2025-12-12T20:06:41.302507719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-w9rs4,Uid:87f20b04-9afa-4c4b-849e-0456cd78dc55,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"581e999ebf4a3891782ff63164c92eb9bfaa699669b926fea22f0247725b72a2\"" Dec 12 20:06:41.315210 systemd-networkd[1490]: cali3a5e877266f: Gained IPv6LL Dec 12 20:06:41.323708 containerd[1555]: time="2025-12-12T20:06:41.323242855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-22cbp,Uid:b08daecd-0b61-450c-a526-5e7b591cde3e,Namespace:calico-system,Attempt:0,} returns sandbox id \"0e4274c1d68c9bcaf326eb910ef5b9c2e15303834d0cf8884b16c311d70a884f\"" Dec 12 20:06:41.329680 containerd[1555]: time="2025-12-12T20:06:41.328968336Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:06:41.377132 containerd[1555]: time="2025-12-12T20:06:41.377053572Z" level=info msg="StartContainer for \"640a1e98197ff6375cb34098dbb658dd3d42b2049bbf8e13d24262806c6e0f51\" returns successfully" Dec 12 20:06:41.411410 systemd-networkd[1490]: cali11e180967b2: Link UP Dec 12 20:06:41.413449 systemd-networkd[1490]: cali11e180967b2: Gained carrier Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:40.971 [INFO][4733] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.009 [INFO][4733] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0 calico-apiserver-7f7c49f9bf- calico-apiserver dcaf5416-6748-4e19-9a64-70511b93ac27 902 0 2025-12-12 20:06:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f7c49f9bf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-n0ssy.gb1.brightbox.com 
calico-apiserver-7f7c49f9bf-2vbl6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali11e180967b2 [] [] }} ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.010 [INFO][4733] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.211 [INFO][4757] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" HandleID="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.212 [INFO][4757] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" HandleID="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103760), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-n0ssy.gb1.brightbox.com", "pod":"calico-apiserver-7f7c49f9bf-2vbl6", "timestamp":"2025-12-12 20:06:41.2116751 +0000 UTC"}, Hostname:"srv-n0ssy.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.213 [INFO][4757] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.213 [INFO][4757] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.213 [INFO][4757] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-n0ssy.gb1.brightbox.com' Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.276 [INFO][4757] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.308 [INFO][4757] ipam/ipam.go 394: Looking up existing affinities for host host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.342 [INFO][4757] ipam/ipam.go 511: Trying affinity for 192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.349 [INFO][4757] ipam/ipam.go 158: Attempting to load block cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.358 [INFO][4757] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.78.0/26 host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.359 [INFO][4757] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.78.0/26 handle="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.367 [INFO][4757] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.380 [INFO][4757] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.78.0/26 handle="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.395 [INFO][4757] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.78.9/26] block=192.168.78.0/26 handle="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.395 [INFO][4757] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.78.9/26] handle="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" host="srv-n0ssy.gb1.brightbox.com" Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.396 [INFO][4757] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
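From here on the kubelet pod_workers lines alternate between ErrImagePull (a pull attempt that just failed) and ImagePullBackOff (the next attempt is being deferred), and each pod's retries arrive progressively later: goldmane, for example, is pulled at 20:06:41.67, fails, and is not retried until 20:06:53.88. That pacing is consistent with the kubelet's capped exponential backoff for image pulls. The sketch below models the policy using the commonly documented defaults (10s initial delay, doubling, 5-minute cap); those constants are assumptions, not values read from this node's configuration.

```go
// Sketch of the capped exponential backoff behind the
// ErrImagePull -> ImagePullBackOff cycle in the kubelet lines here.
// The 10s / x2 / 5m constants are the commonly cited kubelet defaults,
// taken as an assumption; this is not kubelet source code.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial  = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	delay := initial
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("attempt %d: ErrImagePull, next retry in %s (ImagePullBackOff)\n",
			attempt, delay)
		delay *= 2 // double the deferral after every failure...
		if delay > maxDelay {
			delay = maxDelay // ...up to the cap
		}
	}
}
```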
Dec 12 20:06:41.480454 containerd[1555]: 2025-12-12 20:06:41.396 [INFO][4757] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.78.9/26] IPv6=[] ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" HandleID="k8s-pod-network.9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Workload="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.485069 containerd[1555]: 2025-12-12 20:06:41.402 [INFO][4733] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0", GenerateName:"calico-apiserver-7f7c49f9bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"dcaf5416-6748-4e19-9a64-70511b93ac27", ResourceVersion:"902", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f7c49f9bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-7f7c49f9bf-2vbl6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.78.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali11e180967b2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:41.485069 containerd[1555]: 2025-12-12 20:06:41.404 [INFO][4733] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.78.9/32] ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.485069 containerd[1555]: 2025-12-12 20:06:41.404 [INFO][4733] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali11e180967b2 ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.485069 containerd[1555]: 2025-12-12 20:06:41.415 [INFO][4733] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.485069 containerd[1555]: 2025-12-12 20:06:41.417 
[INFO][4733] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0", GenerateName:"calico-apiserver-7f7c49f9bf-", Namespace:"calico-apiserver", SelfLink:"", UID:"dcaf5416-6748-4e19-9a64-70511b93ac27", ResourceVersion:"902", Generation:0, CreationTimestamp:time.Date(2025, time.December, 12, 20, 6, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f7c49f9bf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-n0ssy.gb1.brightbox.com", ContainerID:"9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e", Pod:"calico-apiserver-7f7c49f9bf-2vbl6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.78.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali11e180967b2", MAC:"96:11:e1:07:97:bc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 12 20:06:41.485069 containerd[1555]: 2025-12-12 20:06:41.469 [INFO][4733] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" Namespace="calico-apiserver" Pod="calico-apiserver-7f7c49f9bf-2vbl6" WorkloadEndpoint="srv--n0ssy.gb1.brightbox.com-k8s-calico--apiserver--7f7c49f9bf--2vbl6-eth0" Dec 12 20:06:41.488579 kubelet[2887]: E1212 20:06:41.486691 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab" Dec 12 20:06:41.488579 kubelet[2887]: E1212 20:06:41.486926 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:06:41.551458 containerd[1555]: time="2025-12-12T20:06:41.551381000Z" level=info msg="connecting to shim 9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e" address="unix:///run/containerd/s/2b1b6d26d72d1f2ca498c1b2c96aef72b9c62e0860eb3eb335de393c8212b678" namespace=k8s.io protocol=ttrpc version=3 Dec 12 20:06:41.626543 systemd[1]: Started cri-containerd-9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e.scope - libcontainer container 9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e. Dec 12 20:06:41.649220 kubelet[2887]: I1212 20:06:41.647715 2887 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-s5kk6" podStartSLOduration=53.647688707 podStartE2EDuration="53.647688707s" podCreationTimestamp="2025-12-12 20:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-12 20:06:41.643564192 +0000 UTC m=+60.072463662" watchObservedRunningTime="2025-12-12 20:06:41.647688707 +0000 UTC m=+60.076588157" Dec 12 20:06:41.660259 containerd[1555]: time="2025-12-12T20:06:41.660199808Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:41.665457 containerd[1555]: time="2025-12-12T20:06:41.665380741Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:41.665457 containerd[1555]: time="2025-12-12T20:06:41.665388086Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:06:41.666616 kubelet[2887]: E1212 20:06:41.666072 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:41.666746 kubelet[2887]: E1212 20:06:41.666544 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:41.669898 kubelet[2887]: E1212 20:06:41.667959 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctvmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-w9rs4_calico-apiserver(87f20b04-9afa-4c4b-849e-0456cd78dc55): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:41.671430 kubelet[2887]: E1212 20:06:41.670428 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:06:41.671775 containerd[1555]: time="2025-12-12T20:06:41.671723064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 20:06:41.694664 containerd[1555]: time="2025-12-12T20:06:41.694592054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-558bb796c-666sk,Uid:62353628-af81-4f55-9563-ec2bb7f43849,Namespace:calico-apiserver,Attempt:0,} returns sandbox id 
\"5eb7eb56be7419b2d005583e9e4d4232b7fabae5b08baed145757768e799be57\"" Dec 12 20:06:41.869776 containerd[1555]: time="2025-12-12T20:06:41.869396084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f7c49f9bf-2vbl6,Uid:dcaf5416-6748-4e19-9a64-70511b93ac27,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9fd8add51f5396ba9d463fd75c5b42bd7ed06d6cbfbb3568a1441b9d7847099e\"" Dec 12 20:06:41.998667 containerd[1555]: time="2025-12-12T20:06:41.997381229Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:42.004790 containerd[1555]: time="2025-12-12T20:06:42.004586682Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 20:06:42.004790 containerd[1555]: time="2025-12-12T20:06:42.004704957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:42.005305 kubelet[2887]: E1212 20:06:42.005114 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 20:06:42.005305 kubelet[2887]: E1212 20:06:42.005180 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 20:06:42.006482 kubelet[2887]: E1212 20:06:42.006265 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkfwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-22cbp_calico-system(b08daecd-0b61-450c-a526-5e7b591cde3e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:42.008582 containerd[1555]: time="2025-12-12T20:06:42.007767460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:06:42.008681 kubelet[2887]: E1212 20:06:42.008278 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not 
found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:06:42.146765 systemd-networkd[1490]: cali59e5fc2944c: Gained IPv6LL Dec 12 20:06:42.350266 containerd[1555]: time="2025-12-12T20:06:42.350104712Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:42.355565 containerd[1555]: time="2025-12-12T20:06:42.355409942Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:06:42.356443 containerd[1555]: time="2025-12-12T20:06:42.355548605Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:42.357009 kubelet[2887]: E1212 20:06:42.356863 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:42.357461 kubelet[2887]: E1212 20:06:42.357393 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:42.358511 kubelet[2887]: E1212 20:06:42.358050 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7qkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-666sk_calico-apiserver(62353628-af81-4f55-9563-ec2bb7f43849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:42.359690 kubelet[2887]: E1212 20:06:42.359498 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:06:42.360560 containerd[1555]: time="2025-12-12T20:06:42.360469286Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:06:42.466736 systemd-networkd[1490]: cali11e180967b2: Gained IPv6LL Dec 12 20:06:42.480997 kubelet[2887]: E1212 20:06:42.480874 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:06:42.485208 kubelet[2887]: E1212 20:06:42.484490 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:06:42.485208 kubelet[2887]: E1212 20:06:42.484587 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: 
rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:06:42.531765 systemd-networkd[1490]: cali2e271cba362: Gained IPv6LL Dec 12 20:06:42.712988 containerd[1555]: time="2025-12-12T20:06:42.712085897Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:42.714825 containerd[1555]: time="2025-12-12T20:06:42.714779483Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:06:42.715077 containerd[1555]: time="2025-12-12T20:06:42.714937939Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:42.715693 kubelet[2887]: E1212 20:06:42.715581 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:42.717422 kubelet[2887]: E1212 20:06:42.716297 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:42.717422 kubelet[2887]: E1212 20:06:42.716768 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpx95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7f7c49f9bf-2vbl6_calico-apiserver(dcaf5416-6748-4e19-9a64-70511b93ac27): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:42.718646 kubelet[2887]: E1212 20:06:42.718597 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:06:42.946678 systemd-networkd[1490]: vxlan.calico: Link UP Dec 12 20:06:42.946694 systemd-networkd[1490]: vxlan.calico: Gained carrier Dec 12 20:06:43.497167 kubelet[2887]: E1212 20:06:43.495866 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:06:43.497167 kubelet[2887]: E1212 20:06:43.497100 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:06:44.322596 systemd-networkd[1490]: vxlan.calico: Gained IPv6LL Dec 12 20:06:53.879021 containerd[1555]: time="2025-12-12T20:06:53.878762992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 20:06:54.188801 containerd[1555]: time="2025-12-12T20:06:54.188571465Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Dec 12 20:06:54.190254 containerd[1555]: time="2025-12-12T20:06:54.190187075Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 20:06:54.190380 containerd[1555]: time="2025-12-12T20:06:54.190317091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:54.191035 kubelet[2887]: E1212 20:06:54.190606 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 20:06:54.191035 kubelet[2887]: E1212 20:06:54.190694 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 20:06:54.191585 containerd[1555]: time="2025-12-12T20:06:54.191219317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:06:54.193613 kubelet[2887]: E1212 20:06:54.192385 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkfwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-22cbp_calico-system(b08daecd-0b61-450c-a526-5e7b591cde3e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:54.194068 kubelet[2887]: E1212 20:06:54.193998 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:06:54.497316 containerd[1555]: time="2025-12-12T20:06:54.497068991Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:54.499362 containerd[1555]: time="2025-12-12T20:06:54.499252332Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:06:54.499622 containerd[1555]: time="2025-12-12T20:06:54.499330744Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:54.500006 kubelet[2887]: E1212 20:06:54.499830 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:54.500006 kubelet[2887]: E1212 20:06:54.499939 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:54.500712 kubelet[2887]: E1212 20:06:54.500460 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpx95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7f7c49f9bf-2vbl6_calico-apiserver(dcaf5416-6748-4e19-9a64-70511b93ac27): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:54.502469 containerd[1555]: time="2025-12-12T20:06:54.502311487Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 20:06:54.502550 kubelet[2887]: E1212 20:06:54.502374 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:06:54.823500 containerd[1555]: time="2025-12-12T20:06:54.823343243Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:54.824696 containerd[1555]: time="2025-12-12T20:06:54.824601812Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 20:06:54.824696 containerd[1555]: time="2025-12-12T20:06:54.824654193Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 20:06:54.825176 kubelet[2887]: E1212 20:06:54.825098 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 20:06:54.825267 kubelet[2887]: E1212 20:06:54.825197 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 20:06:54.825552 kubelet[2887]: E1212 20:06:54.825473 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vw8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-59fc8f5456-7jq54_calico-system(0319e4c1-786f-47b5-99ee-05577b0ae0cb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:54.827449 kubelet[2887]: E1212 20:06:54.827396 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:06:54.875910 containerd[1555]: time="2025-12-12T20:06:54.875670784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:06:55.183957 containerd[1555]: time="2025-12-12T20:06:55.183535211Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:55.185307 containerd[1555]: time="2025-12-12T20:06:55.185185897Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:06:55.185307 containerd[1555]: time="2025-12-12T20:06:55.185221394Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:55.185768 kubelet[2887]: E1212 20:06:55.185681 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:55.185892 kubelet[2887]: E1212 20:06:55.185799 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:55.186959 
containerd[1555]: time="2025-12-12T20:06:55.186678673Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 12 20:06:55.187046 kubelet[2887]: E1212 20:06:55.186694 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7qkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-666sk_calico-apiserver(62353628-af81-4f55-9563-ec2bb7f43849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:55.188383 kubelet[2887]: E1212 20:06:55.188139 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:06:55.494131 containerd[1555]: time="2025-12-12T20:06:55.493892951Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:55.496054 containerd[1555]: time="2025-12-12T20:06:55.495972946Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 12 20:06:55.496424 containerd[1555]: time="2025-12-12T20:06:55.496080282Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 12 20:06:55.496580 kubelet[2887]: E1212 20:06:55.496407 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 20:06:55.496580 kubelet[2887]: E1212 20:06:55.496487 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 12 20:06:55.498368 kubelet[2887]: E1212 20:06:55.496674 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1f409ebb68b94cf2859e2ccda5082245,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:55.499765 containerd[1555]: time="2025-12-12T20:06:55.499730446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 20:06:55.801894 containerd[1555]: time="2025-12-12T20:06:55.801816962Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:55.803259 containerd[1555]: time="2025-12-12T20:06:55.803205729Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 20:06:55.803941 containerd[1555]: time="2025-12-12T20:06:55.803236212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 20:06:55.804027 kubelet[2887]: E1212 20:06:55.803608 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 20:06:55.804027 kubelet[2887]: E1212 20:06:55.803681 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 20:06:55.804027 kubelet[2887]: E1212 20:06:55.803849 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to 
resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:55.805453 kubelet[2887]: E1212 20:06:55.805382 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:06:55.877832 containerd[1555]: time="2025-12-12T20:06:55.877761879Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:06:56.212496 containerd[1555]: time="2025-12-12T20:06:56.212212398Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:56.214793 containerd[1555]: time="2025-12-12T20:06:56.214739205Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:06:56.214959 containerd[1555]: time="2025-12-12T20:06:56.214771066Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:06:56.215221 kubelet[2887]: E1212 20:06:56.215151 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:56.215495 kubelet[2887]: E1212 20:06:56.215251 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:06:56.215694 kubelet[2887]: E1212 20:06:56.215623 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctvmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-w9rs4_calico-apiserver(87f20b04-9afa-4c4b-849e-0456cd78dc55): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:56.216961 kubelet[2887]: E1212 20:06:56.216913 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:06:56.874651 containerd[1555]: time="2025-12-12T20:06:56.874441318Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 20:06:57.195633 containerd[1555]: time="2025-12-12T20:06:57.195385387Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:57.196922 containerd[1555]: time="2025-12-12T20:06:57.196826166Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 20:06:57.197708 containerd[1555]: time="2025-12-12T20:06:57.196864158Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 20:06:57.197787 kubelet[2887]: E1212 20:06:57.197254 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 20:06:57.197787 kubelet[2887]: E1212 20:06:57.197369 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 20:06:57.197787 kubelet[2887]: E1212 20:06:57.197579 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 20:06:57.201459 containerd[1555]: time="2025-12-12T20:06:57.201421274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 20:06:57.509059 containerd[1555]: time="2025-12-12T20:06:57.508851118Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:06:57.510342 containerd[1555]: time="2025-12-12T20:06:57.510239781Z" level=error 
msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 20:06:57.510474 containerd[1555]: time="2025-12-12T20:06:57.510321801Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 20:06:57.511277 kubelet[2887]: E1212 20:06:57.510831 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 20:06:57.511277 kubelet[2887]: E1212 20:06:57.510952 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 20:06:57.511277 kubelet[2887]: E1212 20:06:57.511142 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:06:57.512466 kubelet[2887]: E1212 20:06:57.512392 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:07:05.875014 kubelet[2887]: E1212 20:07:05.874914 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb"
Dec 12 20:07:06.874405 kubelet[2887]: E1212 20:07:06.874164 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27"
Dec 12 20:07:07.877561 kubelet[2887]: E1212 20:07:07.877473 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55"
Dec 12 20:07:07.880242 kubelet[2887]: E1212 20:07:07.879245 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc"
Dec 12 20:07:08.874434 kubelet[2887]: E1212 20:07:08.874331 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e"
Dec 12 20:07:09.873786 kubelet[2887]: E1212 20:07:09.873719 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849"
Dec 12 20:07:11.877324 kubelet[2887]: E1212 20:07:11.877089 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:07:15.680665 systemd[1]: Started sshd@9-10.244.19.234:22-147.75.109.163:56380.service - OpenSSH per-connection server daemon (147.75.109.163:56380).
Dec 12 20:07:16.678666 sshd[5092]: Accepted publickey for core from 147.75.109.163 port 56380 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:07:16.681910 sshd-session[5092]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:07:16.695576 systemd-logind[1528]: New session 12 of user core.
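Note on the failure pattern above: every pull in this log fails the same way. containerd asks ghcr.io to resolve ghcr.io/flatcar/calico/<component>:v3.30.4, the registry answers 404 Not Found, containerd returns ErrImagePull over the CRI, and on each subsequent pod sync the kubelet reports ImagePullBackOff while it backs off further retry attempts. A quick way to confirm that the tag is genuinely absent from the registry (rather than blocked by auth or networking on the node) is to query the registry's manifest endpoint directly. The following is a minimal sketch, assuming ghcr.io serves the standard OCI distribution API and issues anonymous bearer tokens for pulling public repositories; the repository and tag names are the ones failing in this log:

    # Minimal sketch: check whether a tag resolves on ghcr.io.
    # Assumes ghcr.io implements the standard OCI distribution API and
    # hands out anonymous pull tokens for public repositories.
    import json
    import urllib.error
    import urllib.request

    def ghcr_tag_exists(repo: str, tag: str) -> bool:
        # Step 1: fetch an anonymous bearer token scoped to pulling the repo.
        token_url = (
            "https://ghcr.io/token?service=ghcr.io"
            f"&scope=repository:{repo}:pull"
        )
        token = json.load(urllib.request.urlopen(token_url))["token"]
        # Step 2: HEAD the manifest; 200 means the tag resolves,
        # 404 means it does not exist in the registry.
        req = urllib.request.Request(
            f"https://ghcr.io/v2/{repo}/manifests/{tag}",
            headers={
                "Authorization": f"Bearer {token}",
                "Accept": ", ".join([
                    "application/vnd.oci.image.index.v1+json",
                    "application/vnd.docker.distribution.manifest.list.v2+json",
                    "application/vnd.oci.image.manifest.v1+json",
                ]),
            },
            method="HEAD",
        )
        try:
            urllib.request.urlopen(req)
            return True
        except urllib.error.HTTPError as exc:
            if exc.code == 404:
                return False
            raise  # surface auth/network problems instead of masking them as 404

    for component in ("apiserver", "goldmane", "kube-controllers", "whisker",
                      "whisker-backend", "csi", "node-driver-registrar"):
        repo = f"flatcar/calico/{component}"
        found = ghcr_tag_exists(repo, "v3.30.4")
        print(f"ghcr.io/{repo}:v3.30.4 -> {'found' if found else 'not found'}")

If the check reports not found, the fix lies on the publishing side (push or re-tag the missing images, or point the pod specs at a tag that exists); the kubelet's back-off loop cannot recover on its own.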
Dec 12 20:07:16.701634 systemd[1]: Started session-12.scope - Session 12 of User core.
Dec 12 20:07:18.029310 sshd[5095]: Connection closed by 147.75.109.163 port 56380
Dec 12 20:07:18.032765 sshd-session[5092]: pam_unix(sshd:session): session closed for user core
Dec 12 20:07:18.047568 systemd[1]: sshd@9-10.244.19.234:22-147.75.109.163:56380.service: Deactivated successfully.
Dec 12 20:07:18.047619 systemd-logind[1528]: Session 12 logged out. Waiting for processes to exit.
Dec 12 20:07:18.054107 systemd[1]: session-12.scope: Deactivated successfully.
Dec 12 20:07:18.059944 systemd-logind[1528]: Removed session 12.
Dec 12 20:07:18.875469 containerd[1555]: time="2025-12-12T20:07:18.875359136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 20:07:19.218941 containerd[1555]: time="2025-12-12T20:07:19.218203549Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:07:19.220342 containerd[1555]: time="2025-12-12T20:07:19.220035563Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 20:07:19.220342 containerd[1555]: time="2025-12-12T20:07:19.220181389Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 20:07:19.220834 kubelet[2887]: E1212 20:07:19.220707 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 20:07:19.221845 kubelet[2887]: E1212 20:07:19.220979 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 20:07:19.222478 containerd[1555]: time="2025-12-12T20:07:19.222335124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 20:07:19.232029 kubelet[2887]: E1212 20:07:19.230669 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1f409ebb68b94cf2859e2ccda5082245,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:19.570726 containerd[1555]: time="2025-12-12T20:07:19.570638128Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:19.572346 containerd[1555]: time="2025-12-12T20:07:19.572309969Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:07:19.572425 containerd[1555]: time="2025-12-12T20:07:19.572382152Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:07:19.573939 kubelet[2887]: E1212 20:07:19.572784 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:07:19.573939 kubelet[2887]: E1212 20:07:19.572881 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:07:19.573939 kubelet[2887]: E1212 20:07:19.573344 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpx95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7f7c49f9bf-2vbl6_calico-apiserver(dcaf5416-6748-4e19-9a64-70511b93ac27): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:19.574324 containerd[1555]: time="2025-12-12T20:07:19.573440322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 12 20:07:19.574521 kubelet[2887]: E1212 20:07:19.574479 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:07:19.879929 containerd[1555]: time="2025-12-12T20:07:19.879327496Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:19.882683 containerd[1555]: time="2025-12-12T20:07:19.882582297Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to 
resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 12 20:07:19.883363 containerd[1555]: time="2025-12-12T20:07:19.882693234Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 12 20:07:19.883428 kubelet[2887]: E1212 20:07:19.883024 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 20:07:19.883428 kubelet[2887]: E1212 20:07:19.883141 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 12 20:07:19.884312 kubelet[2887]: E1212 20:07:19.883745 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:19.885342 containerd[1555]: 
time="2025-12-12T20:07:19.884029605Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 12 20:07:19.885630 kubelet[2887]: E1212 20:07:19.885519 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:07:20.204404 containerd[1555]: time="2025-12-12T20:07:20.203935584Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:20.206314 containerd[1555]: time="2025-12-12T20:07:20.205613635Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 12 20:07:20.207308 kubelet[2887]: E1212 20:07:20.206951 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 20:07:20.207308 kubelet[2887]: E1212 20:07:20.207041 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 12 20:07:20.208480 kubelet[2887]: E1212 20:07:20.208411 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vw8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-59fc8f5456-7jq54_calico-system(0319e4c1-786f-47b5-99ee-05577b0ae0cb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:20.210344 kubelet[2887]: E1212 20:07:20.210277 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:07:20.230318 containerd[1555]: time="2025-12-12T20:07:20.206410811Z" level=info msg="stop pulling 
image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 12 20:07:20.230318 containerd[1555]: time="2025-12-12T20:07:20.210586926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:07:20.546774 containerd[1555]: time="2025-12-12T20:07:20.546595578Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:20.548902 containerd[1555]: time="2025-12-12T20:07:20.548846444Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:07:20.550327 containerd[1555]: time="2025-12-12T20:07:20.548946007Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:07:20.550408 kubelet[2887]: E1212 20:07:20.549172 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:07:20.550408 kubelet[2887]: E1212 20:07:20.549240 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:07:20.550408 kubelet[2887]: E1212 20:07:20.549714 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctvmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-w9rs4_calico-apiserver(87f20b04-9afa-4c4b-849e-0456cd78dc55): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:20.551381 kubelet[2887]: E1212 20:07:20.550845 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:07:20.875505 containerd[1555]: time="2025-12-12T20:07:20.875438664Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:07:21.301827 containerd[1555]: time="2025-12-12T20:07:21.301563761Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:21.305385 containerd[1555]: time="2025-12-12T20:07:21.305270453Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:07:21.306021 containerd[1555]: time="2025-12-12T20:07:21.305340771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:07:21.306215 kubelet[2887]: E1212 20:07:21.305883 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:07:21.306215 kubelet[2887]: E1212 20:07:21.306179 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:07:21.306699 kubelet[2887]: E1212 20:07:21.306624 
2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7qkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-666sk_calico-apiserver(62353628-af81-4f55-9563-ec2bb7f43849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:21.309566 kubelet[2887]: E1212 20:07:21.309506 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:07:23.187851 systemd[1]: Started sshd@10-10.244.19.234:22-147.75.109.163:54102.service - OpenSSH per-connection server daemon (147.75.109.163:54102). 
Dec 12 20:07:23.874793 containerd[1555]: time="2025-12-12T20:07:23.874740710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 12 20:07:24.134670 sshd[5112]: Accepted publickey for core from 147.75.109.163 port 54102 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:24.137942 sshd-session[5112]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:24.147856 systemd-logind[1528]: New session 13 of user core. Dec 12 20:07:24.154522 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 12 20:07:24.924499 sshd[5123]: Connection closed by 147.75.109.163 port 54102 Dec 12 20:07:24.926584 sshd-session[5112]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:24.935262 systemd[1]: sshd@10-10.244.19.234:22-147.75.109.163:54102.service: Deactivated successfully. Dec 12 20:07:24.944755 systemd[1]: session-13.scope: Deactivated successfully. Dec 12 20:07:24.948667 systemd-logind[1528]: Session 13 logged out. Waiting for processes to exit. Dec 12 20:07:24.952874 systemd-logind[1528]: Removed session 13. Dec 12 20:07:25.206023 containerd[1555]: time="2025-12-12T20:07:25.205806485Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:25.210840 containerd[1555]: time="2025-12-12T20:07:25.208889748Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 12 20:07:25.210840 containerd[1555]: time="2025-12-12T20:07:25.209007726Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 12 20:07:25.211022 kubelet[2887]: E1212 20:07:25.210486 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 20:07:25.211022 kubelet[2887]: E1212 20:07:25.210555 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 12 20:07:25.211022 kubelet[2887]: E1212 20:07:25.210749 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkfwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-22cbp_calico-system(b08daecd-0b61-450c-a526-5e7b591cde3e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:25.212339 kubelet[2887]: E1212 20:07:25.212051 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:07:25.875304 containerd[1555]: 
time="2025-12-12T20:07:25.875232273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 12 20:07:26.183348 containerd[1555]: time="2025-12-12T20:07:26.182870965Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:26.185660 containerd[1555]: time="2025-12-12T20:07:26.185618668Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 12 20:07:26.185913 containerd[1555]: time="2025-12-12T20:07:26.185884214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 12 20:07:26.187763 kubelet[2887]: E1212 20:07:26.186367 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 20:07:26.187763 kubelet[2887]: E1212 20:07:26.186457 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 12 20:07:26.187763 kubelet[2887]: E1212 20:07:26.186634 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:26.189182 containerd[1555]: time="2025-12-12T20:07:26.189133646Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 12 20:07:26.496720 containerd[1555]: time="2025-12-12T20:07:26.496130170Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:07:26.497894 containerd[1555]: time="2025-12-12T20:07:26.497355028Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 12 20:07:26.497894 containerd[1555]: time="2025-12-12T20:07:26.497439485Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 12 20:07:26.498100 kubelet[2887]: E1212 20:07:26.498042 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 20:07:26.498813 kubelet[2887]: E1212 20:07:26.498136 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 12 20:07:26.498813 kubelet[2887]: E1212 20:07:26.498687 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 12 20:07:26.500010 kubelet[2887]: E1212 20:07:26.499944 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab" Dec 12 20:07:30.091415 systemd[1]: Started sshd@11-10.244.19.234:22-147.75.109.163:54108.service - OpenSSH per-connection server daemon (147.75.109.163:54108). Dec 12 20:07:31.024325 sshd[5136]: Accepted publickey for core from 147.75.109.163 port 54108 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:31.026142 sshd-session[5136]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:31.037026 systemd-logind[1528]: New session 14 of user core. 
Dec 12 20:07:31.045498 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 12 20:07:31.797014 sshd[5139]: Connection closed by 147.75.109.163 port 54108 Dec 12 20:07:31.799548 sshd-session[5136]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:31.804978 systemd-logind[1528]: Session 14 logged out. Waiting for processes to exit. Dec 12 20:07:31.805805 systemd[1]: sshd@11-10.244.19.234:22-147.75.109.163:54108.service: Deactivated successfully. Dec 12 20:07:31.810630 systemd[1]: session-14.scope: Deactivated successfully. Dec 12 20:07:31.816007 systemd-logind[1528]: Removed session 14. Dec 12 20:07:31.876505 kubelet[2887]: E1212 20:07:31.876406 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:07:31.878401 kubelet[2887]: E1212 20:07:31.876895 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:07:31.956646 systemd[1]: Started sshd@12-10.244.19.234:22-147.75.109.163:54124.service - OpenSSH per-connection server daemon (147.75.109.163:54124). Dec 12 20:07:32.874573 kubelet[2887]: E1212 20:07:32.874503 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:07:32.905931 sshd[5152]: Accepted publickey for core from 147.75.109.163 port 54124 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:32.908255 sshd-session[5152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:32.921108 systemd-logind[1528]: New session 15 of user core. 
Dec 12 20:07:32.928587 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 12 20:07:33.746456 sshd[5155]: Connection closed by 147.75.109.163 port 54124 Dec 12 20:07:33.746310 sshd-session[5152]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:33.753532 systemd[1]: sshd@12-10.244.19.234:22-147.75.109.163:54124.service: Deactivated successfully. Dec 12 20:07:33.758867 systemd[1]: session-15.scope: Deactivated successfully. Dec 12 20:07:33.763073 systemd-logind[1528]: Session 15 logged out. Waiting for processes to exit. Dec 12 20:07:33.766353 systemd-logind[1528]: Removed session 15. Dec 12 20:07:33.877572 kubelet[2887]: E1212 20:07:33.876590 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:07:33.908860 systemd[1]: Started sshd@13-10.244.19.234:22-147.75.109.163:41324.service - OpenSSH per-connection server daemon (147.75.109.163:41324). Dec 12 20:07:34.866356 sshd[5166]: Accepted publickey for core from 147.75.109.163 port 41324 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:34.867440 sshd-session[5166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:34.878517 systemd-logind[1528]: New session 16 of user core. Dec 12 20:07:34.885512 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 12 20:07:35.653575 sshd[5169]: Connection closed by 147.75.109.163 port 41324 Dec 12 20:07:35.654706 sshd-session[5166]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:35.662761 systemd-logind[1528]: Session 16 logged out. Waiting for processes to exit. Dec 12 20:07:35.662991 systemd[1]: sshd@13-10.244.19.234:22-147.75.109.163:41324.service: Deactivated successfully. Dec 12 20:07:35.666933 systemd[1]: session-16.scope: Deactivated successfully. Dec 12 20:07:35.673750 systemd-logind[1528]: Removed session 16. 
Dec 12 20:07:36.874854 kubelet[2887]: E1212 20:07:36.874299 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:07:36.876699 kubelet[2887]: E1212 20:07:36.876622 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab" Dec 12 20:07:37.874454 kubelet[2887]: E1212 20:07:37.874392 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:07:40.889783 systemd[1]: Started sshd@14-10.244.19.234:22-147.75.109.163:41332.service - OpenSSH per-connection server daemon (147.75.109.163:41332). Dec 12 20:07:42.041479 sshd[5212]: Accepted publickey for core from 147.75.109.163 port 41332 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:42.043636 sshd-session[5212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:42.054416 systemd-logind[1528]: New session 17 of user core. Dec 12 20:07:42.060554 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 12 20:07:42.982316 sshd[5217]: Connection closed by 147.75.109.163 port 41332 Dec 12 20:07:42.982818 sshd-session[5212]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:42.991220 systemd[1]: sshd@14-10.244.19.234:22-147.75.109.163:41332.service: Deactivated successfully. Dec 12 20:07:42.992373 systemd-logind[1528]: Session 17 logged out. Waiting for processes to exit. Dec 12 20:07:42.998507 systemd[1]: session-17.scope: Deactivated successfully. Dec 12 20:07:43.003528 systemd-logind[1528]: Removed session 17. 
Dec 12 20:07:43.875862 kubelet[2887]: E1212 20:07:43.875612 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:07:45.878263 kubelet[2887]: E1212 20:07:45.878196 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:07:45.879888 kubelet[2887]: E1212 20:07:45.879009 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:07:46.873460 kubelet[2887]: E1212 20:07:46.873342 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:07:48.105621 systemd[1]: Started sshd@15-10.244.19.234:22-147.75.109.163:57428.service - OpenSSH per-connection server daemon (147.75.109.163:57428). Dec 12 20:07:49.066183 sshd[5229]: Accepted publickey for core from 147.75.109.163 port 57428 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:49.068663 sshd-session[5229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:49.080092 systemd-logind[1528]: New session 18 of user core. 
Dec 12 20:07:49.086522 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 12 20:07:49.884257 kubelet[2887]: E1212 20:07:49.883718 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e" Dec 12 20:07:49.951500 sshd[5232]: Connection closed by 147.75.109.163 port 57428 Dec 12 20:07:49.961210 sshd-session[5229]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:49.971438 systemd[1]: sshd@15-10.244.19.234:22-147.75.109.163:57428.service: Deactivated successfully. Dec 12 20:07:49.977018 systemd[1]: session-18.scope: Deactivated successfully. Dec 12 20:07:49.979726 systemd-logind[1528]: Session 18 logged out. Waiting for processes to exit. Dec 12 20:07:49.983728 systemd-logind[1528]: Removed session 18. Dec 12 20:07:50.876975 kubelet[2887]: E1212 20:07:50.876546 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849" Dec 12 20:07:50.877845 kubelet[2887]: E1212 20:07:50.877449 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab" Dec 12 20:07:54.873105 kubelet[2887]: E1212 20:07:54.873029 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb" Dec 12 20:07:55.114802 systemd[1]: Started sshd@16-10.244.19.234:22-147.75.109.163:43818.service - OpenSSH per-connection server daemon (147.75.109.163:43818). Dec 12 20:07:56.084334 sshd[5246]: Accepted publickey for core from 147.75.109.163 port 43818 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:56.086487 sshd-session[5246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:56.096502 systemd-logind[1528]: New session 19 of user core. Dec 12 20:07:56.101708 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 12 20:07:56.947321 sshd[5249]: Connection closed by 147.75.109.163 port 43818 Dec 12 20:07:56.948403 sshd-session[5246]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:56.955790 systemd[1]: sshd@16-10.244.19.234:22-147.75.109.163:43818.service: Deactivated successfully. Dec 12 20:07:56.959612 systemd[1]: session-19.scope: Deactivated successfully. Dec 12 20:07:56.961826 systemd-logind[1528]: Session 19 logged out. Waiting for processes to exit. Dec 12 20:07:56.966055 systemd-logind[1528]: Removed session 19. Dec 12 20:07:57.101158 systemd[1]: Started sshd@17-10.244.19.234:22-147.75.109.163:43832.service - OpenSSH per-connection server daemon (147.75.109.163:43832). Dec 12 20:07:57.877704 kubelet[2887]: E1212 20:07:57.877457 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc" Dec 12 20:07:58.048472 sshd[5261]: Accepted publickey for core from 147.75.109.163 port 43832 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:07:58.050749 sshd-session[5261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:07:58.059714 systemd-logind[1528]: New session 20 of user core. Dec 12 20:07:58.067519 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 12 20:07:59.038006 sshd[5264]: Connection closed by 147.75.109.163 port 43832 Dec 12 20:07:59.039125 sshd-session[5261]: pam_unix(sshd:session): session closed for user core Dec 12 20:07:59.050075 systemd[1]: sshd@17-10.244.19.234:22-147.75.109.163:43832.service: Deactivated successfully. Dec 12 20:07:59.054024 systemd[1]: session-20.scope: Deactivated successfully. Dec 12 20:07:59.058546 systemd-logind[1528]: Session 20 logged out. Waiting for processes to exit. Dec 12 20:07:59.063128 systemd-logind[1528]: Removed session 20. 
Dec 12 20:07:59.228225 systemd[1]: Started sshd@18-10.244.19.234:22-147.75.109.163:43840.service - OpenSSH per-connection server daemon (147.75.109.163:43840). Dec 12 20:07:59.874932 kubelet[2887]: E1212 20:07:59.874692 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55" Dec 12 20:08:00.280906 sshd[5274]: Accepted publickey for core from 147.75.109.163 port 43840 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo Dec 12 20:08:00.284572 sshd-session[5274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 12 20:08:00.293963 systemd-logind[1528]: New session 21 of user core. Dec 12 20:08:00.301580 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 12 20:08:00.875798 containerd[1555]: time="2025-12-12T20:08:00.875392986Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 12 20:08:01.242666 containerd[1555]: time="2025-12-12T20:08:01.242155281Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 12 20:08:01.244402 containerd[1555]: time="2025-12-12T20:08:01.244327620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 12 20:08:01.244402 containerd[1555]: time="2025-12-12T20:08:01.244334207Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 12 20:08:01.244710 kubelet[2887]: E1212 20:08:01.244635 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:08:01.245445 kubelet[2887]: E1212 20:08:01.244731 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 12 20:08:01.245445 kubelet[2887]: E1212 20:08:01.244938 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpx95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-7f7c49f9bf-2vbl6_calico-apiserver(dcaf5416-6748-4e19-9a64-70511b93ac27): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 12 20:08:01.247145 kubelet[2887]: E1212 20:08:01.247082 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27" Dec 12 20:08:02.317330 sshd[5277]: Connection closed by 147.75.109.163 port 43840 Dec 12 20:08:02.317469 sshd-session[5274]: pam_unix(sshd:session): session closed for user core Dec 12 20:08:02.326970 systemd[1]: sshd@18-10.244.19.234:22-147.75.109.163:43840.service: Deactivated successfully. Dec 12 20:08:02.331138 systemd[1]: session-21.scope: Deactivated successfully. Dec 12 20:08:02.333572 systemd-logind[1528]: Session 21 logged out. Waiting for processes to exit. Dec 12 20:08:02.337942 systemd-logind[1528]: Removed session 21. Dec 12 20:08:02.464455 systemd[1]: Started sshd@19-10.244.19.234:22-147.75.109.163:43726.service - OpenSSH per-connection server daemon (147.75.109.163:43726). 
Dec 12 20:08:02.875595 kubelet[2887]: E1212 20:08:02.875515 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:08:03.425468 sshd[5295]: Accepted publickey for core from 147.75.109.163 port 43726 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:08:03.429672 sshd-session[5295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:08:03.441018 systemd-logind[1528]: New session 22 of user core.
Dec 12 20:08:03.449523 systemd[1]: Started session-22.scope - Session 22 of User core.
Dec 12 20:08:03.874987 kubelet[2887]: E1212 20:08:03.874878 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e"
Dec 12 20:08:04.534663 sshd[5300]: Connection closed by 147.75.109.163 port 43726
Dec 12 20:08:04.537196 sshd-session[5295]: pam_unix(sshd:session): session closed for user core
Dec 12 20:08:04.545273 systemd[1]: sshd@19-10.244.19.234:22-147.75.109.163:43726.service: Deactivated successfully.
Dec 12 20:08:04.547246 systemd-logind[1528]: Session 22 logged out. Waiting for processes to exit.
Dec 12 20:08:04.553136 systemd[1]: session-22.scope: Deactivated successfully.
Dec 12 20:08:04.559775 systemd-logind[1528]: Removed session 22.
Dec 12 20:08:04.699557 systemd[1]: Started sshd@20-10.244.19.234:22-147.75.109.163:43742.service - OpenSSH per-connection server daemon (147.75.109.163:43742).
Dec 12 20:08:04.878036 containerd[1555]: time="2025-12-12T20:08:04.877954706Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 20:08:05.186684 containerd[1555]: time="2025-12-12T20:08:05.186240090Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:05.188082 containerd[1555]: time="2025-12-12T20:08:05.188024255Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 20:08:05.188253 containerd[1555]: time="2025-12-12T20:08:05.188178875Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 20:08:05.190275 kubelet[2887]: E1212 20:08:05.189126 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 20:08:05.190275 kubelet[2887]: E1212 20:08:05.189230 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 20:08:05.190275 kubelet[2887]: E1212 20:08:05.189499 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7qkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-666sk_calico-apiserver(62353628-af81-4f55-9563-ec2bb7f43849): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:05.192544 kubelet[2887]: E1212 20:08:05.192492 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849"
Dec 12 20:08:05.662879 sshd[5318]: Accepted publickey for core from 147.75.109.163 port 43742 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:08:05.666417 sshd-session[5318]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:08:05.677445 systemd-logind[1528]: New session 23 of user core.
Dec 12 20:08:05.685607 systemd[1]: Started session-23.scope - Session 23 of User core.
Dec 12 20:08:06.426041 sshd[5322]: Connection closed by 147.75.109.163 port 43742
Dec 12 20:08:06.428644 sshd-session[5318]: pam_unix(sshd:session): session closed for user core
Dec 12 20:08:06.436797 systemd-logind[1528]: Session 23 logged out. Waiting for processes to exit.
Dec 12 20:08:06.439594 systemd[1]: sshd@20-10.244.19.234:22-147.75.109.163:43742.service: Deactivated successfully.
Dec 12 20:08:06.444825 systemd[1]: session-23.scope: Deactivated successfully.
Dec 12 20:08:06.454945 systemd-logind[1528]: Removed session 23.
Dec 12 20:08:07.875777 containerd[1555]: time="2025-12-12T20:08:07.875257035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\""
Dec 12 20:08:08.184820 containerd[1555]: time="2025-12-12T20:08:08.184607492Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:08.188827 containerd[1555]: time="2025-12-12T20:08:08.188748889Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found"
Dec 12 20:08:08.188971 containerd[1555]: time="2025-12-12T20:08:08.188904634Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85"
Dec 12 20:08:08.190649 kubelet[2887]: E1212 20:08:08.189274 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 20:08:08.191502 kubelet[2887]: E1212 20:08:08.190687 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4"
Dec 12 20:08:08.191997 kubelet[2887]: E1212 20:08:08.191792 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vw8ps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-59fc8f5456-7jq54_calico-system(0319e4c1-786f-47b5-99ee-05577b0ae0cb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:08.193451 kubelet[2887]: E1212 20:08:08.193374 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb"
Dec 12 20:08:09.876515 containerd[1555]: time="2025-12-12T20:08:09.874841412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Dec 12 20:08:10.204783 containerd[1555]: time="2025-12-12T20:08:10.204546352Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:10.208217 containerd[1555]: time="2025-12-12T20:08:10.208023603Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Dec 12 20:08:10.208217 containerd[1555]: time="2025-12-12T20:08:10.208171383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73"
Dec 12 20:08:10.208734 kubelet[2887]: E1212 20:08:10.208659 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 20:08:10.210880 kubelet[2887]: E1212 20:08:10.208887 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Dec 12 20:08:10.210880 kubelet[2887]: E1212 20:08:10.209129 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:1f409ebb68b94cf2859e2ccda5082245,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:10.221613 containerd[1555]: time="2025-12-12T20:08:10.221020960Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\""
Dec 12 20:08:10.550363 containerd[1555]: time="2025-12-12T20:08:10.550272175Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:10.551853 containerd[1555]: time="2025-12-12T20:08:10.551757656Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found"
Dec 12 20:08:10.551853 containerd[1555]: time="2025-12-12T20:08:10.551808762Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85"
Dec 12 20:08:10.553639 kubelet[2887]: E1212 20:08:10.553558 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 20:08:10.553746 kubelet[2887]: E1212 20:08:10.553666 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4"
Dec 12 20:08:10.553934 kubelet[2887]: E1212 20:08:10.553875 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-txd5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-779d6bdfc4-bv9cs_calico-system(2fcf8e8b-a7d2-49df-8e20-83c1153edafc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:10.555752 kubelet[2887]: E1212 20:08:10.555709 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc"
Dec 12 20:08:10.876807 containerd[1555]: time="2025-12-12T20:08:10.876545883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 12 20:08:11.183535 containerd[1555]: time="2025-12-12T20:08:11.183229813Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:11.185159 containerd[1555]: time="2025-12-12T20:08:11.184997305Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 12 20:08:11.185159 containerd[1555]: time="2025-12-12T20:08:11.185053221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77"
Dec 12 20:08:11.186254 kubelet[2887]: E1212 20:08:11.185563 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 20:08:11.186254 kubelet[2887]: E1212 20:08:11.185714 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 12 20:08:11.186254 kubelet[2887]: E1212 20:08:11.185944 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctvmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-558bb796c-w9rs4_calico-apiserver(87f20b04-9afa-4c4b-849e-0456cd78dc55): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:11.187684 kubelet[2887]: E1212 20:08:11.187486 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-w9rs4" podUID="87f20b04-9afa-4c4b-849e-0456cd78dc55"
Dec 12 20:08:11.584888 systemd[1]: Started sshd@21-10.244.19.234:22-147.75.109.163:43748.service - OpenSSH per-connection server daemon (147.75.109.163:43748).
Dec 12 20:08:12.550706 sshd[5364]: Accepted publickey for core from 147.75.109.163 port 43748 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:08:12.554728 sshd-session[5364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:08:12.567363 systemd-logind[1528]: New session 24 of user core.
Dec 12 20:08:12.574084 systemd[1]: Started session-24.scope - Session 24 of User core.
Dec 12 20:08:13.403963 sshd[5367]: Connection closed by 147.75.109.163 port 43748
Dec 12 20:08:13.403701 sshd-session[5364]: pam_unix(sshd:session): session closed for user core
Dec 12 20:08:13.414559 systemd-logind[1528]: Session 24 logged out. Waiting for processes to exit.
Dec 12 20:08:13.415866 systemd[1]: sshd@21-10.244.19.234:22-147.75.109.163:43748.service: Deactivated successfully.
Dec 12 20:08:13.421949 systemd[1]: session-24.scope: Deactivated successfully.
Dec 12 20:08:13.427001 systemd-logind[1528]: Removed session 24.
Dec 12 20:08:15.880397 kubelet[2887]: E1212 20:08:15.880314 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-7f7c49f9bf-2vbl6" podUID="dcaf5416-6748-4e19-9a64-70511b93ac27"
Dec 12 20:08:15.884404 containerd[1555]: time="2025-12-12T20:08:15.884353253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\""
Dec 12 20:08:16.198284 containerd[1555]: time="2025-12-12T20:08:16.197950940Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:16.199671 containerd[1555]: time="2025-12-12T20:08:16.199620982Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found"
Dec 12 20:08:16.199783 containerd[1555]: time="2025-12-12T20:08:16.199758907Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77"
Dec 12 20:08:16.200741 kubelet[2887]: E1212 20:08:16.200596 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 20:08:16.200993 kubelet[2887]: E1212 20:08:16.200901 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4"
Dec 12 20:08:16.201599 kubelet[2887]: E1212 20:08:16.201475 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkfwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-22cbp_calico-system(b08daecd-0b61-450c-a526-5e7b591cde3e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:16.202983 kubelet[2887]: E1212 20:08:16.202815 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-22cbp" podUID="b08daecd-0b61-450c-a526-5e7b591cde3e"
Dec 12 20:08:16.875195 kubelet[2887]: E1212 20:08:16.875128 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-558bb796c-666sk" podUID="62353628-af81-4f55-9563-ec2bb7f43849"
Dec 12 20:08:17.878740 containerd[1555]: time="2025-12-12T20:08:17.878627000Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\""
Dec 12 20:08:18.196640 containerd[1555]: time="2025-12-12T20:08:18.196007425Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:18.197455 containerd[1555]: time="2025-12-12T20:08:18.197384894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found"
Dec 12 20:08:18.197538 containerd[1555]: time="2025-12-12T20:08:18.197500282Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69"
Dec 12 20:08:18.198438 kubelet[2887]: E1212 20:08:18.198361 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 20:08:18.199251 kubelet[2887]: E1212 20:08:18.198460 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4"
Dec 12 20:08:18.199251 kubelet[2887]: E1212 20:08:18.198720 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:18.201930 containerd[1555]: time="2025-12-12T20:08:18.201824147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\""
Dec 12 20:08:18.518032 containerd[1555]: time="2025-12-12T20:08:18.517806216Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 12 20:08:18.519342 containerd[1555]: time="2025-12-12T20:08:18.519275903Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found"
Dec 12 20:08:18.519514 containerd[1555]: time="2025-12-12T20:08:18.519319198Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93"
Dec 12 20:08:18.521308 kubelet[2887]: E1212 20:08:18.519835 2887 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 20:08:18.521501 kubelet[2887]: E1212 20:08:18.521466 2887 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 12 20:08:18.521914 kubelet[2887]: E1212 20:08:18.521848 2887 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzr2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-fwp5m_calico-system(2a05762a-3b72-4df9-aa17-753debf16cab): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 12 20:08:18.523443 kubelet[2887]: E1212 20:08:18.523383 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-fwp5m" podUID="2a05762a-3b72-4df9-aa17-753debf16cab"
Dec 12 20:08:18.566415 systemd[1]: Started sshd@22-10.244.19.234:22-147.75.109.163:37994.service - OpenSSH per-connection server daemon (147.75.109.163:37994).
Dec 12 20:08:18.874756 kubelet[2887]: E1212 20:08:18.874176 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-59fc8f5456-7jq54" podUID="0319e4c1-786f-47b5-99ee-05577b0ae0cb"
Dec 12 20:08:19.534815 sshd[5393]: Accepted publickey for core from 147.75.109.163 port 37994 ssh2: RSA SHA256:dtGVIBmi5GBDDRXWMHOUdZ7AMlcejJgaHwElsZPMiqo
Dec 12 20:08:19.540103 sshd-session[5393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 12 20:08:19.552951 systemd-logind[1528]: New session 25 of user core.
Dec 12 20:08:19.557572 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 12 20:08:20.297597 sshd[5396]: Connection closed by 147.75.109.163 port 37994
Dec 12 20:08:20.300667 sshd-session[5393]: pam_unix(sshd:session): session closed for user core
Dec 12 20:08:20.309193 systemd-logind[1528]: Session 25 logged out. Waiting for processes to exit.
Dec 12 20:08:20.309810 systemd[1]: sshd@22-10.244.19.234:22-147.75.109.163:37994.service: Deactivated successfully.
Dec 12 20:08:20.316510 systemd[1]: session-25.scope: Deactivated successfully.
Dec 12 20:08:20.322359 systemd-logind[1528]: Removed session 25.
Dec 12 20:08:23.879037 kubelet[2887]: E1212 20:08:23.878940 2887 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-779d6bdfc4-bv9cs" podUID="2fcf8e8b-a7d2-49df-8e20-83c1153edafc"