Dec 16 13:01:17.840753 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Dec 12 15:21:28 -00 2025
Dec 16 13:01:17.840771 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 13:01:17.840780 kernel: BIOS-provided physical RAM map:
Dec 16 13:01:17.840786 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 16 13:01:17.840790 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 16 13:01:17.840795 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 16 13:01:17.840801 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Dec 16 13:01:17.840821 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Dec 16 13:01:17.840826 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Dec 16 13:01:17.840833 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Dec 16 13:01:17.840838 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 16 13:01:17.840843 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 16 13:01:17.840847 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Dec 16 13:01:17.840852 kernel: NX (Execute Disable) protection: active
Dec 16 13:01:17.840858 kernel: APIC: Static calls initialized
Dec 16 13:01:17.840865 kernel: SMBIOS 3.0.0 present.
Dec 16 13:01:17.840870 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Dec 16 13:01:17.840875 kernel: DMI: Memory slots populated: 1/1
Dec 16 13:01:17.840880 kernel: Hypervisor detected: KVM
Dec 16 13:01:17.840885 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Dec 16 13:01:17.840891 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 16 13:01:17.840896 kernel: kvm-clock: using sched offset of 4566372955 cycles
Dec 16 13:01:17.840901 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 16 13:01:17.840907 kernel: tsc: Detected 2445.406 MHz processor
Dec 16 13:01:17.840912 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 16 13:01:17.840919 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 16 13:01:17.840924 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Dec 16 13:01:17.840930 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 16 13:01:17.840935 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 16 13:01:17.840940 kernel: Using GB pages for direct mapping
Dec 16 13:01:17.840945 kernel: ACPI: Early table checksum verification disabled
Dec 16 13:01:17.840951 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Dec 16 13:01:17.840956 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.840961 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.840968 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.840973 kernel: ACPI: FACS 0x000000007CFE0000 000040
Dec 16 13:01:17.840978 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.840984 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.840989 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.840994 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 13:01:17.841002 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Dec 16 13:01:17.841008 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Dec 16 13:01:17.841014 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Dec 16 13:01:17.841019 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Dec 16 13:01:17.841025 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Dec 16 13:01:17.841030 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Dec 16 13:01:17.841036 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Dec 16 13:01:17.841041 kernel: No NUMA configuration found
Dec 16 13:01:17.841048 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Dec 16 13:01:17.841054 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Dec 16 13:01:17.841059 kernel: Zone ranges:
Dec 16 13:01:17.841065 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 16 13:01:17.841070 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Dec 16 13:01:17.841076 kernel: Normal empty
Dec 16 13:01:17.841081 kernel: Device empty
Dec 16 13:01:17.841086 kernel: Movable zone start for each node
Dec 16 13:01:17.841092 kernel: Early memory node ranges
Dec 16 13:01:17.841097 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Dec 16 13:01:17.841104 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Dec 16 13:01:17.841109 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Dec 16 13:01:17.841115 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 16 13:01:17.841120 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 16 13:01:17.841126 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Dec 16 13:01:17.841131 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 16 13:01:17.841137 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 16 13:01:17.841142 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 16 13:01:17.841148 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 16 13:01:17.841154 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 16 13:01:17.841160 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 16 13:01:17.841179 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 16 13:01:17.841184 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 16 13:01:17.841190 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 16 13:01:17.841195 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Dec 16 13:01:17.841201 kernel: CPU topo: Max. logical packages: 1
Dec 16 13:01:17.841206 kernel: CPU topo: Max. logical dies: 1
Dec 16 13:01:17.841212 kernel: CPU topo: Max. dies per package: 1
Dec 16 13:01:17.841218 kernel: CPU topo: Max. threads per core: 1
Dec 16 13:01:17.841224 kernel: CPU topo: Num. cores per package: 2
Dec 16 13:01:17.841229 kernel: CPU topo: Num. threads per package: 2
Dec 16 13:01:17.841235 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Dec 16 13:01:17.841240 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 16 13:01:17.841245 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Dec 16 13:01:17.841251 kernel: Booting paravirtualized kernel on KVM
Dec 16 13:01:17.841257 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 16 13:01:17.841262 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Dec 16 13:01:17.841268 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Dec 16 13:01:17.841275 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Dec 16 13:01:17.841280 kernel: pcpu-alloc: [0] 0 1
Dec 16 13:01:17.841285 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 16 13:01:17.841292 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 13:01:17.841298 kernel: random: crng init done
Dec 16 13:01:17.841303 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 16 13:01:17.841309 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 16 13:01:17.841314 kernel: Fallback order for Node 0: 0
Dec 16 13:01:17.841321 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Dec 16 13:01:17.841326 kernel: Policy zone: DMA32
Dec 16 13:01:17.841332 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 13:01:17.841337 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Dec 16 13:01:17.841343 kernel: ftrace: allocating 40103 entries in 157 pages
Dec 16 13:01:17.841348 kernel: ftrace: allocated 157 pages with 5 groups
Dec 16 13:01:17.841354 kernel: Dynamic Preempt: voluntary
Dec 16 13:01:17.841359 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 13:01:17.841366 kernel: rcu: RCU event tracing is enabled.
Dec 16 13:01:17.841373 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Dec 16 13:01:17.841378 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 13:01:17.841384 kernel: Rude variant of Tasks RCU enabled.
Dec 16 13:01:17.841389 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 13:01:17.841395 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 13:01:17.841401 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Dec 16 13:01:17.841406 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 16 13:01:17.841412 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 16 13:01:17.841418 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Dec 16 13:01:17.841424 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Dec 16 13:01:17.841430 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 13:01:17.841435 kernel: Console: colour VGA+ 80x25
Dec 16 13:01:17.841441 kernel: printk: legacy console [tty0] enabled
Dec 16 13:01:17.841446 kernel: printk: legacy console [ttyS0] enabled
Dec 16 13:01:17.841452 kernel: ACPI: Core revision 20240827
Dec 16 13:01:17.841462 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Dec 16 13:01:17.841468 kernel: APIC: Switch to symmetric I/O mode setup
Dec 16 13:01:17.841474 kernel: x2apic enabled
Dec 16 13:01:17.841480 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 16 13:01:17.841486 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Dec 16 13:01:17.841492 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Dec 16 13:01:17.841499 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Dec 16 13:01:17.841505 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 16 13:01:17.841511 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 16 13:01:17.841516 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 16 13:01:17.841522 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 16 13:01:17.841529 kernel: Spectre V2 : Mitigation: Retpolines
Dec 16 13:01:17.841535 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 16 13:01:17.841541 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 16 13:01:17.841546 kernel: active return thunk: retbleed_return_thunk
Dec 16 13:01:17.841552 kernel: RETBleed: Mitigation: untrained return thunk
Dec 16 13:01:17.841558 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 16 13:01:17.841564 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 16 13:01:17.841570 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 16 13:01:17.841576 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 16 13:01:17.841582 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 16 13:01:17.841588 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 16 13:01:17.841594 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 16 13:01:17.841600 kernel: Freeing SMP alternatives memory: 32K
Dec 16 13:01:17.841605 kernel: pid_max: default: 32768 minimum: 301
Dec 16 13:01:17.841611 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 13:01:17.841617 kernel: landlock: Up and running.
Dec 16 13:01:17.841623 kernel: SELinux: Initializing.
Dec 16 13:01:17.841628 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 16 13:01:17.841635 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Dec 16 13:01:17.841641 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 16 13:01:17.841647 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 16 13:01:17.841653 kernel: ... version:                0
Dec 16 13:01:17.841659 kernel: ... bit width:              48
Dec 16 13:01:17.841664 kernel: ... generic registers:      6
Dec 16 13:01:17.841670 kernel: ... value mask:             0000ffffffffffff
Dec 16 13:01:17.841676 kernel: ... max period:             00007fffffffffff
Dec 16 13:01:17.841681 kernel: ... fixed-purpose events:   0
Dec 16 13:01:17.841688 kernel: ... event mask:             000000000000003f
Dec 16 13:01:17.841694 kernel: signal: max sigframe size: 1776
Dec 16 13:01:17.841699 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 13:01:17.841705 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 13:01:17.841711 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 13:01:17.841717 kernel: smp: Bringing up secondary CPUs ...
Dec 16 13:01:17.841723 kernel: smpboot: x86: Booting SMP configuration:
Dec 16 13:01:17.841728 kernel: .... node  #0, CPUs:      #1
Dec 16 13:01:17.841734 kernel: smp: Brought up 1 node, 2 CPUs
Dec 16 13:01:17.841741 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Dec 16 13:01:17.841747 kernel: Memory: 1909588K/2047464K available (14336K kernel code, 2444K rwdata, 26064K rodata, 46188K init, 2572K bss, 133332K reserved, 0K cma-reserved)
Dec 16 13:01:17.841753 kernel: devtmpfs: initialized
Dec 16 13:01:17.841759 kernel: x86/mm: Memory block size: 128MB
Dec 16 13:01:17.841764 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 13:01:17.841770 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Dec 16 13:01:17.841776 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 13:01:17.841782 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 13:01:17.841787 kernel: audit: initializing netlink subsys (disabled)
Dec 16 13:01:17.841794 kernel: audit: type=2000 audit(1765890075.136:1): state=initialized audit_enabled=0 res=1
Dec 16 13:01:17.841802 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 13:01:17.841827 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 16 13:01:17.841839 kernel: cpuidle: using governor menu
Dec 16 13:01:17.841846 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 13:01:17.841852 kernel: dca service started, version 1.12.1
Dec 16 13:01:17.841858 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Dec 16 13:01:17.841864 kernel: PCI: Using configuration type 1 for base access
Dec 16 13:01:17.841870 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 16 13:01:17.841878 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 13:01:17.841884 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 13:01:17.841890 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 13:01:17.841895 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 13:01:17.841901 kernel: ACPI: Added _OSI(Module Device)
Dec 16 13:01:17.841907 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 13:01:17.841913 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 13:01:17.841918 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 13:01:17.841924 kernel: ACPI: Interpreter enabled
Dec 16 13:01:17.841931 kernel: ACPI: PM: (supports S0 S5)
Dec 16 13:01:17.841937 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 16 13:01:17.841943 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 16 13:01:17.841948 kernel: PCI: Using E820 reservations for host bridge windows
Dec 16 13:01:17.841954 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 16 13:01:17.841960 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 13:01:17.842067 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 13:01:17.842135 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Dec 16 13:01:17.842216 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Dec 16 13:01:17.842232 kernel: PCI host bridge to bus 0000:00
Dec 16 13:01:17.842315 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 16 13:01:17.842383 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 16 13:01:17.842450 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 16 13:01:17.842502 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Dec 16 13:01:17.842557 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 16 13:01:17.842608 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Dec 16 13:01:17.842659 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 13:01:17.842729 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 16 13:01:17.842825 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 16 13:01:17.842894 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Dec 16 13:01:17.842954 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Dec 16 13:01:17.843017 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Dec 16 13:01:17.843074 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Dec 16 13:01:17.843131 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 16 13:01:17.843213 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.843275 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Dec 16 13:01:17.843334 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 16 13:01:17.843395 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 16 13:01:17.843452 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Dec 16 13:01:17.843519 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.843599 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Dec 16 13:01:17.843660 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 16 13:01:17.843718 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 16 13:01:17.843775 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 13:01:17.843874 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.843937 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Dec 16 13:01:17.843996 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 16 13:01:17.844053 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 16 13:01:17.844110 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 13:01:17.844194 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.844276 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Dec 16 13:01:17.844342 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 16 13:01:17.844400 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 16 13:01:17.844458 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 13:01:17.844539 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.844610 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Dec 16 13:01:17.844679 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 16 13:01:17.844738 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 16 13:01:17.844796 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 13:01:17.844893 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.844952 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Dec 16 13:01:17.845008 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 16 13:01:17.845065 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Dec 16 13:01:17.845121 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 13:01:17.845205 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.845266 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Dec 16 13:01:17.845328 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 16 13:01:17.845385 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 16 13:01:17.845441 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 13:01:17.845504 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.845561 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Dec 16 13:01:17.845617 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 16 13:01:17.845677 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Dec 16 13:01:17.845734 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 13:01:17.845797 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 13:01:17.845875 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Dec 16 13:01:17.845933 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 16 13:01:17.845990 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 16 13:01:17.846048 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 13:01:17.846120 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 16 13:01:17.846196 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 16 13:01:17.846261 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 16 13:01:17.846319 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Dec 16 13:01:17.846375 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Dec 16 13:01:17.846437 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 16 13:01:17.846494 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Dec 16 13:01:17.846564 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 13:01:17.846625 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Dec 16 13:01:17.846684 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Dec 16 13:01:17.846745 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Dec 16 13:01:17.846817 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 16 13:01:17.846889 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 13:01:17.846954 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Dec 16 13:01:17.847013 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 16 13:01:17.847081 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Dec 16 13:01:17.847141 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Dec 16 13:01:17.847219 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Dec 16 13:01:17.847280 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 16 13:01:17.847347 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 13:01:17.847414 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Dec 16 13:01:17.847512 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 16 13:01:17.847585 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 13:01:17.847646 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff]
Dec 16 13:01:17.847705 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Dec 16 13:01:17.847763 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 16 13:01:17.847860 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Dec 16 13:01:17.847930 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Dec 16 13:01:17.847991 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Dec 16 13:01:17.848048 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 16 13:01:17.848057 kernel: acpiphp: Slot [0] registered
Dec 16 13:01:17.848124 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 13:01:17.848203 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Dec 16 13:01:17.848268 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Dec 16 13:01:17.848328 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Dec 16 13:01:17.848388 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 16 13:01:17.848397 kernel: acpiphp: Slot [0-2] registered
Dec 16 13:01:17.848452 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 16 13:01:17.848461 kernel: acpiphp: Slot [0-3] registered
Dec 16 13:01:17.848516 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 16 13:01:17.848524 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 16 13:01:17.848532 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 16 13:01:17.848538 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 16 13:01:17.848544 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 16 13:01:17.848550 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 16 13:01:17.848556 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 16 13:01:17.848562 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 16 13:01:17.848568 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 16 13:01:17.848574 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 16 13:01:17.848579 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 16 13:01:17.848586 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 16 13:01:17.848592 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 16 13:01:17.848598 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 16 13:01:17.848604 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 16 13:01:17.848610 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 16 13:01:17.848616 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 16 13:01:17.848622 kernel: iommu: Default domain type: Translated
Dec 16 13:01:17.848627 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 16 13:01:17.848633 kernel: PCI: Using ACPI for IRQ routing
Dec 16 13:01:17.848641 kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 16 13:01:17.848646 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 16 13:01:17.848652 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Dec 16 13:01:17.848709 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 16 13:01:17.848766 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 16 13:01:17.848876 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 16 13:01:17.848886 kernel: vgaarb: loaded
Dec 16 13:01:17.848892 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Dec 16 13:01:17.848898 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Dec 16 13:01:17.848907 kernel: clocksource: Switched to clocksource kvm-clock
Dec 16 13:01:17.848913 kernel: VFS: Disk quotas dquot_6.6.0
Dec 16 13:01:17.848919 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 16 13:01:17.848925 kernel: pnp: PnP ACPI init
Dec 16 13:01:17.848989 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Dec 16 13:01:17.848999 kernel: pnp: PnP ACPI: found 5 devices
Dec 16 13:01:17.849005 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 16 13:01:17.849011 kernel: NET: Registered PF_INET protocol family
Dec 16 13:01:17.849019 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 13:01:17.849026 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Dec 16 13:01:17.849032 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 16 13:01:17.849038 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 16 13:01:17.849044 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Dec 16 13:01:17.849050 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Dec 16 13:01:17.849056 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 16 13:01:17.849062 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Dec 16 13:01:17.849068 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 16 13:01:17.849075 kernel: NET: Registered PF_XDP protocol family
Dec 16 13:01:17.849136 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 13:01:17.849233 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 13:01:17.849299 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 13:01:17.849359 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Dec 16 13:01:17.849417 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Dec 16 13:01:17.849486 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Dec 16 13:01:17.849548 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Dec 16 13:01:17.849609 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Dec 16 13:01:17.849667 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Dec 16 13:01:17.849724 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Dec 16 13:01:17.849782 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Dec 16 13:01:17.851888 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 13:01:17.851962 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Dec 16 13:01:17.852025 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Dec 16 13:01:17.852087 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 13:01:17.852147 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Dec 16 13:01:17.852227 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Dec 16 13:01:17.852294 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 13:01:17.852354 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Dec 16 13:01:17.852412 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Dec 16 13:01:17.852470 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 13:01:17.852528 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Dec 16 13:01:17.852586 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Dec 16 13:01:17.852647 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 13:01:17.852705 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Dec 16 13:01:17.852763 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Dec 16 13:01:17.852841 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Dec 16 13:01:17.852902 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 13:01:17.852965 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Dec 16 13:01:17.853023 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Dec 16 13:01:17.853080 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Dec 16 13:01:17.853139 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 13:01:17.853216 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Dec 16 13:01:17.853275 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Dec 16 13:01:17.853334 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Dec 16 13:01:17.853392 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 13:01:17.853448 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Dec 16 13:01:17.853505 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Dec 16 13:01:17.853557 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 16 13:01:17.853608 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Dec 16 13:01:17.853702 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Dec 16 13:01:17.853799 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Dec 16 13:01:17.853907 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Dec 16 13:01:17.853966 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Dec 16 13:01:17.854036 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Dec 16 13:01:17.854091 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Dec 16 13:01:17.854150 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Dec 16 13:01:17.854227 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Dec 16 13:01:17.854290 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Dec 16 13:01:17.854343 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Dec 16 13:01:17.854404 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Dec 16 13:01:17.854462 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Dec 16 13:01:17.854524 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Dec 16 13:01:17.854578 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Dec 16 13:01:17.854646 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Dec 16 13:01:17.854734 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Dec 16 13:01:17.854792 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Dec 16 13:01:17.856912 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Dec 16 13:01:17.856977 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Dec 16 13:01:17.857034 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Dec 16 13:01:17.857095 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Dec 16 13:01:17.857149 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Dec 16 13:01:17.857229 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 16 13:01:17.857242 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Dec 16 13:01:17.857259 kernel: PCI: CLS 0 bytes, default 64
Dec 16 13:01:17.857272 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Dec 16 13:01:17.857283 kernel: Initialise system trusted keyrings
Dec 16 13:01:17.857291 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Dec 16 13:01:17.857297 kernel: Key type asymmetric registered
Dec 16 13:01:17.857303 kernel: Asymmetric key parser 'x509' registered
Dec 16 13:01:17.857310 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Dec 16 13:01:17.857316 kernel: io scheduler mq-deadline registered
Dec 16 13:01:17.857322 kernel: io scheduler kyber registered
Dec 16 13:01:17.857330 kernel: io scheduler bfq registered
Dec 16 13:01:17.857411 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Dec 16 13:01:17.857486 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Dec 16 13:01:17.857560 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Dec 16 13:01:17.857639 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Dec 16 13:01:17.857731 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Dec 16 13:01:17.857798 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Dec 16 13:01:17.857895 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Dec 16 13:01:17.857962 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Dec 16 13:01:17.858024 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Dec 16 13:01:17.858085 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Dec 16 13:01:17.858145 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Dec 16 13:01:17.858241 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Dec 16 13:01:17.858304 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Dec 16 13:01:17.858366 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Dec 16 13:01:17.858438 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Dec 16 13:01:17.858504 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Dec 16 13:01:17.858515 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Dec 16 13:01:17.858575 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Dec 16 13:01:17.858646 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Dec 16 13:01:17.858663 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Dec 16 13:01:17.858675 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Dec 16 13:01:17.858690 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 16 13:01:17.858700 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 16 13:01:17.858706 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 16 13:01:17.858713 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 16 13:01:17.858719 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 16 13:01:17.858830 kernel: rtc_cmos 00:03: RTC can wake from S4
Dec 16 13:01:17.858845 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Dec 16 13:01:17.858907 kernel: rtc_cmos 00:03: registered as rtc0
Dec 16 13:01:17.858970 kernel: rtc_cmos 00:03: setting system clock to 2025-12-16T13:01:17 UTC (1765890077)
Dec 16 13:01:17.859029 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Dec 16 13:01:17.859038 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 16 13:01:17.859045 kernel: NET: Registered PF_INET6 protocol family
Dec 16 13:01:17.859052 kernel: Segment Routing with IPv6
Dec 16 13:01:17.859058 kernel: In-situ OAM (IOAM) with IPv6
Dec 16 13:01:17.859064 kernel: NET: Registered PF_PACKET protocol family
Dec 16 13:01:17.859070 kernel: Key type dns_resolver registered
Dec 16 13:01:17.859077 kernel: IPI shorthand broadcast: enabled
Dec 16 13:01:17.859085 kernel: sched_clock: Marking stable (3243007309, 248711957)->(3526875521, -35156255)
Dec 16 13:01:17.859092 kernel: registered taskstats version 1
Dec 16 13:01:17.859098 kernel: Loading compiled-in X.509 certificates
Dec 16 13:01:17.859104 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 0d0c78e6590cb40d27f1cef749ef9f2f3425f38d'
Dec 16 13:01:17.859110 kernel: Demotion targets for Node 0: null
Dec 16 13:01:17.859117 kernel: Key type .fscrypt registered
Dec 16 13:01:17.859123 kernel: Key type fscrypt-provisioning registered
Dec 16 13:01:17.859129 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 16 13:01:17.859135 kernel: ima: Allocated hash algorithm: sha1
Dec 16 13:01:17.859142 kernel: ima: No architecture policies found
Dec 16 13:01:17.859149 kernel: clk: Disabling unused clocks
Dec 16 13:01:17.859155 kernel: Warning: unable to open an initial console.
Dec 16 13:01:17.859161 kernel: Freeing unused kernel image (initmem) memory: 46188K
Dec 16 13:01:17.859182 kernel: Write protecting the kernel read-only data: 40960k
Dec 16 13:01:17.859189 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K
Dec 16 13:01:17.859195 kernel: Run /init as init process
Dec 16 13:01:17.859201 kernel: with arguments:
Dec 16 13:01:17.859207 kernel: /init
Dec 16 13:01:17.859215 kernel: with environment:
Dec 16 13:01:17.859221 kernel: HOME=/
Dec 16 13:01:17.859227 kernel: TERM=linux
Dec 16 13:01:17.859234 systemd[1]: Successfully made /usr/ read-only.
Dec 16 13:01:17.859243 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 13:01:17.859250 systemd[1]: Detected virtualization kvm.
Dec 16 13:01:17.859258 systemd[1]: Detected architecture x86-64.
Dec 16 13:01:17.859264 systemd[1]: Running in initrd.
Dec 16 13:01:17.859272 systemd[1]: No hostname configured, using default hostname.
Dec 16 13:01:17.859279 systemd[1]: Hostname set to .
Dec 16 13:01:17.859285 systemd[1]: Initializing machine ID from VM UUID.
Dec 16 13:01:17.859292 systemd[1]: Queued start job for default target initrd.target.
Dec 16 13:01:17.859298 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 13:01:17.859305 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 13:01:17.859312 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 16 13:01:17.859319 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 13:01:17.859327 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 16 13:01:17.859334 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 16 13:01:17.859341 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Dec 16 13:01:17.859348 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Dec 16 13:01:17.859355 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 13:01:17.859361 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 13:01:17.859369 systemd[1]: Reached target paths.target - Path Units.
Dec 16 13:01:17.859376 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 13:01:17.859383 systemd[1]: Reached target swap.target - Swaps.
Dec 16 13:01:17.859389 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 13:01:17.859396 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 13:01:17.859402 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 13:01:17.859410 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 16 13:01:17.859417 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 16 13:01:17.859423 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 13:01:17.859431 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 13:01:17.859439 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 13:01:17.859445 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 13:01:17.859452 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 16 13:01:17.859459 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 13:01:17.859465 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 16 13:01:17.859472 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 16 13:01:17.859478 systemd[1]: Starting systemd-fsck-usr.service...
Dec 16 13:01:17.859486 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 13:01:17.859500 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 13:01:17.859535 systemd-journald[200]: Collecting audit messages is disabled.
Dec 16 13:01:17.859555 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 13:01:17.859565 systemd-journald[200]: Journal started
Dec 16 13:01:17.859581 systemd-journald[200]: Runtime Journal (/run/log/journal/3d72b010523d4efe8ae1f46ea6400bfd) is 4.7M, max 38.3M, 33.5M free.
Dec 16 13:01:17.866480 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 13:01:17.866520 systemd-modules-load[202]: Inserted module 'overlay'
Dec 16 13:01:17.870086 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 16 13:01:17.871045 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 13:01:17.873456 systemd[1]: Finished systemd-fsck-usr.service.
Dec 16 13:01:17.882956 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 13:01:17.968352 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 16 13:01:17.968376 kernel: Bridge firewalling registered
Dec 16 13:01:17.891451 systemd-modules-load[202]: Inserted module 'br_netfilter'
Dec 16 13:01:17.971352 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 13:01:17.973710 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 13:01:17.975547 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 13:01:17.980154 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 13:01:17.984001 systemd-tmpfiles[213]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 16 13:01:17.984917 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 16 13:01:17.990153 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 13:01:18.001920 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 13:01:18.003157 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 13:01:18.009000 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 13:01:18.012621 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 13:01:18.014187 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 13:01:18.018013 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 13:01:18.020912 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 16 13:01:18.038800 dracut-cmdline[238]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a214a2d85e162c493e8b13db2df50a43e1005a0e4854a1ae089a14f442a30022
Dec 16 13:01:18.052002 systemd-resolved[232]: Positive Trust Anchors:
Dec 16 13:01:18.052015 systemd-resolved[232]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 13:01:18.052044 systemd-resolved[232]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 13:01:18.056445 systemd-resolved[232]: Defaulting to hostname 'linux'.
Dec 16 13:01:18.060308 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 13:01:18.061612 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 13:01:18.109875 kernel: SCSI subsystem initialized
Dec 16 13:01:18.117836 kernel: Loading iSCSI transport class v2.0-870.
Dec 16 13:01:18.127850 kernel: iscsi: registered transport (tcp)
Dec 16 13:01:18.145488 kernel: iscsi: registered transport (qla4xxx)
Dec 16 13:01:18.145550 kernel: QLogic iSCSI HBA Driver
Dec 16 13:01:18.160574 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 13:01:18.181237 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 13:01:18.184410 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 13:01:18.215322 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 16 13:01:18.217337 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 16 13:01:18.260834 kernel: raid6: avx2x4 gen() 30625 MB/s
Dec 16 13:01:18.278837 kernel: raid6: avx2x2 gen() 29582 MB/s
Dec 16 13:01:18.297960 kernel: raid6: avx2x1 gen() 20687 MB/s
Dec 16 13:01:18.297987 kernel: raid6: using algorithm avx2x4 gen() 30625 MB/s
Dec 16 13:01:18.317955 kernel: raid6: .... xor() 4574 MB/s, rmw enabled
Dec 16 13:01:18.317994 kernel: raid6: using avx2x2 recovery algorithm
Dec 16 13:01:18.335843 kernel: xor: automatically using best checksumming function avx
Dec 16 13:01:18.465852 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 16 13:01:18.470000 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 13:01:18.472221 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 13:01:18.495785 systemd-udevd[450]: Using default interface naming scheme 'v255'.
Dec 16 13:01:18.499573 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 13:01:18.503328 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 16 13:01:18.524686 dracut-pre-trigger[461]: rd.md=0: removing MD RAID activation
Dec 16 13:01:18.541445 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 13:01:18.543640 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 13:01:18.600784 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 13:01:18.607313 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 16 13:01:18.667834 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Dec 16 13:01:18.682948 kernel: cryptd: max_cpu_qlen set to 1000
Dec 16 13:01:18.693125 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 13:01:18.695399 kernel: scsi host0: Virtio SCSI HBA
Dec 16 13:01:18.693251 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 13:01:18.697629 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 13:01:18.699759 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 13:01:18.708987 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Dec 16 13:01:18.742454 kernel: ACPI: bus type USB registered
Dec 16 13:01:18.742528 kernel: usbcore: registered new interface driver usbfs
Dec 16 13:01:18.742539 kernel: usbcore: registered new interface driver hub
Dec 16 13:01:18.748849 kernel: usbcore: registered new device driver usb
Dec 16 13:01:18.751839 kernel: libata version 3.00 loaded.
Dec 16 13:01:18.766840 kernel: AES CTR mode by8 optimization enabled
Dec 16 13:01:18.793890 kernel: ahci 0000:00:1f.2: version 3.0
Dec 16 13:01:18.794097 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Dec 16 13:01:18.795836 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Dec 16 13:01:18.799831 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Dec 16 13:01:18.799948 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Dec 16 13:01:18.800088 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Dec 16 13:01:18.805848 kernel: scsi host1: ahci
Dec 16 13:01:18.810823 kernel: scsi host2: ahci
Dec 16 13:01:18.815413 kernel: scsi host3: ahci
Dec 16 13:01:18.815534 kernel: sd 0:0:0:0: Power-on or device reset occurred
Dec 16 13:01:18.815636 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Dec 16 13:01:18.815719 kernel: sd 0:0:0:0: [sda] Write Protect is off
Dec 16 13:01:18.815803 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Dec 16 13:01:18.817898 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Dec 16 13:01:18.818011 kernel: scsi host4: ahci
Dec 16 13:01:18.823026 kernel: scsi host5: ahci
Dec 16 13:01:18.823181 kernel: scsi host6: ahci
Dec 16 13:01:18.823277 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 48 lpm-pol 1
Dec 16 13:01:18.823287 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 48 lpm-pol 1
Dec 16 13:01:18.823299 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 48 lpm-pol 1
Dec 16 13:01:18.823306 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 48 lpm-pol 1
Dec 16 13:01:18.823314 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 48 lpm-pol 1
Dec 16 13:01:18.823321 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 48 lpm-pol 1
Dec 16 13:01:18.827398 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 16 13:01:18.827430 kernel: GPT:17805311 != 80003071
Dec 16 13:01:18.827447 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 16 13:01:18.827456 kernel: GPT:17805311 != 80003071
Dec 16 13:01:18.827463 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 16 13:01:18.827474 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 16 13:01:18.827482 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Dec 16 13:01:18.902891 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 13:01:19.132601 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Dec 16 13:01:19.132674 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Dec 16 13:01:19.133324 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Dec 16 13:01:19.141481 kernel: ata1.00: LPM support broken, forcing max_power
Dec 16 13:01:19.141517 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 16 13:01:19.141529 kernel: ata1.00: applying bridge limits
Dec 16 13:01:19.143819 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Dec 16 13:01:19.144840 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Dec 16 13:01:19.151067 kernel: ata1.00: LPM support broken, forcing max_power
Dec 16 13:01:19.151092 kernel: ata1.00: configured for UDMA/100
Dec 16 13:01:19.151822 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Dec 16 13:01:19.165839 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Dec 16 13:01:19.197484 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 13:01:19.197672 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 16 13:01:19.205960 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 16 13:01:19.214009 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 13:01:19.214410 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 16 13:01:19.214551 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 16 13:01:19.217244 kernel: hub 1-0:1.0: USB hub found
Dec 16 13:01:19.223660 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 16 13:01:19.223868 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 16 13:01:19.225832 kernel: hub 1-0:1.0: 4 ports detected
Dec 16 13:01:19.240661 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 16 13:01:19.240984 kernel: hub 2-0:1.0: USB hub found
Dec 16 13:01:19.243824 kernel: hub 2-0:1.0: 4 ports detected
Dec 16 13:01:19.245843 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Dec 16 13:01:19.273986 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Dec 16 13:01:19.283053 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Dec 16 13:01:19.290523 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Dec 16 13:01:19.296666 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Dec 16 13:01:19.297387 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Dec 16 13:01:19.300092 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 16 13:01:19.320951 disk-uuid[619]: Primary Header is updated.
Dec 16 13:01:19.320951 disk-uuid[619]: Secondary Entries is updated.
Dec 16 13:01:19.320951 disk-uuid[619]: Secondary Header is updated.
Dec 16 13:01:19.331833 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 16 13:01:19.343857 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 16 13:01:19.469823 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 16 13:01:19.470789 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 16 13:01:19.499173 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 13:01:19.499893 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 13:01:19.501587 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 13:01:19.503758 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 16 13:01:19.521688 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 13:01:19.612849 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 16 13:01:19.622603 kernel: usbcore: registered new interface driver usbhid
Dec 16 13:01:19.622648 kernel: usbhid: USB HID core driver
Dec 16 13:01:19.638274 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4
Dec 16 13:01:19.638305 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 16 13:01:20.347645 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Dec 16 13:01:20.347701 disk-uuid[621]: The operation has completed successfully.
Dec 16 13:01:20.390989 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 16 13:01:20.391088 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 16 13:01:20.424554 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Dec 16 13:01:20.439579 sh[651]: Success
Dec 16 13:01:20.459098 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 16 13:01:20.459200 kernel: device-mapper: uevent: version 1.0.3
Dec 16 13:01:20.460226 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 16 13:01:20.472844 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Dec 16 13:01:20.507782 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Dec 16 13:01:20.510890 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Dec 16 13:01:20.536872 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Dec 16 13:01:20.548832 kernel: BTRFS: device fsid a6ae7f96-a076-4d3c-81ed-46dd341492f8 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (663)
Dec 16 13:01:20.548875 kernel: BTRFS info (device dm-0): first mount of filesystem a6ae7f96-a076-4d3c-81ed-46dd341492f8
Dec 16 13:01:20.555838 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Dec 16 13:01:20.566511 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Dec 16 13:01:20.566556 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 16 13:01:20.566568 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 16 13:01:20.570618 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Dec 16 13:01:20.572597 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 13:01:20.574468 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 16 13:01:20.575900 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 16 13:01:20.578901 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 16 13:01:20.616163 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (698)
Dec 16 13:01:20.616218 kernel: BTRFS info (device sda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 13:01:20.622003 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 13:01:20.627312 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 16 13:01:20.627343 kernel: BTRFS info (device sda6): turning on async discard
Dec 16 13:01:20.631361 kernel: BTRFS info (device sda6): enabling free space tree
Dec 16 13:01:20.638880 kernel: BTRFS info (device sda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 13:01:20.640023 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 16 13:01:20.643929 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 16 13:01:20.697455 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 13:01:20.702949 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 13:01:20.742477 ignition[765]: Ignition 2.22.0
Dec 16 13:01:20.743255 ignition[765]: Stage: fetch-offline
Dec 16 13:01:20.743283 ignition[765]: no configs at "/usr/lib/ignition/base.d"
Dec 16 13:01:20.743924 systemd-networkd[832]: lo: Link UP
Dec 16 13:01:20.743290 ignition[765]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 13:01:20.743927 systemd-networkd[832]: lo: Gained carrier
Dec 16 13:01:20.743352 ignition[765]: parsed url from cmdline: ""
Dec 16 13:01:20.745392 systemd-networkd[832]: Enumeration completed
Dec 16 13:01:20.743354 ignition[765]: no config URL provided
Dec 16 13:01:20.745460 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 13:01:20.743358 ignition[765]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 13:01:20.745878 systemd-networkd[832]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:01:20.743363 ignition[765]: no config at "/usr/lib/ignition/user.ign" Dec 16 13:01:20.745882 systemd-networkd[832]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:01:20.743366 ignition[765]: failed to fetch config: resource requires networking Dec 16 13:01:20.746675 systemd-networkd[832]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:01:20.743475 ignition[765]: Ignition finished successfully Dec 16 13:01:20.746678 systemd-networkd[832]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 13:01:20.747158 systemd-networkd[832]: eth0: Link UP Dec 16 13:01:20.747255 systemd-networkd[832]: eth0: Gained carrier Dec 16 13:01:20.747262 systemd-networkd[832]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:01:20.749977 systemd-networkd[832]: eth1: Link UP Dec 16 13:01:20.750243 systemd-networkd[832]: eth1: Gained carrier Dec 16 13:01:20.750251 systemd-networkd[832]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Dec 16 13:01:20.750486 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 13:01:20.752340 systemd[1]: Reached target network.target - Network. Dec 16 13:01:20.755127 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Dec 16 13:01:20.776856 systemd-networkd[832]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Dec 16 13:01:20.778612 ignition[841]: Ignition 2.22.0
Dec 16 13:01:20.778620 ignition[841]: Stage: fetch
Dec 16 13:01:20.778733 ignition[841]: no configs at "/usr/lib/ignition/base.d"
Dec 16 13:01:20.778741 ignition[841]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 13:01:20.778801 ignition[841]: parsed url from cmdline: ""
Dec 16 13:01:20.778821 ignition[841]: no config URL provided
Dec 16 13:01:20.778826 ignition[841]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 13:01:20.778833 ignition[841]: no config at "/usr/lib/ignition/user.ign"
Dec 16 13:01:20.778865 ignition[841]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Dec 16 13:01:20.779032 ignition[841]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Dec 16 13:01:20.812879 systemd-networkd[832]: eth0: DHCPv4 address 77.42.23.34/32, gateway 172.31.1.1 acquired from 172.31.1.1
Dec 16 13:01:20.979309 ignition[841]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Dec 16 13:01:20.985117 ignition[841]: GET result: OK
Dec 16 13:01:20.985472 ignition[841]: parsing config with SHA512: c9c7dcdf5e324cc9bbe2559fb2b4430aea89fe08c71d808a43ae454fbbf3ec864eaf4ce637461396056c94786a4a8cb1f28b15eeef122f31af84c72d69847adc
Dec 16 13:01:20.988779 unknown[841]: fetched base config from "system"
Dec 16 13:01:20.988793 unknown[841]: fetched base config from "system"
Dec 16 13:01:20.988798 unknown[841]: fetched user config from "hetzner"
Dec 16 13:01:20.989246 ignition[841]: fetch: fetch complete
Dec 16 13:01:20.989251 ignition[841]: fetch: fetch passed
Dec 16 13:01:20.989291 ignition[841]: Ignition finished successfully
Dec 16 13:01:20.992978 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 16 13:01:20.995932 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 16 13:01:21.021364 ignition[848]: Ignition 2.22.0
Dec 16 13:01:21.021380 ignition[848]: Stage: kargs
Dec 16 13:01:21.021503 ignition[848]: no configs at "/usr/lib/ignition/base.d"
Dec 16 13:01:21.021512 ignition[848]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 13:01:21.022316 ignition[848]: kargs: kargs passed
Dec 16 13:01:21.022352 ignition[848]: Ignition finished successfully
Dec 16 13:01:21.026066 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 16 13:01:21.029918 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 16 13:01:21.053119 ignition[855]: Ignition 2.22.0
Dec 16 13:01:21.053147 ignition[855]: Stage: disks
Dec 16 13:01:21.053274 ignition[855]: no configs at "/usr/lib/ignition/base.d"
Dec 16 13:01:21.053282 ignition[855]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 13:01:21.054059 ignition[855]: disks: disks passed
Dec 16 13:01:21.054098 ignition[855]: Ignition finished successfully
Dec 16 13:01:21.055926 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 16 13:01:21.058089 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 16 13:01:21.059031 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 16 13:01:21.060664 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 13:01:21.062164 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 13:01:21.063915 systemd[1]: Reached target basic.target - Basic System.
Dec 16 13:01:21.066229 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 16 13:01:21.098893 systemd-fsck[863]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Dec 16 13:01:21.102715 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 16 13:01:21.109925 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 16 13:01:21.225836 kernel: EXT4-fs (sda9): mounted filesystem e48ca59c-1206-4abd-b121-5e9b35e49852 r/w with ordered data mode. Quota mode: none.
Dec 16 13:01:21.226798 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 16 13:01:21.228004 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 16 13:01:21.230444 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 13:01:21.233876 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 16 13:01:21.239939 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Dec 16 13:01:21.241958 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 16 13:01:21.242003 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 13:01:21.247978 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 16 13:01:21.255229 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 16 13:01:21.256540 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (871)
Dec 16 13:01:21.257069 kernel: BTRFS info (device sda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 13:01:21.269236 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 13:01:21.271839 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 16 13:01:21.271865 kernel: BTRFS info (device sda6): turning on async discard
Dec 16 13:01:21.271876 kernel: BTRFS info (device sda6): enabling free space tree
Dec 16 13:01:21.279909 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 13:01:21.299489 coreos-metadata[873]: Dec 16 13:01:21.299 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Dec 16 13:01:21.301714 coreos-metadata[873]: Dec 16 13:01:21.300 INFO Fetch successful Dec 16 13:01:21.301714 coreos-metadata[873]: Dec 16 13:01:21.301 INFO wrote hostname ci-4459-2-2-4-07f930e259 to /sysroot/etc/hostname Dec 16 13:01:21.304486 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 13:01:21.314671 initrd-setup-root[901]: cut: /sysroot/etc/passwd: No such file or directory Dec 16 13:01:21.318479 initrd-setup-root[908]: cut: /sysroot/etc/group: No such file or directory Dec 16 13:01:21.322092 initrd-setup-root[915]: cut: /sysroot/etc/shadow: No such file or directory Dec 16 13:01:21.325582 initrd-setup-root[922]: cut: /sysroot/etc/gshadow: No such file or directory Dec 16 13:01:21.390697 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 13:01:21.392449 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 13:01:21.394334 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 13:01:21.406845 kernel: BTRFS info (device sda6): last unmount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a Dec 16 13:01:21.419986 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 13:01:21.428420 ignition[990]: INFO : Ignition 2.22.0 Dec 16 13:01:21.428420 ignition[990]: INFO : Stage: mount Dec 16 13:01:21.429963 ignition[990]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 13:01:21.429963 ignition[990]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 13:01:21.429963 ignition[990]: INFO : mount: mount passed Dec 16 13:01:21.429963 ignition[990]: INFO : Ignition finished successfully Dec 16 13:01:21.430797 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 13:01:21.435863 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Dec 16 13:01:21.545660 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 16 13:01:21.547102 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 13:01:21.569850 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1002)
Dec 16 13:01:21.574555 kernel: BTRFS info (device sda6): first mount of filesystem 7e9ead35-f0ec-40e8-bc31-5061934f865a
Dec 16 13:01:21.574596 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Dec 16 13:01:21.584387 kernel: BTRFS info (device sda6): enabling ssd optimizations
Dec 16 13:01:21.584423 kernel: BTRFS info (device sda6): turning on async discard
Dec 16 13:01:21.584435 kernel: BTRFS info (device sda6): enabling free space tree
Dec 16 13:01:21.588608 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 13:01:21.616662 ignition[1018]: INFO : Ignition 2.22.0
Dec 16 13:01:21.616662 ignition[1018]: INFO : Stage: files
Dec 16 13:01:21.618753 ignition[1018]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 13:01:21.618753 ignition[1018]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 13:01:21.618753 ignition[1018]: DEBUG : files: compiled without relabeling support, skipping
Dec 16 13:01:21.618753 ignition[1018]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 16 13:01:21.618753 ignition[1018]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 16 13:01:21.624430 ignition[1018]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 16 13:01:21.624430 ignition[1018]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 16 13:01:21.625339 unknown[1018]: wrote ssh authorized keys file for user: core
Dec 16 13:01:21.626941 ignition[1018]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 16 13:01:21.628989 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Dec 16 13:01:21.628989 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Dec 16 13:01:21.765069 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 16 13:01:22.092888 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 13:01:22.094587 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 13:01:22.104046 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 13:01:22.104046 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 13:01:22.104046 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw"
Dec 16 13:01:22.104046 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw"
Dec 16 13:01:22.104046 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw"
Dec 16 13:01:22.104046 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-x86-64.raw: attempt #1
Dec 16 13:01:22.401032 systemd-networkd[832]: eth0: Gained IPv6LL
Dec 16 13:01:22.528956 systemd-networkd[832]: eth1: Gained IPv6LL
Dec 16 13:01:22.540894 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 16 13:01:22.808876 ignition[1018]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-x86-64.raw"
Dec 16 13:01:22.808876 ignition[1018]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 16 13:01:22.811869 ignition[1018]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 13:01:22.813788 ignition[1018]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 13:01:22.813788 ignition[1018]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 16 13:01:22.813788 ignition[1018]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Dec 16 13:01:22.816248 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 13:01:22.818361 ignition[1018]: INFO : files: files passed
Dec 16 13:01:22.818361 ignition[1018]: INFO : Ignition finished successfully
Dec 16 13:01:22.818954 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 16 13:01:22.829947 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 16 13:01:22.831464 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 16 13:01:22.831546 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 16 13:01:22.846205 initrd-setup-root-after-ignition[1053]: grep:
Dec 16 13:01:22.846205 initrd-setup-root-after-ignition[1049]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 13:01:22.846205 initrd-setup-root-after-ignition[1049]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 13:01:22.847673 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 13:01:22.849997 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 16 13:01:22.850753 initrd-setup-root-after-ignition[1053]: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 13:01:22.852243 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 16 13:01:22.886160 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 16 13:01:22.886253 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 16 13:01:22.888043 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 16 13:01:22.889400 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 16 13:01:22.891030 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 16 13:01:22.891694 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 16 13:01:22.917259 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 13:01:22.919399 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 16 13:01:22.945204 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 16 13:01:22.946228 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 13:01:22.948156 systemd[1]: Stopped target timers.target - Timer Units.
Dec 16 13:01:22.949899 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 13:01:22.950045 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 13:01:22.952005 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 13:01:22.953151 systemd[1]: Stopped target basic.target - Basic System. Dec 16 13:01:22.954941 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 13:01:22.956553 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 13:01:22.958221 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 13:01:22.960041 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 13:01:22.961848 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 13:01:22.963697 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 13:01:22.965588 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 13:01:22.967389 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 13:01:22.969213 systemd[1]: Stopped target swap.target - Swaps. Dec 16 13:01:22.970940 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 13:01:22.971099 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 13:01:22.973053 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 13:01:22.974302 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 13:01:22.975914 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 13:01:22.976867 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 13:01:22.978885 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 13:01:22.979033 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Dec 16 13:01:22.981146 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 13:01:22.981254 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 13:01:22.989619 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 13:01:22.989757 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 13:01:22.991434 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 16 13:01:22.991570 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 13:01:22.994897 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 13:01:22.997976 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 13:01:23.000299 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 16 13:01:23.001910 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 13:01:23.004392 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 13:01:23.004560 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 13:01:23.011278 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 13:01:23.011359 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 13:01:23.027516 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 13:01:23.030254 systemd[1]: ignition-mount.service: Deactivated successfully. 
Dec 16 13:01:23.031080 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 16 13:01:23.031716 ignition[1073]: INFO : Ignition 2.22.0
Dec 16 13:01:23.031716 ignition[1073]: INFO : Stage: umount
Dec 16 13:01:23.031716 ignition[1073]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 13:01:23.031716 ignition[1073]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Dec 16 13:01:23.031716 ignition[1073]: INFO : umount: umount passed
Dec 16 13:01:23.031716 ignition[1073]: INFO : Ignition finished successfully
Dec 16 13:01:23.032616 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 16 13:01:23.032687 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 16 13:01:23.034423 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 16 13:01:23.034476 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 16 13:01:23.035348 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 16 13:01:23.035384 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 16 13:01:23.036779 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 16 13:01:23.036830 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 16 13:01:23.038019 systemd[1]: Stopped target network.target - Network.
Dec 16 13:01:23.039215 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 16 13:01:23.039252 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 13:01:23.040661 systemd[1]: Stopped target paths.target - Path Units.
Dec 16 13:01:23.042007 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 16 13:01:23.047926 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 13:01:23.048892 systemd[1]: Stopped target slices.target - Slice Units.
Dec 16 13:01:23.050426 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 16 13:01:23.051887 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 13:01:23.051933 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 13:01:23.053426 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 13:01:23.053455 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 13:01:23.055158 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 13:01:23.055212 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 13:01:23.056641 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 13:01:23.056678 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 13:01:23.058038 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 13:01:23.058078 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 13:01:23.059680 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 13:01:23.060927 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 13:01:23.067220 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 13:01:23.067312 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 13:01:23.070325 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Dec 16 13:01:23.070560 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 13:01:23.070604 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 13:01:23.072792 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:01:23.077930 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 13:01:23.078017 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 13:01:23.080518 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
Dec 16 13:01:23.081056 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 13:01:23.082414 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 13:01:23.082441 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 13:01:23.084602 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 13:01:23.086205 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 13:01:23.086245 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 13:01:23.088248 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 13:01:23.088282 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 13:01:23.090559 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 13:01:23.090592 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 13:01:23.091638 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 13:01:23.097287 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 16 13:01:23.100230 systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 16 13:01:23.100344 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 13:01:23.102676 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 13:01:23.102720 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 13:01:23.104078 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 13:01:23.104102 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 13:01:23.106725 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 13:01:23.106780 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 13:01:23.108976 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Dec 16 13:01:23.109010 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 13:01:23.110592 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 13:01:23.110629 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 13:01:23.112872 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 16 13:01:23.115302 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 13:01:23.115350 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 13:01:23.117020 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 13:01:23.117056 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 13:01:23.118244 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Dec 16 13:01:23.118288 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 13:01:23.119800 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 13:01:23.119855 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 13:01:23.120918 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:01:23.120952 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:01:23.125876 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 13:01:23.125940 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Dec 16 13:01:23.134935 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 13:01:23.134993 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 13:01:23.136834 systemd[1]: Reached target initrd-switch-root.target - Switch Root. 
Dec 16 13:01:23.138910 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 13:01:23.152449 systemd[1]: Switching root. Dec 16 13:01:23.195981 systemd-journald[200]: Journal stopped Dec 16 13:01:24.111838 systemd-journald[200]: Received SIGTERM from PID 1 (systemd). Dec 16 13:01:24.111880 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 13:01:24.111894 kernel: SELinux: policy capability open_perms=1 Dec 16 13:01:24.111903 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 13:01:24.111911 kernel: SELinux: policy capability always_check_network=0 Dec 16 13:01:24.111919 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 13:01:24.111934 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 13:01:24.111944 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 13:01:24.111952 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 13:01:24.111959 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 13:01:24.111967 kernel: audit: type=1403 audit(1765890083.339:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Dec 16 13:01:24.111976 systemd[1]: Successfully loaded SELinux policy in 64.217ms. Dec 16 13:01:24.111991 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.498ms. Dec 16 13:01:24.112000 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 13:01:24.112011 systemd[1]: Detected virtualization kvm. Dec 16 13:01:24.112019 systemd[1]: Detected architecture x86-64. Dec 16 13:01:24.112027 systemd[1]: Detected first boot. Dec 16 13:01:24.112035 systemd[1]: Hostname set to . Dec 16 13:01:24.112043 systemd[1]: Initializing machine ID from VM UUID. 
Dec 16 13:01:24.112051 zram_generator::config[1116]: No configuration found.
Dec 16 13:01:24.112059 kernel: Guest personality initialized and is inactive
Dec 16 13:01:24.112067 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Dec 16 13:01:24.112074 kernel: Initialized host personality
Dec 16 13:01:24.112083 kernel: NET: Registered PF_VSOCK protocol family
Dec 16 13:01:24.112091 systemd[1]: Populated /etc with preset unit settings.
Dec 16 13:01:24.112099 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Dec 16 13:01:24.112118 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 16 13:01:24.112127 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 16 13:01:24.112135 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 16 13:01:24.112143 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 16 13:01:24.112156 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 16 13:01:24.112165 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 16 13:01:24.112174 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 16 13:01:24.112183 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 16 13:01:24.112191 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 16 13:01:24.112200 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 16 13:01:24.112207 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 16 13:01:24.112217 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 13:01:24.112225 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 13:01:24.112233 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 16 13:01:24.112241 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 16 13:01:24.112250 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 16 13:01:24.112258 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 13:01:24.112268 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Dec 16 13:01:24.112277 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 13:01:24.112285 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 13:01:24.112293 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 16 13:01:24.112301 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 16 13:01:24.112310 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 16 13:01:24.112324 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 13:01:24.112344 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 13:01:24.112361 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 13:01:24.112380 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 13:01:24.112392 systemd[1]: Reached target swap.target - Swaps.
Dec 16 13:01:24.112407 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 16 13:01:24.112421 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 16 13:01:24.112436 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 16 13:01:24.112444 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 13:01:24.112452 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 13:01:24.112461 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 13:01:24.112469 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 16 13:01:24.112479 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 16 13:01:24.112488 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 16 13:01:24.112496 systemd[1]: Mounting media.mount - External Media Directory...
Dec 16 13:01:24.112504 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:24.112512 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 16 13:01:24.112520 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 16 13:01:24.112528 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 16 13:01:24.112537 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 16 13:01:24.112545 systemd[1]: Reached target machines.target - Containers.
Dec 16 13:01:24.112554 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 16 13:01:24.112562 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 13:01:24.112570 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 13:01:24.112578 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 16 13:01:24.112587 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 13:01:24.112597 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 13:01:24.112605 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 13:01:24.112613 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 16 13:01:24.112622 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 13:01:24.112630 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 16 13:01:24.112639 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 16 13:01:24.112647 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 16 13:01:24.112655 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 16 13:01:24.112663 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 16 13:01:24.112672 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 13:01:24.112679 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 13:01:24.112688 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 13:01:24.112697 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 13:01:24.112705 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 16 13:01:24.112713 kernel: loop: module loaded
Dec 16 13:01:24.112721 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 16 13:01:24.112730 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 13:01:24.112740 systemd[1]: verity-setup.service: Deactivated successfully.
Dec 16 13:01:24.112748 kernel: fuse: init (API version 7.41)
Dec 16 13:01:24.112756 systemd[1]: Stopped verity-setup.service.
Dec 16 13:01:24.112764 kernel: ACPI: bus type drm_connector registered
Dec 16 13:01:24.112772 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:24.112781 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 16 13:01:24.112790 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 16 13:01:24.112798 systemd[1]: Mounted media.mount - External Media Directory.
Dec 16 13:01:24.112930 systemd-journald[1207]: Collecting audit messages is disabled.
Dec 16 13:01:24.112954 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 16 13:01:24.112963 systemd-journald[1207]: Journal started
Dec 16 13:01:24.112983 systemd-journald[1207]: Runtime Journal (/run/log/journal/3d72b010523d4efe8ae1f46ea6400bfd) is 4.7M, max 38.3M, 33.5M free.
Dec 16 13:01:23.800985 systemd[1]: Queued start job for default target multi-user.target.
Dec 16 13:01:23.813856 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Dec 16 13:01:23.814374 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 16 13:01:24.117837 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 13:01:24.118915 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 16 13:01:24.119679 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 16 13:01:24.120619 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 16 13:01:24.121572 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 13:01:24.122505 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 16 13:01:24.122711 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 16 13:01:24.123620 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 13:01:24.123854 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 13:01:24.124733 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 13:01:24.125052 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 13:01:24.125924 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 13:01:24.126099 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 13:01:24.127072 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 16 13:01:24.127252 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 16 13:01:24.128094 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 13:01:24.128309 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 13:01:24.129295 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 13:01:24.130173 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 13:01:24.131173 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 16 13:01:24.132059 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 16 13:01:24.139195 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 13:01:24.140945 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 16 13:01:24.144461 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 16 13:01:24.145226 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 16 13:01:24.145296 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 13:01:24.147459 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 16 13:01:24.153075 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 16 13:01:24.154933 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 13:01:24.156250 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 16 13:01:24.158639 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 16 13:01:24.159400 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 13:01:24.160755 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 16 13:01:24.162883 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 13:01:24.164548 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 13:01:24.171566 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 16 13:01:24.175465 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 13:01:24.180930 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 13:01:24.182001 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 16 13:01:24.182918 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 16 13:01:24.189874 kernel: loop0: detected capacity change from 0 to 8
Dec 16 13:01:24.191863 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 16 13:01:24.192729 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 16 13:01:24.196173 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 16 13:01:24.200357 systemd-journald[1207]: Time spent on flushing to /var/log/journal/3d72b010523d4efe8ae1f46ea6400bfd is 18.467ms for 1175 entries.
Dec 16 13:01:24.200357 systemd-journald[1207]: System Journal (/var/log/journal/3d72b010523d4efe8ae1f46ea6400bfd) is 8M, max 584.8M, 576.8M free.
Dec 16 13:01:24.247399 systemd-journald[1207]: Received client request to flush runtime journal.
Dec 16 13:01:24.247506 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 16 13:01:24.248845 kernel: loop1: detected capacity change from 0 to 219144
Dec 16 13:01:24.215259 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 13:01:24.250744 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 16 13:01:24.254964 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 16 13:01:24.256727 systemd-tmpfiles[1242]: ACLs are not supported, ignoring.
Dec 16 13:01:24.256741 systemd-tmpfiles[1242]: ACLs are not supported, ignoring.
Dec 16 13:01:24.261480 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 13:01:24.264397 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 16 13:01:24.280080 kernel: loop2: detected capacity change from 0 to 128560
Dec 16 13:01:24.304555 kernel: loop3: detected capacity change from 0 to 110984
Dec 16 13:01:24.305016 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 16 13:01:24.307320 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 13:01:24.332837 kernel: loop4: detected capacity change from 0 to 8
Dec 16 13:01:24.334439 systemd-tmpfiles[1263]: ACLs are not supported, ignoring.
Dec 16 13:01:24.334652 systemd-tmpfiles[1263]: ACLs are not supported, ignoring.
Dec 16 13:01:24.336861 kernel: loop5: detected capacity change from 0 to 219144
Dec 16 13:01:24.340097 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 13:01:24.371873 kernel: loop6: detected capacity change from 0 to 128560
Dec 16 13:01:24.387845 kernel: loop7: detected capacity change from 0 to 110984
Dec 16 13:01:24.407160 (sd-merge)[1266]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Dec 16 13:01:24.407857 (sd-merge)[1266]: Merged extensions into '/usr'.
Dec 16 13:01:24.412134 systemd[1]: Reload requested from client PID 1241 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 16 13:01:24.412220 systemd[1]: Reloading...
Dec 16 13:01:24.487837 zram_generator::config[1292]: No configuration found.
Dec 16 13:01:24.640742 ldconfig[1236]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 16 13:01:24.653880 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 16 13:01:24.654020 systemd[1]: Reloading finished in 240 ms.
Dec 16 13:01:24.665632 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 16 13:01:24.666802 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 16 13:01:24.674703 systemd[1]: Starting ensure-sysext.service...
Dec 16 13:01:24.677897 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 13:01:24.710655 systemd[1]: Reload requested from client PID 1336 ('systemctl') (unit ensure-sysext.service)...
Dec 16 13:01:24.710668 systemd[1]: Reloading...
Dec 16 13:01:24.715054 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 16 13:01:24.715336 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 16 13:01:24.715615 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 16 13:01:24.715901 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Dec 16 13:01:24.716696 systemd-tmpfiles[1337]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Dec 16 13:01:24.720070 systemd-tmpfiles[1337]: ACLs are not supported, ignoring.
Dec 16 13:01:24.720154 systemd-tmpfiles[1337]: ACLs are not supported, ignoring.
Dec 16 13:01:24.725259 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 13:01:24.725272 systemd-tmpfiles[1337]: Skipping /boot
Dec 16 13:01:24.731331 systemd-tmpfiles[1337]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 13:01:24.731482 systemd-tmpfiles[1337]: Skipping /boot
Dec 16 13:01:24.767833 zram_generator::config[1364]: No configuration found.
Dec 16 13:01:24.915690 systemd[1]: Reloading finished in 204 ms.
Dec 16 13:01:24.938410 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 16 13:01:24.955480 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 13:01:24.962727 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 16 13:01:24.964988 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 16 13:01:24.973044 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 16 13:01:24.976044 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 13:01:24.980070 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 13:01:24.985006 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 16 13:01:24.989316 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:24.989506 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 13:01:24.998957 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 13:01:25.006477 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 13:01:25.013038 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 13:01:25.014085 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 13:01:25.014198 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 13:01:25.014276 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:25.021951 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 16 13:01:25.023622 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 16 13:01:25.027760 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 13:01:25.027925 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 13:01:25.030664 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:25.031180 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 13:01:25.031358 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 13:01:25.031470 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 13:01:25.032662 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 16 13:01:25.037043 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 16 13:01:25.038144 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:25.038434 systemd-udevd[1415]: Using default interface naming scheme 'v255'.
Dec 16 13:01:25.038733 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 13:01:25.039927 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 13:01:25.042164 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 13:01:25.042278 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 13:01:25.047417 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:25.047591 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 13:01:25.049317 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 13:01:25.052231 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 13:01:25.054184 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 13:01:25.056998 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 13:01:25.057713 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 13:01:25.057820 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 13:01:25.057992 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Dec 16 13:01:25.061010 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 16 13:01:25.063175 systemd[1]: Finished ensure-sysext.service.
Dec 16 13:01:25.067657 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Dec 16 13:01:25.072215 augenrules[1450]: No rules
Dec 16 13:01:25.072794 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 16 13:01:25.073474 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 16 13:01:25.084802 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 13:01:25.085930 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 13:01:25.087904 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 16 13:01:25.088772 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 13:01:25.088920 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 13:01:25.089780 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 13:01:25.090111 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 13:01:25.090955 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 13:01:25.091064 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 13:01:25.092666 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 13:01:25.092751 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 13:01:25.092778 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 16 13:01:25.097294 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 13:01:25.101583 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 13:01:25.103920 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 16 13:01:25.214154 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Dec 16 13:01:25.253330 systemd-networkd[1467]: lo: Link UP
Dec 16 13:01:25.253831 systemd-networkd[1467]: lo: Gained carrier
Dec 16 13:01:25.254563 systemd-networkd[1467]: Enumeration completed
Dec 16 13:01:25.254642 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 13:01:25.271896 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 16 13:01:25.272437 systemd-networkd[1467]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 16 13:01:25.272443 systemd-networkd[1467]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 13:01:25.273957 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 16 13:01:25.274120 systemd-networkd[1467]: eth1: Link UP
Dec 16 13:01:25.274239 systemd-networkd[1467]: eth1: Gained carrier
Dec 16 13:01:25.274253 systemd-networkd[1467]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 16 13:01:25.300921 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Dec 16 13:01:25.302306 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 16 13:01:25.304586 systemd[1]: Reached target time-set.target - System Time Set.
Dec 16 13:01:25.308115 systemd-resolved[1413]: Positive Trust Anchors:
Dec 16 13:01:25.308386 systemd-resolved[1413]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 13:01:25.308481 systemd-resolved[1413]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 13:01:25.314922 systemd-networkd[1467]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Dec 16 13:01:25.315460 systemd-resolved[1413]: Using system hostname 'ci-4459-2-2-4-07f930e259'.
Dec 16 13:01:25.315756 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection.
Dec 16 13:01:25.317677 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 13:01:25.319069 systemd[1]: Reached target network.target - Network.
Dec 16 13:01:25.320914 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 13:01:25.321627 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 13:01:25.322568 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 16 13:01:25.324225 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 16 13:01:25.325770 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Dec 16 13:01:25.326965 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 16 13:01:25.328478 systemd-networkd[1467]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 16 13:01:25.328489 systemd-networkd[1467]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 13:01:25.328739 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 16 13:01:25.329449 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 16 13:01:25.330924 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 16 13:01:25.330956 systemd[1]: Reached target paths.target - Path Units.
Dec 16 13:01:25.332207 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 13:01:25.333188 systemd-networkd[1467]: eth0: Link UP
Dec 16 13:01:25.333338 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection.
Dec 16 13:01:25.336047 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 16 13:01:25.337658 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 16 13:01:25.338132 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection.
Dec 16 13:01:25.339901 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 16 13:01:25.340661 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 16 13:01:25.341431 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 16 13:01:25.342131 systemd-networkd[1467]: eth0: Gained carrier
Dec 16 13:01:25.342162 systemd-networkd[1467]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Dec 16 13:01:25.343318 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 16 13:01:25.344488 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 16 13:01:25.345836 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 16 13:01:25.347064 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection.
Dec 16 13:01:25.347496 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 13:01:25.348103 systemd[1]: Reached target basic.target - Basic System.
Dec 16 13:01:25.348983 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 16 13:01:25.349010 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 16 13:01:25.349641 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 16 13:01:25.354903 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 16 13:01:25.357913 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 16 13:01:25.359852 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 16 13:01:25.364910 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 16 13:01:25.370278 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 16 13:01:25.372039 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 13:01:25.375034 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Dec 16 13:01:25.376375 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 13:01:25.379109 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 13:01:25.384557 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 13:01:25.387704 coreos-metadata[1515]: Dec 16 13:01:25.386 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Dec 16 13:01:25.387704 coreos-metadata[1515]: Dec 16 13:01:25.386 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata) Dec 16 13:01:25.395740 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 13:01:25.407616 oslogin_cache_refresh[1522]: Refreshing passwd entry cache Dec 16 13:01:25.408401 google_oslogin_nss_cache[1522]: oslogin_cache_refresh[1522]: Refreshing passwd entry cache Dec 16 13:01:25.401043 systemd-networkd[1467]: eth0: DHCPv4 address 77.42.23.34/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 13:01:25.402082 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 13:01:25.408937 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Dec 16 13:01:25.414892 extend-filesystems[1521]: Found /dev/sda6 Dec 16 13:01:25.409734 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 13:01:25.411947 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 13:01:25.412374 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. 
See cgroup-compat debug messages for details. Dec 16 13:01:25.417210 jq[1518]: false Dec 16 13:01:25.417346 extend-filesystems[1521]: Found /dev/sda9 Dec 16 13:01:25.418080 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 13:01:25.422060 extend-filesystems[1521]: Checking size of /dev/sda9 Dec 16 13:01:25.429511 extend-filesystems[1521]: Resized partition /dev/sda9 Dec 16 13:01:25.425921 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 13:01:25.433137 extend-filesystems[1548]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 13:01:25.428855 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 13:01:25.439066 oslogin_cache_refresh[1522]: Failure getting users, quitting Dec 16 13:01:25.440146 google_oslogin_nss_cache[1522]: oslogin_cache_refresh[1522]: Failure getting users, quitting Dec 16 13:01:25.440146 google_oslogin_nss_cache[1522]: oslogin_cache_refresh[1522]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:01:25.440146 google_oslogin_nss_cache[1522]: oslogin_cache_refresh[1522]: Refreshing group entry cache Dec 16 13:01:25.430073 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 13:01:25.439084 oslogin_cache_refresh[1522]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Dec 16 13:01:25.430223 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 13:01:25.439140 oslogin_cache_refresh[1522]: Refreshing group entry cache Dec 16 13:01:25.430412 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 13:01:25.430536 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Dec 16 13:01:25.445832 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Dec 16 13:01:25.440434 oslogin_cache_refresh[1522]: Failure getting groups, quitting Dec 16 13:01:25.445927 google_oslogin_nss_cache[1522]: oslogin_cache_refresh[1522]: Failure getting groups, quitting Dec 16 13:01:25.445927 google_oslogin_nss_cache[1522]: oslogin_cache_refresh[1522]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:01:25.440441 oslogin_cache_refresh[1522]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Dec 16 13:01:25.449882 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Dec 16 13:01:25.454403 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Dec 16 13:01:25.455698 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 13:01:25.455867 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 13:01:25.465598 (ntainerd)[1550]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Dec 16 13:01:25.486860 jq[1544]: true Dec 16 13:01:25.515287 update_engine[1539]: I20251216 13:01:25.495590 1539 main.cc:92] Flatcar Update Engine starting Dec 16 13:01:25.515475 tar[1549]: linux-amd64/LICENSE Dec 16 13:01:25.515593 jq[1564]: true Dec 16 13:01:25.516778 tar[1549]: linux-amd64/helm Dec 16 13:01:25.524246 dbus-daemon[1516]: [system] SELinux support is enabled Dec 16 13:01:25.524386 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 13:01:25.527111 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 13:01:25.527132 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Dec 16 13:01:25.528492 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 13:01:25.528508 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 13:01:25.538475 systemd[1]: Started update-engine.service - Update Engine. Dec 16 13:01:25.540297 update_engine[1539]: I20251216 13:01:25.539216 1539 update_check_scheduler.cc:74] Next update check in 7m31s Dec 16 13:01:25.542525 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 13:01:25.556825 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Dec 16 13:01:25.581560 systemd-logind[1534]: New seat seat0. Dec 16 13:01:25.582299 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 13:01:25.603515 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Dec 16 13:01:25.617778 kernel: ACPI: button: Power Button [PWRF] Dec 16 13:01:25.619916 extend-filesystems[1548]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Dec 16 13:01:25.619916 extend-filesystems[1548]: old_desc_blocks = 1, new_desc_blocks = 5 Dec 16 13:01:25.619916 extend-filesystems[1548]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Dec 16 13:01:25.624358 extend-filesystems[1521]: Resized filesystem in /dev/sda9 Dec 16 13:01:25.622352 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 13:01:25.628388 bash[1586]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:01:25.622515 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 13:01:25.628786 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 13:01:25.633750 systemd[1]: Starting sshkeys.service... 
Dec 16 13:01:25.644300 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 16 13:01:25.651407 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Dec 16 13:01:25.657962 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 13:01:25.670341 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 13:01:25.683674 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 13:01:25.689992 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Dec 16 13:01:25.711800 containerd[1550]: time="2025-12-16T13:01:25Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 13:01:25.712504 containerd[1550]: time="2025-12-16T13:01:25.712483833Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Dec 16 13:01:25.733321 containerd[1550]: time="2025-12-16T13:01:25.733272885Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.199µs" Dec 16 13:01:25.733321 containerd[1550]: time="2025-12-16T13:01:25.733308822Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 13:01:25.733321 containerd[1550]: time="2025-12-16T13:01:25.733327678Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 13:01:25.733472 containerd[1550]: time="2025-12-16T13:01:25.733453023Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 13:01:25.733506 containerd[1550]: 
time="2025-12-16T13:01:25.733472399Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 13:01:25.733506 containerd[1550]: time="2025-12-16T13:01:25.733494220Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:01:25.733568 containerd[1550]: time="2025-12-16T13:01:25.733547730Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 13:01:25.733588 containerd[1550]: time="2025-12-16T13:01:25.733564261Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736441 containerd[1550]: time="2025-12-16T13:01:25.736413254Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736441 containerd[1550]: time="2025-12-16T13:01:25.736435906Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736502 containerd[1550]: time="2025-12-16T13:01:25.736446125Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736502 containerd[1550]: time="2025-12-16T13:01:25.736452617Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736531 containerd[1550]: time="2025-12-16T13:01:25.736521186Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736700 containerd[1550]: time="2025-12-16T13:01:25.736672409Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736733 containerd[1550]: time="2025-12-16T13:01:25.736706934Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 13:01:25.736733 containerd[1550]: time="2025-12-16T13:01:25.736715270Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 13:01:25.743406 containerd[1550]: time="2025-12-16T13:01:25.743347666Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 13:01:25.754260 containerd[1550]: time="2025-12-16T13:01:25.754221155Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 13:01:25.754334 containerd[1550]: time="2025-12-16T13:01:25.754321743Z" level=info msg="metadata content store policy set" policy=shared Dec 16 13:01:25.761266 containerd[1550]: time="2025-12-16T13:01:25.761227623Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 13:01:25.761329 containerd[1550]: time="2025-12-16T13:01:25.761303745Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 13:01:25.761329 containerd[1550]: time="2025-12-16T13:01:25.761317842Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 13:01:25.761359 containerd[1550]: time="2025-12-16T13:01:25.761328101Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 13:01:25.761359 containerd[1550]: time="2025-12-16T13:01:25.761339032Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 13:01:25.761398 containerd[1550]: 
time="2025-12-16T13:01:25.761391179Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 13:01:25.761416 containerd[1550]: time="2025-12-16T13:01:25.761404795Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 13:01:25.761430 containerd[1550]: time="2025-12-16T13:01:25.761416026Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 13:01:25.761430 containerd[1550]: time="2025-12-16T13:01:25.761426165Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 13:01:25.761455 containerd[1550]: time="2025-12-16T13:01:25.761435041Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 13:01:25.761455 containerd[1550]: time="2025-12-16T13:01:25.761442145Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 13:01:25.761455 containerd[1550]: time="2025-12-16T13:01:25.761452113Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 13:01:25.761596 containerd[1550]: time="2025-12-16T13:01:25.761576016Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 13:01:25.761624 containerd[1550]: time="2025-12-16T13:01:25.761599349Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 13:01:25.761624 containerd[1550]: time="2025-12-16T13:01:25.761611843Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 13:01:25.761624 containerd[1550]: time="2025-12-16T13:01:25.761619879Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 13:01:25.764058 containerd[1550]: 
time="2025-12-16T13:01:25.764031521Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 13:01:25.764144 containerd[1550]: time="2025-12-16T13:01:25.764057209Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 13:01:25.764144 containerd[1550]: time="2025-12-16T13:01:25.764070734Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 13:01:25.764144 containerd[1550]: time="2025-12-16T13:01:25.764079360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 13:01:25.764144 containerd[1550]: time="2025-12-16T13:01:25.764128131Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 13:01:25.764209 containerd[1550]: time="2025-12-16T13:01:25.764144682Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 13:01:25.764209 containerd[1550]: time="2025-12-16T13:01:25.764155974Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 13:01:25.764236 containerd[1550]: time="2025-12-16T13:01:25.764208903Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 13:01:25.764236 containerd[1550]: time="2025-12-16T13:01:25.764220395Z" level=info msg="Start snapshots syncer" Dec 16 13:01:25.764263 containerd[1550]: time="2025-12-16T13:01:25.764240742Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 13:01:25.764927 containerd[1550]: time="2025-12-16T13:01:25.764892795Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 13:01:25.765862 containerd[1550]: time="2025-12-16T13:01:25.765840542Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 13:01:25.765910 containerd[1550]: time="2025-12-16T13:01:25.765893582Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 13:01:25.767149 containerd[1550]: time="2025-12-16T13:01:25.767123588Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 13:01:25.767178 containerd[1550]: time="2025-12-16T13:01:25.767153634Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 13:01:25.767178 containerd[1550]: time="2025-12-16T13:01:25.767163573Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 13:01:25.767178 containerd[1550]: time="2025-12-16T13:01:25.767171147Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 13:01:25.767229 containerd[1550]: time="2025-12-16T13:01:25.767179693Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 13:01:25.767229 containerd[1550]: time="2025-12-16T13:01:25.767213507Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 13:01:25.767229 containerd[1550]: time="2025-12-16T13:01:25.767226121Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 13:01:25.767269 containerd[1550]: time="2025-12-16T13:01:25.767244786Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 13:01:25.767269 containerd[1550]: time="2025-12-16T13:01:25.767254534Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 13:01:25.767296 containerd[1550]: time="2025-12-16T13:01:25.767280453Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 13:01:25.768872 containerd[1550]: time="2025-12-16T13:01:25.768845718Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:01:25.769323 containerd[1550]: time="2025-12-16T13:01:25.768872177Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 13:01:25.769323 containerd[1550]: time="2025-12-16T13:01:25.769319696Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:01:25.769367 containerd[1550]: time="2025-12-16T13:01:25.769333402Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 13:01:25.769367 containerd[1550]: time="2025-12-16T13:01:25.769339944Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 13:01:25.769367 containerd[1550]: time="2025-12-16T13:01:25.769349452Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 13:01:25.769367 containerd[1550]: time="2025-12-16T13:01:25.769362226Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 13:01:25.769430 containerd[1550]: time="2025-12-16T13:01:25.769374098Z" level=info msg="runtime interface created" Dec 16 13:01:25.769430 containerd[1550]: time="2025-12-16T13:01:25.769395869Z" level=info msg="created NRI interface" Dec 16 13:01:25.769430 containerd[1550]: time="2025-12-16T13:01:25.769403453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 13:01:25.769430 containerd[1550]: time="2025-12-16T13:01:25.769413321Z" level=info msg="Connect containerd service" Dec 16 13:01:25.769493 containerd[1550]: time="2025-12-16T13:01:25.769429913Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 13:01:25.775078 
containerd[1550]: time="2025-12-16T13:01:25.775045142Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 13:01:25.798979 coreos-metadata[1593]: Dec 16 13:01:25.798 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Dec 16 13:01:25.800175 coreos-metadata[1593]: Dec 16 13:01:25.800 INFO Fetch successful Dec 16 13:01:25.802554 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 13:01:25.804602 unknown[1593]: wrote ssh authorized keys file for user: core Dec 16 13:01:25.824564 locksmithd[1567]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 13:01:25.832914 update-ssh-keys[1622]: Updated "/home/core/.ssh/authorized_keys" Dec 16 13:01:25.834567 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 13:01:25.838597 systemd[1]: Finished sshkeys.service. Dec 16 13:01:25.908212 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Dec 16 13:01:25.914858 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Dec 16 13:01:25.920061 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Dec 16 13:01:25.930506 systemd-logind[1534]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Dec 16 13:01:25.945823 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Dec 16 13:01:25.980855 kernel: EDAC MC: Ver: 3.0.0 Dec 16 13:01:25.981752 sshd_keygen[1560]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.985913111Z" level=info msg="Start subscribing containerd event" Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.985956493Z" level=info msg="Start recovering state" Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986035069Z" level=info msg="Start event monitor" Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986045860Z" level=info msg="Start cni network conf syncer for default" Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986052282Z" level=info msg="Start streaming server" Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986060387Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986066178Z" level=info msg="runtime interface starting up..." Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986071678Z" level=info msg="starting plugins..." Dec 16 13:01:25.986219 containerd[1550]: time="2025-12-16T13:01:25.986081957Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 13:01:25.989345 containerd[1550]: time="2025-12-16T13:01:25.989292588Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 13:01:25.989406 containerd[1550]: time="2025-12-16T13:01:25.989366837Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Dec 16 13:01:25.991188 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 13:01:25.992410 containerd[1550]: time="2025-12-16T13:01:25.992387201Z" level=info msg="containerd successfully booted in 0.280849s" Dec 16 13:01:25.997827 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Dec 16 13:01:26.009182 kernel: Console: switching to colour dummy device 80x25 Dec 16 13:01:26.008369 systemd-logind[1534]: Watching system buttons on /dev/input/event3 (Power Button) Dec 16 13:01:26.011555 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 13:01:26.011588 kernel: [drm] features: -context_init Dec 16 13:01:26.018526 kernel: [drm] number of scanouts: 1 Dec 16 13:01:26.018577 kernel: [drm] number of cap sets: 0 Dec 16 13:01:26.018587 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 13:01:26.031738 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Dec 16 13:01:26.031796 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 13:01:26.060869 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 13:01:26.068208 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 13:01:26.074623 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 13:01:26.075365 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:01:26.078661 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Dec 16 13:01:26.092539 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 13:01:26.094967 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 13:01:26.111901 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 13:01:26.112067 systemd[1]: Finished issuegen.service - Generate /run/issue. 
Dec 16 13:01:26.113483 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 13:01:26.136228 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 13:01:26.138957 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 13:01:26.143566 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Dec 16 13:01:26.144618 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 13:01:26.172820 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 13:01:26.211436 tar[1549]: linux-amd64/README.md Dec 16 13:01:26.224359 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 13:01:26.387152 coreos-metadata[1515]: Dec 16 13:01:26.386 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2 Dec 16 13:01:26.388114 coreos-metadata[1515]: Dec 16 13:01:26.388 INFO Fetch successful Dec 16 13:01:26.388914 coreos-metadata[1515]: Dec 16 13:01:26.388 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Dec 16 13:01:26.389456 coreos-metadata[1515]: Dec 16 13:01:26.389 INFO Fetch successful Dec 16 13:01:26.395921 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 13:01:26.399156 systemd[1]: Started sshd@0-77.42.23.34:22-139.178.89.65:52162.service - OpenSSH per-connection server daemon (139.178.89.65:52162). Dec 16 13:01:26.460778 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 13:01:26.464275 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 13:01:26.561072 systemd-networkd[1467]: eth1: Gained IPv6LL Dec 16 13:01:26.562262 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Dec 16 13:01:26.564010 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. 
Dec 16 13:01:26.564699 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 13:01:26.577590 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:01:26.580535 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 13:01:26.625837 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 13:01:26.753074 systemd-networkd[1467]: eth0: Gained IPv6LL Dec 16 13:01:26.753852 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Dec 16 13:01:27.567446 sshd[1675]: Accepted publickey for core from 139.178.89.65 port 52162 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:27.569662 sshd-session[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:27.585347 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 13:01:27.588153 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 13:01:27.618872 systemd-logind[1534]: New session 1 of user core. Dec 16 13:01:27.629369 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 13:01:27.635679 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 13:01:27.653517 (systemd)[1699]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 13:01:27.657172 systemd-logind[1534]: New session c1 of user core. Dec 16 13:01:27.794447 systemd[1699]: Queued start job for default target default.target. Dec 16 13:01:27.799711 systemd[1699]: Created slice app.slice - User Application Slice. Dec 16 13:01:27.799732 systemd[1699]: Reached target paths.target - Paths. Dec 16 13:01:27.799763 systemd[1699]: Reached target timers.target - Timers. Dec 16 13:01:27.802980 systemd[1699]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Dec 16 13:01:27.806662 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:01:27.807525 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 13:01:27.813064 systemd[1699]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 13:01:27.813186 systemd[1699]: Reached target sockets.target - Sockets. Dec 16 13:01:27.813293 systemd[1699]: Reached target basic.target - Basic System. Dec 16 13:01:27.813371 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 13:01:27.814132 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:01:27.814581 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 13:01:27.814877 systemd[1699]: Reached target default.target - Main User Target. Dec 16 13:01:27.814902 systemd[1699]: Startup finished in 149ms. Dec 16 13:01:27.818629 systemd[1]: Startup finished in 3.310s (kernel) + 5.689s (initrd) + 4.541s (userspace) = 13.541s. Dec 16 13:01:28.471022 kubelet[1711]: E1216 13:01:28.470940 1711 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:01:28.474011 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:01:28.474267 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:01:28.474917 systemd[1]: kubelet.service: Consumed 1.226s CPU time, 258.2M memory peak. Dec 16 13:01:28.617181 systemd[1]: Started sshd@1-77.42.23.34:22-139.178.89.65:52164.service - OpenSSH per-connection server daemon (139.178.89.65:52164). 
Dec 16 13:01:29.766349 sshd[1726]: Accepted publickey for core from 139.178.89.65 port 52164 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:29.767648 sshd-session[1726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:29.773079 systemd-logind[1534]: New session 2 of user core. Dec 16 13:01:29.781950 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 13:01:30.553250 sshd[1729]: Connection closed by 139.178.89.65 port 52164 Dec 16 13:01:30.553922 sshd-session[1726]: pam_unix(sshd:session): session closed for user core Dec 16 13:01:30.557412 systemd-logind[1534]: Session 2 logged out. Waiting for processes to exit. Dec 16 13:01:30.557930 systemd[1]: sshd@1-77.42.23.34:22-139.178.89.65:52164.service: Deactivated successfully. Dec 16 13:01:30.559687 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 13:01:30.561293 systemd-logind[1534]: Removed session 2. Dec 16 13:01:30.709714 systemd[1]: Started sshd@2-77.42.23.34:22-139.178.89.65:33766.service - OpenSSH per-connection server daemon (139.178.89.65:33766). Dec 16 13:01:31.745575 sshd[1735]: Accepted publickey for core from 139.178.89.65 port 33766 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:31.747233 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:31.752521 systemd-logind[1534]: New session 3 of user core. Dec 16 13:01:31.759988 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 13:01:32.447589 sshd[1738]: Connection closed by 139.178.89.65 port 33766 Dec 16 13:01:32.448142 sshd-session[1735]: pam_unix(sshd:session): session closed for user core Dec 16 13:01:32.451725 systemd-logind[1534]: Session 3 logged out. Waiting for processes to exit. Dec 16 13:01:32.451963 systemd[1]: sshd@2-77.42.23.34:22-139.178.89.65:33766.service: Deactivated successfully. 
Dec 16 13:01:32.453566 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 13:01:32.454785 systemd-logind[1534]: Removed session 3. Dec 16 13:01:32.676925 systemd[1]: Started sshd@3-77.42.23.34:22-139.178.89.65:33776.service - OpenSSH per-connection server daemon (139.178.89.65:33776). Dec 16 13:01:33.837004 sshd[1744]: Accepted publickey for core from 139.178.89.65 port 33776 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:33.838610 sshd-session[1744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:33.844640 systemd-logind[1534]: New session 4 of user core. Dec 16 13:01:33.852004 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 13:01:34.636845 sshd[1747]: Connection closed by 139.178.89.65 port 33776 Dec 16 13:01:34.637638 sshd-session[1744]: pam_unix(sshd:session): session closed for user core Dec 16 13:01:34.642352 systemd[1]: sshd@3-77.42.23.34:22-139.178.89.65:33776.service: Deactivated successfully. Dec 16 13:01:34.644780 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 13:01:34.646636 systemd-logind[1534]: Session 4 logged out. Waiting for processes to exit. Dec 16 13:01:34.649200 systemd-logind[1534]: Removed session 4. Dec 16 13:01:34.804174 systemd[1]: Started sshd@4-77.42.23.34:22-139.178.89.65:33790.service - OpenSSH per-connection server daemon (139.178.89.65:33790). Dec 16 13:01:35.884959 sshd[1753]: Accepted publickey for core from 139.178.89.65 port 33790 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:35.886140 sshd-session[1753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:35.890705 systemd-logind[1534]: New session 5 of user core. Dec 16 13:01:35.900941 systemd[1]: Started session-5.scope - Session 5 of User core. 
Dec 16 13:01:36.452765 sudo[1757]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 13:01:36.453306 sudo[1757]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:01:36.467563 sudo[1757]: pam_unix(sudo:session): session closed for user root Dec 16 13:01:36.638191 sshd[1756]: Connection closed by 139.178.89.65 port 33790 Dec 16 13:01:36.639100 sshd-session[1753]: pam_unix(sshd:session): session closed for user core Dec 16 13:01:36.644328 systemd[1]: sshd@4-77.42.23.34:22-139.178.89.65:33790.service: Deactivated successfully. Dec 16 13:01:36.644342 systemd-logind[1534]: Session 5 logged out. Waiting for processes to exit. Dec 16 13:01:36.646653 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 13:01:36.648591 systemd-logind[1534]: Removed session 5. Dec 16 13:01:36.825192 systemd[1]: Started sshd@5-77.42.23.34:22-139.178.89.65:33798.service - OpenSSH per-connection server daemon (139.178.89.65:33798). Dec 16 13:01:37.863303 sshd[1763]: Accepted publickey for core from 139.178.89.65 port 33798 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:37.864632 sshd-session[1763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:37.869203 systemd-logind[1534]: New session 6 of user core. Dec 16 13:01:37.877074 systemd[1]: Started session-6.scope - Session 6 of User core. 
Dec 16 13:01:38.409922 sudo[1768]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 13:01:38.410270 sudo[1768]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:01:38.415209 sudo[1768]: pam_unix(sudo:session): session closed for user root Dec 16 13:01:38.420326 sudo[1767]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 13:01:38.420561 sudo[1767]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:01:38.430319 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 13:01:38.472274 augenrules[1790]: No rules Dec 16 13:01:38.473469 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 13:01:38.473659 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 13:01:38.475081 sudo[1767]: pam_unix(sudo:session): session closed for user root Dec 16 13:01:38.477383 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 13:01:38.479270 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:01:38.593132 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 13:01:38.599078 (kubelet)[1803]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:01:38.633740 kubelet[1803]: E1216 13:01:38.633684 1803 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:01:38.637237 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:01:38.637356 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:01:38.637760 systemd[1]: kubelet.service: Consumed 126ms CPU time, 110.3M memory peak. Dec 16 13:01:38.643520 sshd[1766]: Connection closed by 139.178.89.65 port 33798 Dec 16 13:01:38.643941 sshd-session[1763]: pam_unix(sshd:session): session closed for user core Dec 16 13:01:38.647390 systemd-logind[1534]: Session 6 logged out. Waiting for processes to exit. Dec 16 13:01:38.647958 systemd[1]: sshd@5-77.42.23.34:22-139.178.89.65:33798.service: Deactivated successfully. Dec 16 13:01:38.649541 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 13:01:38.650761 systemd-logind[1534]: Removed session 6. Dec 16 13:01:38.825137 systemd[1]: Started sshd@6-77.42.23.34:22-139.178.89.65:33814.service - OpenSSH per-connection server daemon (139.178.89.65:33814). Dec 16 13:01:39.875357 sshd[1814]: Accepted publickey for core from 139.178.89.65 port 33814 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:01:39.876591 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:01:39.882200 systemd-logind[1534]: New session 7 of user core. Dec 16 13:01:39.889014 systemd[1]: Started session-7.scope - Session 7 of User core. 
Dec 16 13:01:40.428499 sudo[1818]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 13:01:40.428732 sudo[1818]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 13:01:40.863182 systemd[1]: Starting docker.service - Docker Application Container Engine... Dec 16 13:01:40.890432 (dockerd)[1836]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 13:01:41.138992 dockerd[1836]: time="2025-12-16T13:01:41.138871180Z" level=info msg="Starting up" Dec 16 13:01:41.139736 dockerd[1836]: time="2025-12-16T13:01:41.139712408Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 13:01:41.147967 dockerd[1836]: time="2025-12-16T13:01:41.147913736Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 13:01:41.188967 dockerd[1836]: time="2025-12-16T13:01:41.188922111Z" level=info msg="Loading containers: start." Dec 16 13:01:41.200858 kernel: Initializing XFRM netlink socket Dec 16 13:01:41.354754 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Dec 16 13:01:42.298988 systemd-resolved[1413]: Clock change detected. Flushing caches. Dec 16 13:01:42.299242 systemd-timesyncd[1451]: Contacted time server 195.201.20.16:123 (2.flatcar.pool.ntp.org). Dec 16 13:01:42.299321 systemd-timesyncd[1451]: Initial clock synchronization to Tue 2025-12-16 13:01:42.298886 UTC. Dec 16 13:01:42.311491 systemd-networkd[1467]: docker0: Link UP Dec 16 13:01:42.314465 dockerd[1836]: time="2025-12-16T13:01:42.314420702Z" level=info msg="Loading containers: done." 
Dec 16 13:01:42.333076 dockerd[1836]: time="2025-12-16T13:01:42.333031930Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 13:01:42.333176 dockerd[1836]: time="2025-12-16T13:01:42.333114495Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 13:01:42.333197 dockerd[1836]: time="2025-12-16T13:01:42.333181049Z" level=info msg="Initializing buildkit" Dec 16 13:01:42.351895 dockerd[1836]: time="2025-12-16T13:01:42.351869042Z" level=info msg="Completed buildkit initialization" Dec 16 13:01:42.359768 dockerd[1836]: time="2025-12-16T13:01:42.359568449Z" level=info msg="Daemon has completed initialization" Dec 16 13:01:42.359768 dockerd[1836]: time="2025-12-16T13:01:42.359666222Z" level=info msg="API listen on /run/docker.sock" Dec 16 13:01:42.359747 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 13:01:43.364430 containerd[1550]: time="2025-12-16T13:01:43.364331101Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 13:01:43.999514 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2274505284.mount: Deactivated successfully. 
Dec 16 13:01:44.974805 containerd[1550]: time="2025-12-16T13:01:44.974064065Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:44.974805 containerd[1550]: time="2025-12-16T13:01:44.974781680Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=27068173" Dec 16 13:01:44.975525 containerd[1550]: time="2025-12-16T13:01:44.975506219Z" level=info msg="ImageCreate event name:\"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:44.977468 containerd[1550]: time="2025-12-16T13:01:44.977430847Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:44.978214 containerd[1550]: time="2025-12-16T13:01:44.978190552Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"27064672\" in 1.613787986s" Dec 16 13:01:44.978253 containerd[1550]: time="2025-12-16T13:01:44.978218394Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:aa27095f5619377172f3d59289ccb2ba567ebea93a736d1705be068b2c030b0c\"" Dec 16 13:01:44.978734 containerd[1550]: time="2025-12-16T13:01:44.978713292Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 13:01:45.991097 containerd[1550]: time="2025-12-16T13:01:45.991030961Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:45.992242 containerd[1550]: time="2025-12-16T13:01:45.992049100Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=21162462" Dec 16 13:01:45.993232 containerd[1550]: time="2025-12-16T13:01:45.993207362Z" level=info msg="ImageCreate event name:\"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:45.996662 containerd[1550]: time="2025-12-16T13:01:45.995846280Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:45.996662 containerd[1550]: time="2025-12-16T13:01:45.996547385Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"22819474\" in 1.017809727s" Dec 16 13:01:45.996662 containerd[1550]: time="2025-12-16T13:01:45.996569006Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:5826b25d990d7d314d236c8d128f43e443583891f5cdffa7bf8bca50ae9e0942\"" Dec 16 13:01:45.997079 containerd[1550]: time="2025-12-16T13:01:45.997046230Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 13:01:46.999547 containerd[1550]: time="2025-12-16T13:01:46.999484153Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:47.000343 containerd[1550]: time="2025-12-16T13:01:47.000327004Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=15725949" Dec 16 13:01:47.001065 containerd[1550]: time="2025-12-16T13:01:47.001034250Z" level=info msg="ImageCreate event name:\"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:47.003383 containerd[1550]: time="2025-12-16T13:01:47.003336838Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:47.004289 containerd[1550]: time="2025-12-16T13:01:47.004194456Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"17382979\" in 1.00698546s" Dec 16 13:01:47.004289 containerd[1550]: time="2025-12-16T13:01:47.004218150Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:aec12dadf56dd45659a682b94571f115a1be02ee4a262b3b5176394f5c030c78\"" Dec 16 13:01:47.006614 containerd[1550]: time="2025-12-16T13:01:47.006598854Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 13:01:48.018119 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1211547134.mount: Deactivated successfully. 
Dec 16 13:01:48.272385 containerd[1550]: time="2025-12-16T13:01:48.272184636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:48.273262 containerd[1550]: time="2025-12-16T13:01:48.273114761Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=25965321" Dec 16 13:01:48.273950 containerd[1550]: time="2025-12-16T13:01:48.273922865Z" level=info msg="ImageCreate event name:\"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:48.275418 containerd[1550]: time="2025-12-16T13:01:48.275387672Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:48.275852 containerd[1550]: time="2025-12-16T13:01:48.275808151Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"25964312\" in 1.26912041s" Dec 16 13:01:48.275953 containerd[1550]: time="2025-12-16T13:01:48.275934267Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:36eef8e07bdd6abdc2bbf44041e49480fe499a3cedb0ae054b50daa1a35cf691\"" Dec 16 13:01:48.276423 containerd[1550]: time="2025-12-16T13:01:48.276395162Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 13:01:48.740003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2265320435.mount: Deactivated successfully. 
Dec 16 13:01:49.585292 containerd[1550]: time="2025-12-16T13:01:49.585220292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:49.586602 containerd[1550]: time="2025-12-16T13:01:49.586335864Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388101" Dec 16 13:01:49.587505 containerd[1550]: time="2025-12-16T13:01:49.587475110Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:49.589909 containerd[1550]: time="2025-12-16T13:01:49.589868147Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:49.590617 containerd[1550]: time="2025-12-16T13:01:49.590586213Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.314157429s" Dec 16 13:01:49.590658 containerd[1550]: time="2025-12-16T13:01:49.590619456Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Dec 16 13:01:49.591393 containerd[1550]: time="2025-12-16T13:01:49.591355637Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 13:01:49.610755 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 13:01:49.613119 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 13:01:49.709210 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:01:49.717079 (kubelet)[2181]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 13:01:49.759275 kubelet[2181]: E1216 13:01:49.759191 2181 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 13:01:49.761943 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 13:01:49.762081 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 13:01:49.762366 systemd[1]: kubelet.service: Consumed 123ms CPU time, 107.9M memory peak. Dec 16 13:01:50.046470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3703583423.mount: Deactivated successfully. 
Dec 16 13:01:50.053503 containerd[1550]: time="2025-12-16T13:01:50.053442325Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:50.054465 containerd[1550]: time="2025-12-16T13:01:50.054255821Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321240" Dec 16 13:01:50.055459 containerd[1550]: time="2025-12-16T13:01:50.055404796Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:50.057803 containerd[1550]: time="2025-12-16T13:01:50.057769019Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:50.058488 containerd[1550]: time="2025-12-16T13:01:50.058456928Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 467.074051ms" Dec 16 13:01:50.058603 containerd[1550]: time="2025-12-16T13:01:50.058586702Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Dec 16 13:01:50.059123 containerd[1550]: time="2025-12-16T13:01:50.059086299Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 13:01:50.571545 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1281225927.mount: Deactivated successfully. 
Dec 16 13:01:52.482679 containerd[1550]: time="2025-12-16T13:01:52.482635189Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:52.483629 containerd[1550]: time="2025-12-16T13:01:52.483419891Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=74166872" Dec 16 13:01:52.484378 containerd[1550]: time="2025-12-16T13:01:52.484351098Z" level=info msg="ImageCreate event name:\"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:52.486576 containerd[1550]: time="2025-12-16T13:01:52.486553677Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:01:52.487326 containerd[1550]: time="2025-12-16T13:01:52.487304745Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"74311308\" in 2.428189263s" Dec 16 13:01:52.487403 containerd[1550]: time="2025-12-16T13:01:52.487390987Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:5f1f5298c888daa46c4409ff4cefe5ca9d16e479419f94cdb5f5d5563dac0115\"" Dec 16 13:01:55.145007 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:01:55.145179 systemd[1]: kubelet.service: Consumed 123ms CPU time, 107.9M memory peak. Dec 16 13:01:55.148198 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:01:55.177262 systemd[1]: Reload requested from client PID 2274 ('systemctl') (unit session-7.scope)... 
Dec 16 13:01:55.177288 systemd[1]: Reloading... Dec 16 13:01:55.264899 zram_generator::config[2321]: No configuration found. Dec 16 13:01:55.442955 systemd[1]: Reloading finished in 265 ms. Dec 16 13:01:55.499022 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:01:55.502231 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:01:55.503219 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 13:01:55.503597 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:01:55.503734 systemd[1]: kubelet.service: Consumed 74ms CPU time, 98.2M memory peak. Dec 16 13:01:55.505401 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 13:01:55.616476 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 13:01:55.619419 (kubelet)[2374]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 13:01:55.652008 kubelet[2374]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 13:01:55.652008 kubelet[2374]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 16 13:01:55.657125 kubelet[2374]: I1216 13:01:55.657080 2374 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 13:01:56.118426 kubelet[2374]: I1216 13:01:56.118379 2374 server.go:529] "Kubelet version" kubeletVersion="v1.34.1"
Dec 16 13:01:56.118426 kubelet[2374]: I1216 13:01:56.118410 2374 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 13:01:56.119582 kubelet[2374]: I1216 13:01:56.119555 2374 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Dec 16 13:01:56.119582 kubelet[2374]: I1216 13:01:56.119575 2374 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 16 13:01:56.119808 kubelet[2374]: I1216 13:01:56.119786 2374 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 16 13:01:56.128548 kubelet[2374]: I1216 13:01:56.128508 2374 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 13:01:56.139687 kubelet[2374]: E1216 13:01:56.139658 2374 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://77.42.23.34:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Dec 16 13:01:56.147117 kubelet[2374]: I1216 13:01:56.147085 2374 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 13:01:56.151978 kubelet[2374]: I1216 13:01:56.151924 2374 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Dec 16 13:01:56.154028 kubelet[2374]: I1216 13:01:56.153987 2374 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 13:01:56.155390 kubelet[2374]: I1216 13:01:56.154020 2374 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-4-07f930e259","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 13:01:56.155390 kubelet[2374]: I1216 13:01:56.155389 2374 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 13:01:56.155521 kubelet[2374]: I1216 13:01:56.155399 2374 container_manager_linux.go:306] "Creating device plugin manager"
Dec 16 13:01:56.155521 kubelet[2374]: I1216 13:01:56.155488 2374 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Dec 16 13:01:56.158491 kubelet[2374]: I1216 13:01:56.158454 2374 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 13:01:56.158649 kubelet[2374]: I1216 13:01:56.158616 2374 kubelet.go:475] "Attempting to sync node with API server"
Dec 16 13:01:56.158649 kubelet[2374]: I1216 13:01:56.158634 2374 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 13:01:56.158649 kubelet[2374]: I1216 13:01:56.158652 2374 kubelet.go:387] "Adding apiserver pod source"
Dec 16 13:01:56.158724 kubelet[2374]: I1216 13:01:56.158669 2374 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 13:01:56.164429 kubelet[2374]: E1216 13:01:56.164398 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.23.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-4-07f930e259&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 16 13:01:56.164503 kubelet[2374]: E1216 13:01:56.164456 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://77.42.23.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 16 13:01:56.164778 kubelet[2374]: I1216 13:01:56.164759 2374 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 16 13:01:56.167686 kubelet[2374]: I1216 13:01:56.167655 2374 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 16 13:01:56.167686 kubelet[2374]: I1216 13:01:56.167688 2374 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Dec 16 13:01:56.167757 kubelet[2374]: W1216 13:01:56.167728 2374 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Dec 16 13:01:56.171231 kubelet[2374]: I1216 13:01:56.171045 2374 server.go:1262] "Started kubelet"
Dec 16 13:01:56.171719 kubelet[2374]: I1216 13:01:56.171688 2374 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 13:01:56.177873 kubelet[2374]: E1216 13:01:56.175359 2374 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://77.42.23.34:6443/api/v1/namespaces/default/events\": dial tcp 77.42.23.34:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-2-4-07f930e259.1881b3b45ebf219c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-2-4-07f930e259,UID:ci-4459-2-2-4-07f930e259,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-2-4-07f930e259,},FirstTimestamp:2025-12-16 13:01:56.171014556 +0000 UTC m=+0.549219884,LastTimestamp:2025-12-16 13:01:56.171014556 +0000 UTC m=+0.549219884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-2-4-07f930e259,}"
Dec 16 13:01:56.177873 kubelet[2374]: I1216 13:01:56.176763 2374 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 13:01:56.183086 kubelet[2374]: I1216 13:01:56.182700 2374 server.go:310] "Adding debug handlers to kubelet server"
Dec 16 13:01:56.184427 kubelet[2374]: I1216 13:01:56.184401 2374 volume_manager.go:313] "Starting Kubelet Volume Manager"
Dec 16 13:01:56.184654 kubelet[2374]: E1216 13:01:56.184629 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:56.186875 kubelet[2374]: I1216 13:01:56.186846 2374 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 16 13:01:56.186925 kubelet[2374]: I1216 13:01:56.186910 2374 reconciler.go:29] "Reconciler: start to sync state"
Dec 16 13:01:56.187802 kubelet[2374]: I1216 13:01:56.187773 2374 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 13:01:56.187925 kubelet[2374]: I1216 13:01:56.187911 2374 server_v1.go:49] "podresources" method="list" useActivePods=true
Dec 16 13:01:56.188137 kubelet[2374]: I1216 13:01:56.188096 2374 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 13:01:56.189233 kubelet[2374]: I1216 13:01:56.189199 2374 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 16 13:01:56.196331 kubelet[2374]: E1216 13:01:56.196155 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.23.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Dec 16 13:01:56.196427 kubelet[2374]: I1216 13:01:56.196412 2374 factory.go:223] Registration of the systemd container factory successfully
Dec 16 13:01:56.196709 kubelet[2374]: I1216 13:01:56.196492 2374 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 16 13:01:56.199133 kubelet[2374]: I1216 13:01:56.199083 2374 factory.go:223] Registration of the containerd container factory successfully
Dec 16 13:01:56.200445 kubelet[2374]: E1216 13:01:56.200411 2374 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-07f930e259?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="200ms"
Dec 16 13:01:56.202986 kubelet[2374]: I1216 13:01:56.202947 2374 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Dec 16 13:01:56.203889 kubelet[2374]: I1216 13:01:56.203857 2374 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Dec 16 13:01:56.203889 kubelet[2374]: I1216 13:01:56.203873 2374 status_manager.go:244] "Starting to sync pod status with apiserver"
Dec 16 13:01:56.203889 kubelet[2374]: I1216 13:01:56.203889 2374 kubelet.go:2427] "Starting kubelet main sync loop"
Dec 16 13:01:56.203981 kubelet[2374]: E1216 13:01:56.203915 2374 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 13:01:56.225906 kubelet[2374]: E1216 13:01:56.225875 2374 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 16 13:01:56.226094 kubelet[2374]: E1216 13:01:56.226067 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://77.42.23.34:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Dec 16 13:01:56.230051 kubelet[2374]: I1216 13:01:56.230029 2374 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 16 13:01:56.230051 kubelet[2374]: I1216 13:01:56.230043 2374 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 16 13:01:56.230051 kubelet[2374]: I1216 13:01:56.230054 2374 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 13:01:56.231702 kubelet[2374]: I1216 13:01:56.231683 2374 policy_none.go:49] "None policy: Start"
Dec 16 13:01:56.231702 kubelet[2374]: I1216 13:01:56.231698 2374 memory_manager.go:187] "Starting memorymanager" policy="None"
Dec 16 13:01:56.231781 kubelet[2374]: I1216 13:01:56.231706 2374 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Dec 16 13:01:56.232582 kubelet[2374]: I1216 13:01:56.232561 2374 policy_none.go:47] "Start"
Dec 16 13:01:56.236215 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Dec 16 13:01:56.250615 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Dec 16 13:01:56.253175 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Dec 16 13:01:56.260721 kubelet[2374]: E1216 13:01:56.260699 2374 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 16 13:01:56.261231 kubelet[2374]: I1216 13:01:56.261175 2374 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 13:01:56.261231 kubelet[2374]: I1216 13:01:56.261188 2374 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 13:01:56.261808 kubelet[2374]: I1216 13:01:56.261794 2374 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 13:01:56.263804 kubelet[2374]: E1216 13:01:56.263758 2374 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 16 13:01:56.263804 kubelet[2374]: E1216 13:01:56.263800 2374 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:56.317867 systemd[1]: Created slice kubepods-burstable-pod6bcfe440740b57e762c96822e537f98c.slice - libcontainer container kubepods-burstable-pod6bcfe440740b57e762c96822e537f98c.slice.
Dec 16 13:01:56.332434 kubelet[2374]: E1216 13:01:56.332390 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.335865 systemd[1]: Created slice kubepods-burstable-pod5260260b4201445e7c10c6e8db190289.slice - libcontainer container kubepods-burstable-pod5260260b4201445e7c10c6e8db190289.slice.
Dec 16 13:01:56.342284 kubelet[2374]: E1216 13:01:56.342256 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.344442 systemd[1]: Created slice kubepods-burstable-pod607c34f2651ba9bd2e9ed872b8b7cbf3.slice - libcontainer container kubepods-burstable-pod607c34f2651ba9bd2e9ed872b8b7cbf3.slice.
Dec 16 13:01:56.346437 kubelet[2374]: E1216 13:01:56.346398 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.364018 kubelet[2374]: I1216 13:01:56.363992 2374 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.364327 kubelet[2374]: E1216 13:01:56.364295 2374 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.401568 kubelet[2374]: E1216 13:01:56.401457 2374 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-07f930e259?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="400ms"
Dec 16 13:01:56.488049 kubelet[2374]: I1216 13:01:56.487995 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488049 kubelet[2374]: I1216 13:01:56.488043 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488049 kubelet[2374]: I1216 13:01:56.488070 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/607c34f2651ba9bd2e9ed872b8b7cbf3-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-4-07f930e259\" (UID: \"607c34f2651ba9bd2e9ed872b8b7cbf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488431 kubelet[2374]: I1216 13:01:56.488089 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488431 kubelet[2374]: I1216 13:01:56.488113 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5260260b4201445e7c10c6e8db190289-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-4-07f930e259\" (UID: \"5260260b4201445e7c10c6e8db190289\") " pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488431 kubelet[2374]: I1216 13:01:56.488134 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/607c34f2651ba9bd2e9ed872b8b7cbf3-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-4-07f930e259\" (UID: \"607c34f2651ba9bd2e9ed872b8b7cbf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488431 kubelet[2374]: I1216 13:01:56.488192 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/607c34f2651ba9bd2e9ed872b8b7cbf3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-4-07f930e259\" (UID: \"607c34f2651ba9bd2e9ed872b8b7cbf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488431 kubelet[2374]: I1216 13:01:56.488234 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.488557 kubelet[2374]: I1216 13:01:56.488251 2374 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.567320 kubelet[2374]: I1216 13:01:56.567272 2374 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.567727 kubelet[2374]: E1216 13:01:56.567662 2374 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.636270 containerd[1550]: time="2025-12-16T13:01:56.636223140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-4-07f930e259,Uid:6bcfe440740b57e762c96822e537f98c,Namespace:kube-system,Attempt:0,}"
Dec 16 13:01:56.647299 containerd[1550]: time="2025-12-16T13:01:56.647235900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-4-07f930e259,Uid:5260260b4201445e7c10c6e8db190289,Namespace:kube-system,Attempt:0,}"
Dec 16 13:01:56.648841 containerd[1550]: time="2025-12-16T13:01:56.648788892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-4-07f930e259,Uid:607c34f2651ba9bd2e9ed872b8b7cbf3,Namespace:kube-system,Attempt:0,}"
Dec 16 13:01:56.802277 kubelet[2374]: E1216 13:01:56.802178 2374 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-07f930e259?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="800ms"
Dec 16 13:01:56.969831 kubelet[2374]: I1216 13:01:56.969774 2374 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:56.970170 kubelet[2374]: E1216 13:01:56.970135 2374 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://77.42.23.34:6443/api/v1/nodes\": dial tcp 77.42.23.34:6443: connect: connection refused" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:57.067095 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2287382168.mount: Deactivated successfully.
Dec 16 13:01:57.072553 containerd[1550]: time="2025-12-16T13:01:57.072519751Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 13:01:57.074800 containerd[1550]: time="2025-12-16T13:01:57.074761043Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Dec 16 13:01:57.075509 containerd[1550]: time="2025-12-16T13:01:57.075471776Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 13:01:57.077097 containerd[1550]: time="2025-12-16T13:01:57.077063190Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 13:01:57.077871 containerd[1550]: time="2025-12-16T13:01:57.077746983Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Dec 16 13:01:57.078694 containerd[1550]: time="2025-12-16T13:01:57.078671917Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 13:01:57.079643 containerd[1550]: time="2025-12-16T13:01:57.079610947Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 13:01:57.080525 containerd[1550]: time="2025-12-16T13:01:57.080184613Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Dec 16 13:01:57.080525 containerd[1550]: time="2025-12-16T13:01:57.080200463Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 439.109465ms"
Dec 16 13:01:57.082081 containerd[1550]: time="2025-12-16T13:01:57.082049460Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 433.55565ms"
Dec 16 13:01:57.088460 containerd[1550]: time="2025-12-16T13:01:57.088421388Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 437.692528ms"
Dec 16 13:01:57.156687 containerd[1550]: time="2025-12-16T13:01:57.156640806Z" level=info msg="connecting to shim d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68" address="unix:///run/containerd/s/7a1c025d8923a345aafadf21651f8cd25461f7d8d814c2df344f5c34b4e2ecaf" namespace=k8s.io protocol=ttrpc version=3
Dec 16 13:01:57.156927 containerd[1550]: time="2025-12-16T13:01:57.156904641Z" level=info msg="connecting to shim fb8706562bc699603916f6bfcea3648e99d0c1e1723dc63f135c59e802fc4bb4" address="unix:///run/containerd/s/b8440a8be23ba5ae23cf59e34bbac4a0949da91a47f126dde35af91aeb2db308" namespace=k8s.io protocol=ttrpc version=3
Dec 16 13:01:57.162474 containerd[1550]: time="2025-12-16T13:01:57.162447895Z" level=info msg="connecting to shim b4754eefda32a639ff00136e7eb44968e5ca1f94300a37f3ce21331391e82e4a" address="unix:///run/containerd/s/eb73c49e27892ec5c8c27eb77721fba765612b540aa96176e4bfa41f61ed8604" namespace=k8s.io protocol=ttrpc version=3
Dec 16 13:01:57.228951 systemd[1]: Started cri-containerd-d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68.scope - libcontainer container d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68.
Dec 16 13:01:57.232480 systemd[1]: Started cri-containerd-b4754eefda32a639ff00136e7eb44968e5ca1f94300a37f3ce21331391e82e4a.scope - libcontainer container b4754eefda32a639ff00136e7eb44968e5ca1f94300a37f3ce21331391e82e4a.
Dec 16 13:01:57.233502 systemd[1]: Started cri-containerd-fb8706562bc699603916f6bfcea3648e99d0c1e1723dc63f135c59e802fc4bb4.scope - libcontainer container fb8706562bc699603916f6bfcea3648e99d0c1e1723dc63f135c59e802fc4bb4.
Dec 16 13:01:57.303361 containerd[1550]: time="2025-12-16T13:01:57.303322629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-2-4-07f930e259,Uid:607c34f2651ba9bd2e9ed872b8b7cbf3,Namespace:kube-system,Attempt:0,} returns sandbox id \"fb8706562bc699603916f6bfcea3648e99d0c1e1723dc63f135c59e802fc4bb4\""
Dec 16 13:01:57.303503 containerd[1550]: time="2025-12-16T13:01:57.303482109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-2-4-07f930e259,Uid:6bcfe440740b57e762c96822e537f98c,Namespace:kube-system,Attempt:0,} returns sandbox id \"d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68\""
Dec 16 13:01:57.310024 containerd[1550]: time="2025-12-16T13:01:57.310001824Z" level=info msg="CreateContainer within sandbox \"d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Dec 16 13:01:57.311748 containerd[1550]: time="2025-12-16T13:01:57.311439169Z" level=info msg="CreateContainer within sandbox \"fb8706562bc699603916f6bfcea3648e99d0c1e1723dc63f135c59e802fc4bb4\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Dec 16 13:01:57.314529 kubelet[2374]: E1216 13:01:57.314478 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://77.42.23.34:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Dec 16 13:01:57.323332 containerd[1550]: time="2025-12-16T13:01:57.323146662Z" level=info msg="Container f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:01:57.323400 containerd[1550]: time="2025-12-16T13:01:57.323336908Z" level=info msg="Container cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:01:57.325569 containerd[1550]: time="2025-12-16T13:01:57.325513550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-2-4-07f930e259,Uid:5260260b4201445e7c10c6e8db190289,Namespace:kube-system,Attempt:0,} returns sandbox id \"b4754eefda32a639ff00136e7eb44968e5ca1f94300a37f3ce21331391e82e4a\""
Dec 16 13:01:57.329585 containerd[1550]: time="2025-12-16T13:01:57.329389067Z" level=info msg="CreateContainer within sandbox \"b4754eefda32a639ff00136e7eb44968e5ca1f94300a37f3ce21331391e82e4a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Dec 16 13:01:57.338298 containerd[1550]: time="2025-12-16T13:01:57.338274588Z" level=info msg="Container 9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:01:57.340221 containerd[1550]: time="2025-12-16T13:01:57.340143292Z" level=info msg="CreateContainer within sandbox \"fb8706562bc699603916f6bfcea3648e99d0c1e1723dc63f135c59e802fc4bb4\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55\""
Dec 16 13:01:57.340778 containerd[1550]: time="2025-12-16T13:01:57.340737616Z" level=info msg="StartContainer for \"cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55\""
Dec 16 13:01:57.343234 containerd[1550]: time="2025-12-16T13:01:57.343189074Z" level=info msg="CreateContainer within sandbox \"d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25\""
Dec 16 13:01:57.343850 containerd[1550]: time="2025-12-16T13:01:57.343455994Z" level=info msg="StartContainer for \"f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25\""
Dec 16 13:01:57.343850 containerd[1550]: time="2025-12-16T13:01:57.343758010Z" level=info msg="connecting to shim cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55" address="unix:///run/containerd/s/b8440a8be23ba5ae23cf59e34bbac4a0949da91a47f126dde35af91aeb2db308" protocol=ttrpc version=3
Dec 16 13:01:57.345123 containerd[1550]: time="2025-12-16T13:01:57.345097612Z" level=info msg="CreateContainer within sandbox \"b4754eefda32a639ff00136e7eb44968e5ca1f94300a37f3ce21331391e82e4a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419\""
Dec 16 13:01:57.345394 containerd[1550]: time="2025-12-16T13:01:57.345376946Z" level=info msg="StartContainer for \"9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419\""
Dec 16 13:01:57.345844 containerd[1550]: time="2025-12-16T13:01:57.345800640Z" level=info msg="connecting to shim f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25" address="unix:///run/containerd/s/7a1c025d8923a345aafadf21651f8cd25461f7d8d814c2df344f5c34b4e2ecaf" protocol=ttrpc version=3
Dec 16 13:01:57.346067 containerd[1550]: time="2025-12-16T13:01:57.346043405Z" level=info msg="connecting to shim 9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419" address="unix:///run/containerd/s/eb73c49e27892ec5c8c27eb77721fba765612b540aa96176e4bfa41f61ed8604" protocol=ttrpc version=3
Dec 16 13:01:57.365941 systemd[1]: Started cri-containerd-9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419.scope - libcontainer container 9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419.
Dec 16 13:01:57.372918 systemd[1]: Started cri-containerd-cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55.scope - libcontainer container cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55.
Dec 16 13:01:57.374555 systemd[1]: Started cri-containerd-f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25.scope - libcontainer container f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25.
Dec 16 13:01:57.431896 containerd[1550]: time="2025-12-16T13:01:57.431796892Z" level=info msg="StartContainer for \"cd36d1397e6d3493c06e36a0cae4911a168a38c88fb535bb9d4843d0eb730e55\" returns successfully"
Dec 16 13:01:57.437997 containerd[1550]: time="2025-12-16T13:01:57.437956552Z" level=info msg="StartContainer for \"f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25\" returns successfully"
Dec 16 13:01:57.471745 containerd[1550]: time="2025-12-16T13:01:57.471703942Z" level=info msg="StartContainer for \"9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419\" returns successfully"
Dec 16 13:01:57.491406 kubelet[2374]: E1216 13:01:57.491365 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://77.42.23.34:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-2-4-07f930e259&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 16 13:01:57.590235 kubelet[2374]: E1216 13:01:57.590136 2374 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://77.42.23.34:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 77.42.23.34:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 16 13:01:57.602612 kubelet[2374]: E1216 13:01:57.602586 2374 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-07f930e259?timeout=10s\": dial tcp 77.42.23.34:6443: connect: connection refused" interval="1.6s"
Dec 16 13:01:57.774250 kubelet[2374]: I1216 13:01:57.774001 2374 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:58.228417 kubelet[2374]: E1216 13:01:58.228236 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:58.229838 kubelet[2374]: E1216 13:01:58.229764 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:58.234314 kubelet[2374]: E1216 13:01:58.234303 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:59.205712 kubelet[2374]: E1216 13:01:59.205659 2374 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:59.235551 kubelet[2374]: E1216 13:01:59.235430 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:59.235551 kubelet[2374]: E1216 13:01:59.235544 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:59.327032 kubelet[2374]: I1216 13:01:59.326877 2374 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:59.327032 kubelet[2374]: E1216 13:01:59.326919 2374 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-2-4-07f930e259\": node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.341347 kubelet[2374]: E1216 13:01:59.341320 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.442016 kubelet[2374]: E1216 13:01:59.441978 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.542987 kubelet[2374]: E1216 13:01:59.542869 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.643992 kubelet[2374]: E1216 13:01:59.643952 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.661750 kubelet[2374]: E1216 13:01:59.661724 2374 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-2-4-07f930e259\" not found" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:01:59.744931 kubelet[2374]: E1216 13:01:59.744882 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.845973 kubelet[2374]: E1216 13:01:59.845865 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:01:59.946677 kubelet[2374]: E1216 13:01:59.946628 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:02:00.047573 kubelet[2374]: E1216 13:02:00.047524 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:02:00.148611 kubelet[2374]: E1216 13:02:00.148568 2374 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-2-4-07f930e259\" not found"
Dec 16 13:02:00.236061 kubelet[2374]: I1216 13:02:00.235997 2374 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:00.285211 kubelet[2374]: I1216 13:02:00.285153 2374 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:00.290701 kubelet[2374]: I1216 13:02:00.290412 2374 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:00.297186 kubelet[2374]: E1216 13:02:00.297076 2374 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-4-07f930e259\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:00.297380 kubelet[2374]: I1216 13:02:00.297301 2374 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.101644 systemd[1]: Reload requested from client PID 2663 ('systemctl') (unit session-7.scope)...
Dec 16 13:02:01.101661 systemd[1]: Reloading...
Dec 16 13:02:01.168100 kubelet[2374]: I1216 13:02:01.168065 2374 apiserver.go:52] "Watching apiserver"
Dec 16 13:02:01.188031 kubelet[2374]: I1216 13:02:01.188000 2374 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 16 13:02:01.207850 zram_generator::config[2716]: No configuration found.
Dec 16 13:02:01.383596 systemd[1]: Reloading finished in 281 ms.
Dec 16 13:02:01.410077 kubelet[2374]: I1216 13:02:01.409988 2374 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 13:02:01.410014 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 13:02:01.433347 systemd[1]: kubelet.service: Deactivated successfully.
Dec 16 13:02:01.433654 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 13:02:01.433695 systemd[1]: kubelet.service: Consumed 826ms CPU time, 122.1M memory peak.
Dec 16 13:02:01.436287 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Dec 16 13:02:01.557296 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Dec 16 13:02:01.565152 (kubelet)[2758]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Dec 16 13:02:01.621894 kubelet[2758]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Dec 16 13:02:01.621894 kubelet[2758]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 16 13:02:01.622220 kubelet[2758]: I1216 13:02:01.621935 2758 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 16 13:02:01.627655 kubelet[2758]: I1216 13:02:01.627624 2758 server.go:529] "Kubelet version" kubeletVersion="v1.34.1"
Dec 16 13:02:01.627655 kubelet[2758]: I1216 13:02:01.627641 2758 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 16 13:02:01.627732 kubelet[2758]: I1216 13:02:01.627662 2758 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Dec 16 13:02:01.627732 kubelet[2758]: I1216 13:02:01.627671 2758 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Dec 16 13:02:01.627964 kubelet[2758]: I1216 13:02:01.627934 2758 server.go:956] "Client rotation is on, will bootstrap in background"
Dec 16 13:02:01.629843 kubelet[2758]: I1216 13:02:01.629374 2758 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Dec 16 13:02:01.635874 kubelet[2758]: I1216 13:02:01.635482 2758 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Dec 16 13:02:01.640337 kubelet[2758]: I1216 13:02:01.640325 2758 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 16 13:02:01.642888 kubelet[2758]: I1216 13:02:01.642874 2758 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Dec 16 13:02:01.643105 kubelet[2758]: I1216 13:02:01.643083 2758 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 16 13:02:01.643311 kubelet[2758]: I1216 13:02:01.643177 2758 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-2-4-07f930e259","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 16 13:02:01.643416 kubelet[2758]: I1216 13:02:01.643407 2758 topology_manager.go:138] "Creating topology manager with none policy"
Dec 16 13:02:01.643465 kubelet[2758]: I1216 13:02:01.643459 2758 container_manager_linux.go:306] "Creating device plugin manager"
Dec 16 13:02:01.643542 kubelet[2758]: I1216 13:02:01.643535 2758 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Dec 16 13:02:01.644307 kubelet[2758]: I1216 13:02:01.644291 2758 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 13:02:01.648184 kubelet[2758]: I1216 13:02:01.648162 2758 kubelet.go:475] "Attempting to sync node with API server"
Dec 16 13:02:01.648253 kubelet[2758]: I1216 13:02:01.648243 2758 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 16 13:02:01.648320 kubelet[2758]: I1216 13:02:01.648313 2758 kubelet.go:387] "Adding apiserver pod source"
Dec 16 13:02:01.648719 kubelet[2758]: I1216 13:02:01.648710 2758 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 16 13:02:01.656389 kubelet[2758]: I1216 13:02:01.656363 2758 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Dec 16 13:02:01.656908 kubelet[2758]: I1216 13:02:01.656858 2758 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Dec 16 13:02:01.656908 kubelet[2758]: I1216 13:02:01.656886 2758 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Dec 16 13:02:01.660871 kubelet[2758]: I1216 13:02:01.659423 2758 server.go:1262] "Started kubelet"
Dec 16 13:02:01.663923 kubelet[2758]: I1216 13:02:01.662554 2758 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 16 13:02:01.670871 kubelet[2758]: I1216 13:02:01.664158 2758 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Dec 16 13:02:01.672551 kubelet[2758]: I1216 13:02:01.672538 2758 server.go:310] "Adding debug handlers to kubelet server"
Dec 16 13:02:01.673901 kubelet[2758]: I1216 13:02:01.667359 2758 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Dec 16 13:02:01.674171 kubelet[2758]: I1216 13:02:01.674069 2758 volume_manager.go:313] "Starting Kubelet Volume Manager"
Dec 16 13:02:01.674620 kubelet[2758]: I1216 13:02:01.664364 2758 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 16 13:02:01.675120 kubelet[2758]: I1216 13:02:01.675101 2758 server_v1.go:49] "podresources" method="list" useActivePods=true
Dec 16 13:02:01.676086 kubelet[2758]: I1216 13:02:01.676000 2758 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 16 13:02:01.676975 kubelet[2758]: I1216 13:02:01.676965 2758 reconciler.go:29] "Reconciler: start to sync state"
Dec 16 13:02:01.677262 kubelet[2758]: I1216 13:02:01.677223 2758 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 16 13:02:01.678477 kubelet[2758]: I1216 13:02:01.677951 2758 factory.go:223] Registration of the systemd container factory successfully
Dec 16 13:02:01.678477 kubelet[2758]: I1216 13:02:01.678010 2758 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Dec 16 13:02:01.678477 kubelet[2758]: E1216 13:02:01.678205 2758 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Dec 16 13:02:01.685596 kubelet[2758]: I1216 13:02:01.685576 2758 factory.go:223] Registration of the containerd container factory successfully
Dec 16 13:02:01.691781 kubelet[2758]: I1216 13:02:01.691742 2758 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Dec 16 13:02:01.693295 kubelet[2758]: I1216 13:02:01.693283 2758 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Dec 16 13:02:01.693359 kubelet[2758]: I1216 13:02:01.693352 2758 status_manager.go:244] "Starting to sync pod status with apiserver"
Dec 16 13:02:01.693485 kubelet[2758]: I1216 13:02:01.693467 2758 kubelet.go:2427] "Starting kubelet main sync loop"
Dec 16 13:02:01.693585 kubelet[2758]: E1216 13:02:01.693568 2758 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742206 2758 cpu_manager.go:221] "Starting CPU manager" policy="none"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742219 2758 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742235 2758 state_mem.go:36] "Initialized new in-memory state store"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742331 2758 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742339 2758 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742352 2758 policy_none.go:49] "None policy: Start"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742360 2758 memory_manager.go:187] "Starting memorymanager" policy="None"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742366 2758 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742439 2758 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Dec 16 13:02:01.742839 kubelet[2758]: I1216 13:02:01.742446 2758 policy_none.go:47] "Start"
Dec 16 13:02:01.750750 kubelet[2758]: E1216 13:02:01.750723 2758 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Dec 16 13:02:01.750984 kubelet[2758]: I1216 13:02:01.750970 2758 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 16 13:02:01.751014 kubelet[2758]: I1216 13:02:01.750986 2758 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 16 13:02:01.751229 kubelet[2758]: I1216 13:02:01.751205 2758 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 16 13:02:01.753927 kubelet[2758]: E1216 13:02:01.753915 2758 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Dec 16 13:02:01.794656 kubelet[2758]: I1216 13:02:01.794615 2758 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.795149 kubelet[2758]: I1216 13:02:01.794942 2758 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.800653 kubelet[2758]: I1216 13:02:01.795032 2758 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.806518 kubelet[2758]: E1216 13:02:01.806461 2758 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-4-07f930e259\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.807485 kubelet[2758]: E1216 13:02:01.807445 2758 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-4-07f930e259\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.808349 kubelet[2758]: E1216 13:02:01.808302 2758 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.860957 kubelet[2758]: I1216 13:02:01.860883 2758 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.866166 kubelet[2758]: I1216 13:02:01.866094 2758 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.866166 kubelet[2758]: I1216 13:02:01.866141 2758 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878400 kubelet[2758]: I1216 13:02:01.878351 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5260260b4201445e7c10c6e8db190289-kubeconfig\") pod \"kube-scheduler-ci-4459-2-2-4-07f930e259\" (UID: \"5260260b4201445e7c10c6e8db190289\") " pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878400 kubelet[2758]: I1216 13:02:01.878381 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/607c34f2651ba9bd2e9ed872b8b7cbf3-ca-certs\") pod \"kube-apiserver-ci-4459-2-2-4-07f930e259\" (UID: \"607c34f2651ba9bd2e9ed872b8b7cbf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878400 kubelet[2758]: I1216 13:02:01.878393 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/607c34f2651ba9bd2e9ed872b8b7cbf3-k8s-certs\") pod \"kube-apiserver-ci-4459-2-2-4-07f930e259\" (UID: \"607c34f2651ba9bd2e9ed872b8b7cbf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878400 kubelet[2758]: I1216 13:02:01.878404 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-ca-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878573 kubelet[2758]: I1216 13:02:01.878416 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878573 kubelet[2758]: I1216 13:02:01.878426 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878573 kubelet[2758]: I1216 13:02:01.878438 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878573 kubelet[2758]: I1216 13:02:01.878454 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/607c34f2651ba9bd2e9ed872b8b7cbf3-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-2-4-07f930e259\" (UID: \"607c34f2651ba9bd2e9ed872b8b7cbf3\") " pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:01.878573 kubelet[2758]: I1216 13:02:01.878476 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6bcfe440740b57e762c96822e537f98c-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-2-4-07f930e259\" (UID: \"6bcfe440740b57e762c96822e537f98c\") " pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:02.652420 kubelet[2758]: I1216 13:02:02.652366 2758 apiserver.go:52] "Watching apiserver"
Dec 16 13:02:02.677839 kubelet[2758]: I1216 13:02:02.677538 2758 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 16 13:02:02.728160 kubelet[2758]: I1216 13:02:02.728071 2758 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:02.729465 kubelet[2758]: I1216 13:02:02.729375 2758 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:02.735275 kubelet[2758]: E1216 13:02:02.735192 2758 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-2-4-07f930e259\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:02.737057 kubelet[2758]: E1216 13:02:02.737040 2758 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-2-4-07f930e259\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259"
Dec 16 13:02:02.760417 kubelet[2758]: I1216 13:02:02.760351 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-2-4-07f930e259" podStartSLOduration=2.760335655 podStartE2EDuration="2.760335655s" podCreationTimestamp="2025-12-16 13:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:02:02.759687289 +0000 UTC m=+1.189447005" watchObservedRunningTime="2025-12-16 13:02:02.760335655 +0000 UTC m=+1.190095360"
Dec 16 13:02:02.769299 kubelet[2758]: I1216 13:02:02.769201 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-2-4-07f930e259" podStartSLOduration=2.769184898 podStartE2EDuration="2.769184898s" podCreationTimestamp="2025-12-16 13:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:02:02.768086398 +0000 UTC m=+1.197846134" watchObservedRunningTime="2025-12-16 13:02:02.769184898 +0000 UTC m=+1.198944604"
Dec 16 13:02:02.788650 kubelet[2758]: I1216 13:02:02.787604 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-2-4-07f930e259" podStartSLOduration=2.787589368 podStartE2EDuration="2.787589368s" podCreationTimestamp="2025-12-16 13:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:02:02.777076034 +0000 UTC m=+1.206835750" watchObservedRunningTime="2025-12-16 13:02:02.787589368 +0000 UTC m=+1.217349075"
Dec 16 13:02:06.744999 kubelet[2758]: I1216 13:02:06.744968 2758 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Dec 16 13:02:06.745476 containerd[1550]: time="2025-12-16T13:02:06.745365762Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Dec 16 13:02:06.746077 kubelet[2758]: I1216 13:02:06.745511 2758 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Dec 16 13:02:07.866474 systemd[1]: Created slice kubepods-besteffort-pod54471a0d_475a_4274_86f6_57d8b869fa3a.slice - libcontainer container kubepods-besteffort-pod54471a0d_475a_4274_86f6_57d8b869fa3a.slice.
Dec 16 13:02:07.921296 kubelet[2758]: I1216 13:02:07.921246 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/54471a0d-475a-4274-86f6-57d8b869fa3a-xtables-lock\") pod \"kube-proxy-t7k6z\" (UID: \"54471a0d-475a-4274-86f6-57d8b869fa3a\") " pod="kube-system/kube-proxy-t7k6z"
Dec 16 13:02:07.921296 kubelet[2758]: I1216 13:02:07.921290 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/54471a0d-475a-4274-86f6-57d8b869fa3a-kube-proxy\") pod \"kube-proxy-t7k6z\" (UID: \"54471a0d-475a-4274-86f6-57d8b869fa3a\") " pod="kube-system/kube-proxy-t7k6z"
Dec 16 13:02:07.921661 kubelet[2758]: I1216 13:02:07.921311 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54471a0d-475a-4274-86f6-57d8b869fa3a-lib-modules\") pod \"kube-proxy-t7k6z\" (UID: \"54471a0d-475a-4274-86f6-57d8b869fa3a\") " pod="kube-system/kube-proxy-t7k6z"
Dec 16 13:02:07.921661 kubelet[2758]: I1216 13:02:07.921329 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lbqr\" (UniqueName: \"kubernetes.io/projected/54471a0d-475a-4274-86f6-57d8b869fa3a-kube-api-access-5lbqr\") pod \"kube-proxy-t7k6z\" (UID: \"54471a0d-475a-4274-86f6-57d8b869fa3a\") " pod="kube-system/kube-proxy-t7k6z"
Dec 16 13:02:07.986297 systemd[1]: Created slice kubepods-besteffort-podd3ea516b_f269_4722_b214_7e415ea869bc.slice - libcontainer container kubepods-besteffort-podd3ea516b_f269_4722_b214_7e415ea869bc.slice.
Dec 16 13:02:08.022320 kubelet[2758]: I1216 13:02:08.022097 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768wh\" (UniqueName: \"kubernetes.io/projected/d3ea516b-f269-4722-b214-7e415ea869bc-kube-api-access-768wh\") pod \"tigera-operator-65cdcdfd6d-cmwgr\" (UID: \"d3ea516b-f269-4722-b214-7e415ea869bc\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-cmwgr"
Dec 16 13:02:08.022320 kubelet[2758]: I1216 13:02:08.022173 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d3ea516b-f269-4722-b214-7e415ea869bc-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-cmwgr\" (UID: \"d3ea516b-f269-4722-b214-7e415ea869bc\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-cmwgr"
Dec 16 13:02:08.177145 containerd[1550]: time="2025-12-16T13:02:08.177020170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t7k6z,Uid:54471a0d-475a-4274-86f6-57d8b869fa3a,Namespace:kube-system,Attempt:0,}"
Dec 16 13:02:08.193022 containerd[1550]: time="2025-12-16T13:02:08.192593601Z" level=info msg="connecting to shim 58dbf5b7ee363d73243ccdc18bab1e01e59c51f14c64d2a6d360e444b4be80cb" address="unix:///run/containerd/s/eaa16b3140e9976d672ea7c96f6be9b8d1452e6da83dbf34e700ed60aa86ba52" namespace=k8s.io protocol=ttrpc version=3
Dec 16 13:02:08.224994 systemd[1]: Started cri-containerd-58dbf5b7ee363d73243ccdc18bab1e01e59c51f14c64d2a6d360e444b4be80cb.scope - libcontainer container 58dbf5b7ee363d73243ccdc18bab1e01e59c51f14c64d2a6d360e444b4be80cb.
Dec 16 13:02:08.250398 containerd[1550]: time="2025-12-16T13:02:08.250326166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t7k6z,Uid:54471a0d-475a-4274-86f6-57d8b869fa3a,Namespace:kube-system,Attempt:0,} returns sandbox id \"58dbf5b7ee363d73243ccdc18bab1e01e59c51f14c64d2a6d360e444b4be80cb\""
Dec 16 13:02:08.255619 containerd[1550]: time="2025-12-16T13:02:08.255579015Z" level=info msg="CreateContainer within sandbox \"58dbf5b7ee363d73243ccdc18bab1e01e59c51f14c64d2a6d360e444b4be80cb\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Dec 16 13:02:08.265978 containerd[1550]: time="2025-12-16T13:02:08.265942408Z" level=info msg="Container 5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:02:08.271926 containerd[1550]: time="2025-12-16T13:02:08.271904387Z" level=info msg="CreateContainer within sandbox \"58dbf5b7ee363d73243ccdc18bab1e01e59c51f14c64d2a6d360e444b4be80cb\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116\""
Dec 16 13:02:08.272539 containerd[1550]: time="2025-12-16T13:02:08.272503511Z" level=info msg="StartContainer for \"5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116\""
Dec 16 13:02:08.273672 containerd[1550]: time="2025-12-16T13:02:08.273641925Z" level=info msg="connecting to shim 5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116" address="unix:///run/containerd/s/eaa16b3140e9976d672ea7c96f6be9b8d1452e6da83dbf34e700ed60aa86ba52" protocol=ttrpc version=3
Dec 16 13:02:08.293627 containerd[1550]: time="2025-12-16T13:02:08.293599208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-cmwgr,Uid:d3ea516b-f269-4722-b214-7e415ea869bc,Namespace:tigera-operator,Attempt:0,}"
Dec 16 13:02:08.296007 systemd[1]: Started cri-containerd-5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116.scope - libcontainer container 5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116.
Dec 16 13:02:08.319040 containerd[1550]: time="2025-12-16T13:02:08.318991381Z" level=info msg="connecting to shim 641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310" address="unix:///run/containerd/s/0a38877b26eeb0159b007e767dd0115ffbb359d281c7a398f58441ce21b13cf8" namespace=k8s.io protocol=ttrpc version=3
Dec 16 13:02:08.336034 systemd[1]: Started cri-containerd-641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310.scope - libcontainer container 641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310.
Dec 16 13:02:08.359756 containerd[1550]: time="2025-12-16T13:02:08.359691467Z" level=info msg="StartContainer for \"5b1b1b1cc8b182ad0edc7aa5d2c3da9c688fc622e3089ef61d6c8cead4669116\" returns successfully"
Dec 16 13:02:08.386151 containerd[1550]: time="2025-12-16T13:02:08.386112619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-cmwgr,Uid:d3ea516b-f269-4722-b214-7e415ea869bc,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310\""
Dec 16 13:02:08.387925 containerd[1550]: time="2025-12-16T13:02:08.387899400Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\""
Dec 16 13:02:08.780332 kubelet[2758]: I1216 13:02:08.780267 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-t7k6z" podStartSLOduration=1.780247943 podStartE2EDuration="1.780247943s" podCreationTimestamp="2025-12-16 13:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:02:08.764060119 +0000 UTC m=+7.193819865" watchObservedRunningTime="2025-12-16 13:02:08.780247943 +0000 UTC m=+7.210007640"
Dec 16 13:02:09.036327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3248614252.mount: Deactivated successfully.
Dec 16 13:02:10.866068 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1941588674.mount: Deactivated successfully.
Dec 16 13:02:11.294304 containerd[1550]: time="2025-12-16T13:02:11.294188087Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:02:11.295486 containerd[1550]: time="2025-12-16T13:02:11.295416380Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=25061691"
Dec 16 13:02:11.296445 containerd[1550]: time="2025-12-16T13:02:11.296420613Z" level=info msg="ImageCreate event name:\"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:02:11.298718 containerd[1550]: time="2025-12-16T13:02:11.298223113Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:02:11.298718 containerd[1550]: time="2025-12-16T13:02:11.298613605Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"25057686\" in 2.910665605s"
Dec 16 13:02:11.298718 containerd[1550]: time="2025-12-16T13:02:11.298633643Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:f2c1be207523e593db82e3b8cf356a12f3ad8d1aad2225f8114b2cf9d6486cf1\""
Dec 16 13:02:11.302679 containerd[1550]: time="2025-12-16T13:02:11.302656626Z" level=info msg="CreateContainer within sandbox \"641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Dec 16 13:02:11.312443 containerd[1550]: time="2025-12-16T13:02:11.311922531Z" level=info msg="Container dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:02:11.326613 containerd[1550]: time="2025-12-16T13:02:11.326581267Z" level=info msg="CreateContainer within sandbox \"641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab\""
Dec 16 13:02:11.327000 containerd[1550]: time="2025-12-16T13:02:11.326985275Z" level=info msg="StartContainer for \"dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab\""
Dec 16 13:02:11.328033 containerd[1550]: time="2025-12-16T13:02:11.328002072Z" level=info msg="connecting to shim dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab" address="unix:///run/containerd/s/0a38877b26eeb0159b007e767dd0115ffbb359d281c7a398f58441ce21b13cf8" protocol=ttrpc version=3
Dec 16 13:02:11.343949 systemd[1]: Started cri-containerd-dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab.scope - libcontainer container dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab.
Dec 16 13:02:11.369560 containerd[1550]: time="2025-12-16T13:02:11.369531864Z" level=info msg="StartContainer for \"dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab\" returns successfully"
Dec 16 13:02:12.190423 update_engine[1539]: I20251216 13:02:12.190343 1539 update_attempter.cc:509] Updating boot flags...
Dec 16 13:02:16.646711 kubelet[2758]: I1216 13:02:16.646625 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-cmwgr" podStartSLOduration=6.734688318 podStartE2EDuration="9.646608805s" podCreationTimestamp="2025-12-16 13:02:07 +0000 UTC" firstStartedPulling="2025-12-16 13:02:08.38743525 +0000 UTC m=+6.817194956" lastFinishedPulling="2025-12-16 13:02:11.299355737 +0000 UTC m=+9.729115443" observedRunningTime="2025-12-16 13:02:11.780564192 +0000 UTC m=+10.210323908" watchObservedRunningTime="2025-12-16 13:02:16.646608805 +0000 UTC m=+15.076368521"
Dec 16 13:02:16.870737 sudo[1818]: pam_unix(sudo:session): session closed for user root
Dec 16 13:02:17.041097 sshd[1817]: Connection closed by 139.178.89.65 port 33814
Dec 16 13:02:17.041027 sshd-session[1814]: pam_unix(sshd:session): session closed for user core
Dec 16 13:02:17.044868 systemd[1]: sshd@6-77.42.23.34:22-139.178.89.65:33814.service: Deactivated successfully.
Dec 16 13:02:17.048044 systemd[1]: session-7.scope: Deactivated successfully.
Dec 16 13:02:17.048335 systemd[1]: session-7.scope: Consumed 4.585s CPU time, 163.1M memory peak.
Dec 16 13:02:17.049520 systemd-logind[1534]: Session 7 logged out. Waiting for processes to exit.
Dec 16 13:02:17.053862 systemd-logind[1534]: Removed session 7.
Dec 16 13:02:22.506298 systemd[1]: Created slice kubepods-besteffort-poda31b3e80_5abb_4f81_91b9_daa3396fe448.slice - libcontainer container kubepods-besteffort-poda31b3e80_5abb_4f81_91b9_daa3396fe448.slice.
Dec 16 13:02:22.546839 kubelet[2758]: I1216 13:02:22.546779 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkdx\" (UniqueName: \"kubernetes.io/projected/a31b3e80-5abb-4f81-91b9-daa3396fe448-kube-api-access-shkdx\") pod \"calico-typha-7bbc6b9997-ltsh4\" (UID: \"a31b3e80-5abb-4f81-91b9-daa3396fe448\") " pod="calico-system/calico-typha-7bbc6b9997-ltsh4"
Dec 16 13:02:22.547202 kubelet[2758]: I1216 13:02:22.546852 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a31b3e80-5abb-4f81-91b9-daa3396fe448-tigera-ca-bundle\") pod \"calico-typha-7bbc6b9997-ltsh4\" (UID: \"a31b3e80-5abb-4f81-91b9-daa3396fe448\") " pod="calico-system/calico-typha-7bbc6b9997-ltsh4"
Dec 16 13:02:22.547202 kubelet[2758]: I1216 13:02:22.546874 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a31b3e80-5abb-4f81-91b9-daa3396fe448-typha-certs\") pod \"calico-typha-7bbc6b9997-ltsh4\" (UID: \"a31b3e80-5abb-4f81-91b9-daa3396fe448\") " pod="calico-system/calico-typha-7bbc6b9997-ltsh4"
Dec 16 13:02:22.756221 systemd[1]: Created slice kubepods-besteffort-podd3edeebd_b70c_4ab8_a6e2_7a5caec1c0a2.slice - libcontainer container kubepods-besteffort-podd3edeebd_b70c_4ab8_a6e2_7a5caec1c0a2.slice.
Dec 16 13:02:22.811930 containerd[1550]: time="2025-12-16T13:02:22.811525636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bbc6b9997-ltsh4,Uid:a31b3e80-5abb-4f81-91b9-daa3396fe448,Namespace:calico-system,Attempt:0,}"
Dec 16 13:02:22.829436 containerd[1550]: time="2025-12-16T13:02:22.829004928Z" level=info msg="connecting to shim fac8119b765935e608a20c360060de785527d887bafce0056571c78faf2a4650" address="unix:///run/containerd/s/62f68b2a1a65473b3faa9b814ec5a91cc31f6213d714e922ed5fc191f47087d8" namespace=k8s.io protocol=ttrpc version=3
Dec 16 13:02:22.848411 kubelet[2758]: I1216 13:02:22.848356 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6wj\" (UniqueName: \"kubernetes.io/projected/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-kube-api-access-vq6wj\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848411 kubelet[2758]: I1216 13:02:22.848411 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-var-lib-calico\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848621 kubelet[2758]: I1216 13:02:22.848433 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-policysync\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848621 kubelet[2758]: I1216 13:02:22.848449 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-xtables-lock\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848621 kubelet[2758]: I1216 13:02:22.848468 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-node-certs\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848621 kubelet[2758]: I1216 13:02:22.848485 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-cni-log-dir\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848621 kubelet[2758]: I1216 13:02:22.848501 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-cni-net-dir\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848795 kubelet[2758]: I1216 13:02:22.848549 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-lib-modules\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848795 kubelet[2758]: I1216 13:02:22.848567 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-var-run-calico\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848795 kubelet[2758]: I1216 13:02:22.848585 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-flexvol-driver-host\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848795 kubelet[2758]: I1216 13:02:22.848605 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-tigera-ca-bundle\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.848795 kubelet[2758]: I1216 13:02:22.848625 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2-cni-bin-dir\") pod \"calico-node-96pm6\" (UID: \"d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2\") " pod="calico-system/calico-node-96pm6"
Dec 16 13:02:22.864043 systemd[1]: Started cri-containerd-fac8119b765935e608a20c360060de785527d887bafce0056571c78faf2a4650.scope - libcontainer container fac8119b765935e608a20c360060de785527d887bafce0056571c78faf2a4650.
Dec 16 13:02:22.953343 kubelet[2758]: E1216 13:02:22.953302 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:02:22.953546 kubelet[2758]: W1216 13:02:22.953321 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:02:22.953546 kubelet[2758]: E1216 13:02:22.953487 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 13:02:22.977976 kubelet[2758]: E1216 13:02:22.977934 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb"
Dec 16 13:02:22.981957 kubelet[2758]: E1216 13:02:22.981860 2758 status_manager.go:1018] "Failed to get status for pod" err="pods \"csi-node-driver-ffzww\" is forbidden: User \"system:node:ci-4459-2-2-4-07f930e259\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4459-2-2-4-07f930e259' and this object" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" pod="calico-system/csi-node-driver-ffzww"
Dec 16 13:02:23.011487 kubelet[2758]: E1216 13:02:23.011445 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 13:02:23.011487 kubelet[2758]: W1216 13:02:23.011475 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 13:02:23.011487 kubelet[2758]: E1216 13:02:23.011482 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 16 13:02:23.011763 kubelet[2758]: E1216 13:02:23.011733 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.011763 kubelet[2758]: W1216 13:02:23.011759 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.011840 kubelet[2758]: E1216 13:02:23.011767 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.012015 kubelet[2758]: E1216 13:02:23.011986 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.012143 kubelet[2758]: W1216 13:02:23.012117 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.012143 kubelet[2758]: E1216 13:02:23.012135 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.012404 kubelet[2758]: E1216 13:02:23.012380 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.012404 kubelet[2758]: W1216 13:02:23.012392 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.012404 kubelet[2758]: E1216 13:02:23.012399 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.012697 kubelet[2758]: E1216 13:02:23.012674 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.012697 kubelet[2758]: W1216 13:02:23.012693 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.012766 kubelet[2758]: E1216 13:02:23.012705 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.012990 kubelet[2758]: E1216 13:02:23.012964 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.012990 kubelet[2758]: W1216 13:02:23.012985 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.013053 kubelet[2758]: E1216 13:02:23.013003 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.013506 kubelet[2758]: E1216 13:02:23.013314 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.013506 kubelet[2758]: W1216 13:02:23.013325 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.013506 kubelet[2758]: E1216 13:02:23.013332 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.013604 kubelet[2758]: E1216 13:02:23.013587 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.013604 kubelet[2758]: W1216 13:02:23.013599 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.013654 kubelet[2758]: E1216 13:02:23.013607 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.013801 kubelet[2758]: E1216 13:02:23.013775 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.013801 kubelet[2758]: W1216 13:02:23.013788 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.013801 kubelet[2758]: E1216 13:02:23.013795 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.014101 kubelet[2758]: E1216 13:02:23.014071 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.014101 kubelet[2758]: W1216 13:02:23.014085 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.014101 kubelet[2758]: E1216 13:02:23.014093 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.014239 kubelet[2758]: E1216 13:02:23.014219 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.014239 kubelet[2758]: W1216 13:02:23.014233 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.014239 kubelet[2758]: E1216 13:02:23.014240 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.043144 containerd[1550]: time="2025-12-16T13:02:23.043085635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bbc6b9997-ltsh4,Uid:a31b3e80-5abb-4f81-91b9-daa3396fe448,Namespace:calico-system,Attempt:0,} returns sandbox id \"fac8119b765935e608a20c360060de785527d887bafce0056571c78faf2a4650\"" Dec 16 13:02:23.045010 containerd[1550]: time="2025-12-16T13:02:23.044927239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 13:02:23.050659 kubelet[2758]: E1216 13:02:23.050629 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.050659 kubelet[2758]: W1216 13:02:23.050651 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.050961 kubelet[2758]: E1216 13:02:23.050683 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.050961 kubelet[2758]: I1216 13:02:23.050706 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb-socket-dir\") pod \"csi-node-driver-ffzww\" (UID: \"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb\") " pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:23.050961 kubelet[2758]: E1216 13:02:23.050925 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.050961 kubelet[2758]: W1216 13:02:23.050933 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.050961 kubelet[2758]: E1216 13:02:23.050941 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.051055 kubelet[2758]: I1216 13:02:23.050985 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prgk6\" (UniqueName: \"kubernetes.io/projected/e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb-kube-api-access-prgk6\") pod \"csi-node-driver-ffzww\" (UID: \"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb\") " pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:23.051450 kubelet[2758]: E1216 13:02:23.051224 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.051450 kubelet[2758]: W1216 13:02:23.051236 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.051450 kubelet[2758]: E1216 13:02:23.051244 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.051450 kubelet[2758]: I1216 13:02:23.051267 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb-kubelet-dir\") pod \"csi-node-driver-ffzww\" (UID: \"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb\") " pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:23.051450 kubelet[2758]: E1216 13:02:23.051437 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.051450 kubelet[2758]: W1216 13:02:23.051445 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.051613 kubelet[2758]: E1216 13:02:23.051470 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.051613 kubelet[2758]: I1216 13:02:23.051492 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb-registration-dir\") pod \"csi-node-driver-ffzww\" (UID: \"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb\") " pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:23.051652 kubelet[2758]: E1216 13:02:23.051645 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.051671 kubelet[2758]: W1216 13:02:23.051653 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.051671 kubelet[2758]: E1216 13:02:23.051661 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.051706 kubelet[2758]: I1216 13:02:23.051673 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb-varrun\") pod \"csi-node-driver-ffzww\" (UID: \"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb\") " pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:23.052333 kubelet[2758]: E1216 13:02:23.051870 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.052333 kubelet[2758]: W1216 13:02:23.051885 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.052333 kubelet[2758]: E1216 13:02:23.051893 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.052333 kubelet[2758]: E1216 13:02:23.052086 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.052333 kubelet[2758]: W1216 13:02:23.052094 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.052333 kubelet[2758]: E1216 13:02:23.052101 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.052333 kubelet[2758]: E1216 13:02:23.052304 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.052333 kubelet[2758]: W1216 13:02:23.052311 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.052333 kubelet[2758]: E1216 13:02:23.052318 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.052574 kubelet[2758]: E1216 13:02:23.052471 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.052574 kubelet[2758]: W1216 13:02:23.052478 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.052574 kubelet[2758]: E1216 13:02:23.052485 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.053053 kubelet[2758]: E1216 13:02:23.052631 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.053053 kubelet[2758]: W1216 13:02:23.052642 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.053053 kubelet[2758]: E1216 13:02:23.052649 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.053053 kubelet[2758]: E1216 13:02:23.052793 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.053053 kubelet[2758]: W1216 13:02:23.052801 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.053053 kubelet[2758]: E1216 13:02:23.052807 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.053053 kubelet[2758]: E1216 13:02:23.052996 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.053053 kubelet[2758]: W1216 13:02:23.053007 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.053053 kubelet[2758]: E1216 13:02:23.053034 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.053231 kubelet[2758]: E1216 13:02:23.053179 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.053231 kubelet[2758]: W1216 13:02:23.053201 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.053231 kubelet[2758]: E1216 13:02:23.053209 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.053626 kubelet[2758]: E1216 13:02:23.053341 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.053626 kubelet[2758]: W1216 13:02:23.053351 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.053626 kubelet[2758]: E1216 13:02:23.053357 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.053626 kubelet[2758]: E1216 13:02:23.053475 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.053626 kubelet[2758]: W1216 13:02:23.053481 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.053626 kubelet[2758]: E1216 13:02:23.053487 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.061464 containerd[1550]: time="2025-12-16T13:02:23.061419544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-96pm6,Uid:d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2,Namespace:calico-system,Attempt:0,}" Dec 16 13:02:23.098968 containerd[1550]: time="2025-12-16T13:02:23.098495714Z" level=info msg="connecting to shim 2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121" address="unix:///run/containerd/s/ceae76b395b14e1d49485b3d4fc37e96059eadedfe65859d0f2d8d6f9eec47ab" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:23.125964 systemd[1]: Started cri-containerd-2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121.scope - libcontainer container 2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121. Dec 16 13:02:23.153414 kubelet[2758]: E1216 13:02:23.153325 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.154771 kubelet[2758]: W1216 13:02:23.154440 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.154771 kubelet[2758]: E1216 13:02:23.154467 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.155184 kubelet[2758]: E1216 13:02:23.154973 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.155184 kubelet[2758]: W1216 13:02:23.154982 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.155184 kubelet[2758]: E1216 13:02:23.154991 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.155836 kubelet[2758]: E1216 13:02:23.155523 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.155836 kubelet[2758]: W1216 13:02:23.155533 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.155836 kubelet[2758]: E1216 13:02:23.155541 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.156445 kubelet[2758]: E1216 13:02:23.156425 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.156577 kubelet[2758]: W1216 13:02:23.156515 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.156577 kubelet[2758]: E1216 13:02:23.156527 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.157131 kubelet[2758]: E1216 13:02:23.157093 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.157304 kubelet[2758]: W1216 13:02:23.157224 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.157304 kubelet[2758]: E1216 13:02:23.157237 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.157862 kubelet[2758]: E1216 13:02:23.157851 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.158022 kubelet[2758]: W1216 13:02:23.158001 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.158022 kubelet[2758]: E1216 13:02:23.158013 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.158516 kubelet[2758]: E1216 13:02:23.158489 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.158516 kubelet[2758]: W1216 13:02:23.158498 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.158516 kubelet[2758]: E1216 13:02:23.158508 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.158928 kubelet[2758]: E1216 13:02:23.158875 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.158928 kubelet[2758]: W1216 13:02:23.158884 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.158928 kubelet[2758]: E1216 13:02:23.158892 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.159532 kubelet[2758]: E1216 13:02:23.159359 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.159532 kubelet[2758]: W1216 13:02:23.159368 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.159532 kubelet[2758]: E1216 13:02:23.159376 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.159876 kubelet[2758]: E1216 13:02:23.159763 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.159876 kubelet[2758]: W1216 13:02:23.159772 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.159876 kubelet[2758]: E1216 13:02:23.159780 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.160306 kubelet[2758]: E1216 13:02:23.160277 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.160430 kubelet[2758]: W1216 13:02:23.160358 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.160510 kubelet[2758]: E1216 13:02:23.160497 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.160970 kubelet[2758]: E1216 13:02:23.160936 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.160970 kubelet[2758]: W1216 13:02:23.160964 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.161029 kubelet[2758]: E1216 13:02:23.160990 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.161336 kubelet[2758]: E1216 13:02:23.161291 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.161336 kubelet[2758]: W1216 13:02:23.161313 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.161336 kubelet[2758]: E1216 13:02:23.161328 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.161650 kubelet[2758]: E1216 13:02:23.161625 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.161650 kubelet[2758]: W1216 13:02:23.161643 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.161737 kubelet[2758]: E1216 13:02:23.161656 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.162102 kubelet[2758]: E1216 13:02:23.162042 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.162102 kubelet[2758]: W1216 13:02:23.162058 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.162102 kubelet[2758]: E1216 13:02:23.162070 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.162345 kubelet[2758]: E1216 13:02:23.162322 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.162345 kubelet[2758]: W1216 13:02:23.162341 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.162406 kubelet[2758]: E1216 13:02:23.162353 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.162578 kubelet[2758]: E1216 13:02:23.162533 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.162642 containerd[1550]: time="2025-12-16T13:02:23.162612587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-96pm6,Uid:d3edeebd-b70c-4ab8-a6e2-7a5caec1c0a2,Namespace:calico-system,Attempt:0,} returns sandbox id \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\"" Dec 16 13:02:23.162728 kubelet[2758]: W1216 13:02:23.162549 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.163056 kubelet[2758]: E1216 13:02:23.162735 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.163056 kubelet[2758]: E1216 13:02:23.162934 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.163056 kubelet[2758]: W1216 13:02:23.162944 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.163056 kubelet[2758]: E1216 13:02:23.162955 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.163695 kubelet[2758]: E1216 13:02:23.163662 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.163695 kubelet[2758]: W1216 13:02:23.163686 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.163777 kubelet[2758]: E1216 13:02:23.163699 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.164188 kubelet[2758]: E1216 13:02:23.163950 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.164188 kubelet[2758]: W1216 13:02:23.163965 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.164188 kubelet[2758]: E1216 13:02:23.163976 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.164270 kubelet[2758]: E1216 13:02:23.164216 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.164270 kubelet[2758]: W1216 13:02:23.164228 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.164270 kubelet[2758]: E1216 13:02:23.164240 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.165174 kubelet[2758]: E1216 13:02:23.164426 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.165174 kubelet[2758]: W1216 13:02:23.164439 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.165174 kubelet[2758]: E1216 13:02:23.164449 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.165174 kubelet[2758]: E1216 13:02:23.164792 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.165174 kubelet[2758]: W1216 13:02:23.164802 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.165174 kubelet[2758]: E1216 13:02:23.164813 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.165174 kubelet[2758]: E1216 13:02:23.165012 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.165174 kubelet[2758]: W1216 13:02:23.165021 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.165174 kubelet[2758]: E1216 13:02:23.165032 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:23.165359 kubelet[2758]: E1216 13:02:23.165254 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.165359 kubelet[2758]: W1216 13:02:23.165265 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.165359 kubelet[2758]: E1216 13:02:23.165278 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:23.174907 kubelet[2758]: E1216 13:02:23.174859 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:23.174907 kubelet[2758]: W1216 13:02:23.174882 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:23.174907 kubelet[2758]: E1216 13:02:23.174903 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:24.694742 kubelet[2758]: E1216 13:02:24.694676 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:02:24.997417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2082287606.mount: Deactivated successfully. 
Dec 16 13:02:25.888446 containerd[1550]: time="2025-12-16T13:02:25.888402067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:25.889325 containerd[1550]: time="2025-12-16T13:02:25.889187341Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=35234628" Dec 16 13:02:25.890067 containerd[1550]: time="2025-12-16T13:02:25.890039692Z" level=info msg="ImageCreate event name:\"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:25.891443 containerd[1550]: time="2025-12-16T13:02:25.891414772Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:25.891933 containerd[1550]: time="2025-12-16T13:02:25.891912352Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"35234482\" in 2.846962329s" Dec 16 13:02:25.892011 containerd[1550]: time="2025-12-16T13:02:25.891998814Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:aa1490366a77160b4cc8f9af82281ab7201ffda0882871f860e1eb1c4f825958\"" Dec 16 13:02:25.894168 containerd[1550]: time="2025-12-16T13:02:25.893955674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 13:02:25.906273 containerd[1550]: time="2025-12-16T13:02:25.906242926Z" level=info msg="CreateContainer within sandbox \"fac8119b765935e608a20c360060de785527d887bafce0056571c78faf2a4650\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 13:02:25.930837 containerd[1550]: time="2025-12-16T13:02:25.929033120Z" level=info msg="Container 2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:02:25.950087 containerd[1550]: time="2025-12-16T13:02:25.950047577Z" level=info msg="CreateContainer within sandbox \"fac8119b765935e608a20c360060de785527d887bafce0056571c78faf2a4650\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d\"" Dec 16 13:02:25.950662 containerd[1550]: time="2025-12-16T13:02:25.950589732Z" level=info msg="StartContainer for \"2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d\"" Dec 16 13:02:25.951812 containerd[1550]: time="2025-12-16T13:02:25.951795581Z" level=info msg="connecting to shim 2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d" address="unix:///run/containerd/s/62f68b2a1a65473b3faa9b814ec5a91cc31f6213d714e922ed5fc191f47087d8" protocol=ttrpc version=3 Dec 16 13:02:25.971926 systemd[1]: Started cri-containerd-2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d.scope - libcontainer container 2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d. 
Dec 16 13:02:26.021427 containerd[1550]: time="2025-12-16T13:02:26.021219594Z" level=info msg="StartContainer for \"2da65adbca75d603f3a218ec9d7dbfac494763d1b8830cd08f69d9f421ce6c6d\" returns successfully" Dec 16 13:02:26.694611 kubelet[2758]: E1216 13:02:26.694210 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:02:26.837747 kubelet[2758]: I1216 13:02:26.836170 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7bbc6b9997-ltsh4" podStartSLOduration=1.988062895 podStartE2EDuration="4.836153987s" podCreationTimestamp="2025-12-16 13:02:22 +0000 UTC" firstStartedPulling="2025-12-16 13:02:23.04466822 +0000 UTC m=+21.474427926" lastFinishedPulling="2025-12-16 13:02:25.892759313 +0000 UTC m=+24.322519018" observedRunningTime="2025-12-16 13:02:26.835291739 +0000 UTC m=+25.265051455" watchObservedRunningTime="2025-12-16 13:02:26.836153987 +0000 UTC m=+25.265913693" Dec 16 13:02:26.840525 kubelet[2758]: E1216 13:02:26.840497 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.840525 kubelet[2758]: W1216 13:02:26.840516 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.840605 kubelet[2758]: E1216 13:02:26.840534 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.840750 kubelet[2758]: E1216 13:02:26.840734 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.840750 kubelet[2758]: W1216 13:02:26.840746 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.840896 kubelet[2758]: E1216 13:02:26.840756 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.841052 kubelet[2758]: E1216 13:02:26.841030 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.841052 kubelet[2758]: W1216 13:02:26.841044 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.841052 kubelet[2758]: E1216 13:02:26.841052 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.841633 kubelet[2758]: E1216 13:02:26.841253 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.841633 kubelet[2758]: W1216 13:02:26.841262 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.841633 kubelet[2758]: E1216 13:02:26.841270 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.841633 kubelet[2758]: E1216 13:02:26.841455 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.841633 kubelet[2758]: W1216 13:02:26.841463 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.841633 kubelet[2758]: E1216 13:02:26.841471 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.841973 kubelet[2758]: E1216 13:02:26.841949 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.841973 kubelet[2758]: W1216 13:02:26.841965 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.841973 kubelet[2758]: E1216 13:02:26.841974 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.842163 kubelet[2758]: E1216 13:02:26.842148 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.842163 kubelet[2758]: W1216 13:02:26.842161 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.842223 kubelet[2758]: E1216 13:02:26.842169 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.842357 kubelet[2758]: E1216 13:02:26.842338 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.842357 kubelet[2758]: W1216 13:02:26.842352 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.842357 kubelet[2758]: E1216 13:02:26.842361 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.842695 kubelet[2758]: E1216 13:02:26.842548 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.842695 kubelet[2758]: W1216 13:02:26.842556 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.842695 kubelet[2758]: E1216 13:02:26.842563 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.842907 kubelet[2758]: E1216 13:02:26.842720 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.842907 kubelet[2758]: W1216 13:02:26.842728 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.842907 kubelet[2758]: E1216 13:02:26.842736 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.842920 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.843575 kubelet[2758]: W1216 13:02:26.842928 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.842936 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.843104 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.843575 kubelet[2758]: W1216 13:02:26.843113 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.843121 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.843334 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.843575 kubelet[2758]: W1216 13:02:26.843343 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.843351 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.843575 kubelet[2758]: E1216 13:02:26.843523 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.843802 kubelet[2758]: W1216 13:02:26.843531 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.843802 kubelet[2758]: E1216 13:02:26.843538 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.843999 kubelet[2758]: E1216 13:02:26.843982 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.843999 kubelet[2758]: W1216 13:02:26.843995 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.844065 kubelet[2758]: E1216 13:02:26.844004 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.887687 kubelet[2758]: E1216 13:02:26.887646 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.887687 kubelet[2758]: W1216 13:02:26.887669 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.887911 kubelet[2758]: E1216 13:02:26.887702 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.888027 kubelet[2758]: E1216 13:02:26.888002 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.888027 kubelet[2758]: W1216 13:02:26.888020 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.888852 kubelet[2758]: E1216 13:02:26.888029 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.888852 kubelet[2758]: E1216 13:02:26.888275 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.888852 kubelet[2758]: W1216 13:02:26.888300 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.888852 kubelet[2758]: E1216 13:02:26.888311 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.888852 kubelet[2758]: E1216 13:02:26.888621 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.888852 kubelet[2758]: W1216 13:02:26.888643 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.888852 kubelet[2758]: E1216 13:02:26.888677 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.889021 kubelet[2758]: E1216 13:02:26.888976 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.889021 kubelet[2758]: W1216 13:02:26.888985 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.889021 kubelet[2758]: E1216 13:02:26.888995 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.889212 kubelet[2758]: E1216 13:02:26.889189 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.889212 kubelet[2758]: W1216 13:02:26.889203 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.889212 kubelet[2758]: E1216 13:02:26.889211 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.889452 kubelet[2758]: E1216 13:02:26.889428 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.889452 kubelet[2758]: W1216 13:02:26.889442 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.889452 kubelet[2758]: E1216 13:02:26.889452 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.889762 kubelet[2758]: E1216 13:02:26.889737 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.889762 kubelet[2758]: W1216 13:02:26.889753 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.889860 kubelet[2758]: E1216 13:02:26.889761 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.889981 kubelet[2758]: E1216 13:02:26.889955 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.889981 kubelet[2758]: W1216 13:02:26.889975 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.890045 kubelet[2758]: E1216 13:02:26.889984 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.890139 kubelet[2758]: E1216 13:02:26.890113 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.890139 kubelet[2758]: W1216 13:02:26.890131 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.890139 kubelet[2758]: E1216 13:02:26.890139 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.890351 kubelet[2758]: E1216 13:02:26.890328 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.890351 kubelet[2758]: W1216 13:02:26.890341 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.890351 kubelet[2758]: E1216 13:02:26.890351 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.890534 kubelet[2758]: E1216 13:02:26.890512 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.890534 kubelet[2758]: W1216 13:02:26.890524 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.890534 kubelet[2758]: E1216 13:02:26.890532 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.890755 kubelet[2758]: E1216 13:02:26.890732 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.890755 kubelet[2758]: W1216 13:02:26.890745 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.890755 kubelet[2758]: E1216 13:02:26.890753 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.891131 kubelet[2758]: E1216 13:02:26.891105 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.891131 kubelet[2758]: W1216 13:02:26.891120 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.891131 kubelet[2758]: E1216 13:02:26.891128 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.891277 kubelet[2758]: E1216 13:02:26.891255 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.891277 kubelet[2758]: W1216 13:02:26.891269 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.891277 kubelet[2758]: E1216 13:02:26.891276 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.891429 kubelet[2758]: E1216 13:02:26.891408 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.891429 kubelet[2758]: W1216 13:02:26.891421 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.891429 kubelet[2758]: E1216 13:02:26.891428 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:26.891633 kubelet[2758]: E1216 13:02:26.891580 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.891633 kubelet[2758]: W1216 13:02:26.891593 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.891633 kubelet[2758]: E1216 13:02:26.891604 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:26.891788 kubelet[2758]: E1216 13:02:26.891762 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:26.891788 kubelet[2758]: W1216 13:02:26.891786 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:26.891872 kubelet[2758]: E1216 13:02:26.891794 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.730874 containerd[1550]: time="2025-12-16T13:02:27.730764759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:27.731982 containerd[1550]: time="2025-12-16T13:02:27.731777470Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4446754" Dec 16 13:02:27.732752 containerd[1550]: time="2025-12-16T13:02:27.732722345Z" level=info msg="ImageCreate event name:\"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:27.734850 containerd[1550]: time="2025-12-16T13:02:27.734807411Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:27.740973 containerd[1550]: time="2025-12-16T13:02:27.740941114Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5941314\" in 1.846949101s" Dec 16 13:02:27.741140 containerd[1550]: time="2025-12-16T13:02:27.741049037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:570719e9c34097019014ae2ad94edf4e523bc6892e77fb1c64c23e5b7f390fe5\"" Dec 16 13:02:27.745244 containerd[1550]: time="2025-12-16T13:02:27.745093153Z" level=info msg="CreateContainer within sandbox \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 13:02:27.765074 containerd[1550]: time="2025-12-16T13:02:27.764122685Z" level=info msg="Container 6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:02:27.789169 containerd[1550]: time="2025-12-16T13:02:27.789120325Z" level=info msg="CreateContainer within sandbox \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f\"" Dec 16 13:02:27.789911 containerd[1550]: time="2025-12-16T13:02:27.789883898Z" level=info msg="StartContainer for \"6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f\"" Dec 16 13:02:27.791557 containerd[1550]: time="2025-12-16T13:02:27.791523022Z" level=info msg="connecting to shim 6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f" address="unix:///run/containerd/s/ceae76b395b14e1d49485b3d4fc37e96059eadedfe65859d0f2d8d6f9eec47ab" protocol=ttrpc version=3 Dec 16 13:02:27.814989 systemd[1]: Started cri-containerd-6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f.scope - libcontainer container 6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f. 
Dec 16 13:02:27.819418 kubelet[2758]: I1216 13:02:27.819390 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:02:27.853955 kubelet[2758]: E1216 13:02:27.853903 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.853955 kubelet[2758]: W1216 13:02:27.853928 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.853955 kubelet[2758]: E1216 13:02:27.853946 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.854135 kubelet[2758]: E1216 13:02:27.854084 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.854135 kubelet[2758]: W1216 13:02:27.854092 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.854135 kubelet[2758]: E1216 13:02:27.854100 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.854508 kubelet[2758]: E1216 13:02:27.854217 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.854508 kubelet[2758]: W1216 13:02:27.854229 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.854508 kubelet[2758]: E1216 13:02:27.854237 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.854681 kubelet[2758]: E1216 13:02:27.854639 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.854681 kubelet[2758]: W1216 13:02:27.854648 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.854681 kubelet[2758]: E1216 13:02:27.854658 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.854916 kubelet[2758]: E1216 13:02:27.854876 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.854916 kubelet[2758]: W1216 13:02:27.854896 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.854916 kubelet[2758]: E1216 13:02:27.854906 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.855519 kubelet[2758]: E1216 13:02:27.855504 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.855519 kubelet[2758]: W1216 13:02:27.855517 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.855588 kubelet[2758]: E1216 13:02:27.855527 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.855742 kubelet[2758]: E1216 13:02:27.855710 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.855742 kubelet[2758]: W1216 13:02:27.855724 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.855742 kubelet[2758]: E1216 13:02:27.855732 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.855965 kubelet[2758]: E1216 13:02:27.855905 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.855965 kubelet[2758]: W1216 13:02:27.855924 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.855965 kubelet[2758]: E1216 13:02:27.855932 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.856453 kubelet[2758]: E1216 13:02:27.856431 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.856453 kubelet[2758]: W1216 13:02:27.856445 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.856453 kubelet[2758]: E1216 13:02:27.856455 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.856810 kubelet[2758]: E1216 13:02:27.856589 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.856810 kubelet[2758]: W1216 13:02:27.856598 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.856810 kubelet[2758]: E1216 13:02:27.856606 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.856810 kubelet[2758]: E1216 13:02:27.856715 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.856810 kubelet[2758]: W1216 13:02:27.856721 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.856810 kubelet[2758]: E1216 13:02:27.856729 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.857892 kubelet[2758]: E1216 13:02:27.857866 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.857892 kubelet[2758]: W1216 13:02:27.857882 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.857892 kubelet[2758]: E1216 13:02:27.857891 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.858446 kubelet[2758]: E1216 13:02:27.858026 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.858446 kubelet[2758]: W1216 13:02:27.858034 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.858446 kubelet[2758]: E1216 13:02:27.858041 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.858446 kubelet[2758]: E1216 13:02:27.858267 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.858446 kubelet[2758]: W1216 13:02:27.858276 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.858446 kubelet[2758]: E1216 13:02:27.858285 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.858706 kubelet[2758]: E1216 13:02:27.858691 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.858706 kubelet[2758]: W1216 13:02:27.858704 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.858765 kubelet[2758]: E1216 13:02:27.858713 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.898133 kubelet[2758]: E1216 13:02:27.898061 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.898696 containerd[1550]: time="2025-12-16T13:02:27.898420026Z" level=info msg="StartContainer for \"6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f\" returns successfully" Dec 16 13:02:27.898962 kubelet[2758]: W1216 13:02:27.898843 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.898962 kubelet[2758]: E1216 13:02:27.898871 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.900052 kubelet[2758]: E1216 13:02:27.899942 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.901129 kubelet[2758]: W1216 13:02:27.900277 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.901129 kubelet[2758]: E1216 13:02:27.900293 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.901129 kubelet[2758]: E1216 13:02:27.900870 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.901129 kubelet[2758]: W1216 13:02:27.900880 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.901129 kubelet[2758]: E1216 13:02:27.900890 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.902709 kubelet[2758]: E1216 13:02:27.902653 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.903322 kubelet[2758]: W1216 13:02:27.903003 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.903322 kubelet[2758]: E1216 13:02:27.903017 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.904274 kubelet[2758]: E1216 13:02:27.904248 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.905177 kubelet[2758]: W1216 13:02:27.904491 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.905177 kubelet[2758]: E1216 13:02:27.904504 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.905602 kubelet[2758]: E1216 13:02:27.905419 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.905602 kubelet[2758]: W1216 13:02:27.905431 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.905602 kubelet[2758]: E1216 13:02:27.905441 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.905953 kubelet[2758]: E1216 13:02:27.905926 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.906700 kubelet[2758]: W1216 13:02:27.906678 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.906789 kubelet[2758]: E1216 13:02:27.906705 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.907099 kubelet[2758]: E1216 13:02:27.907060 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.907099 kubelet[2758]: W1216 13:02:27.907073 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.907099 kubelet[2758]: E1216 13:02:27.907084 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.907525 kubelet[2758]: E1216 13:02:27.907447 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.907525 kubelet[2758]: W1216 13:02:27.907481 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.907525 kubelet[2758]: E1216 13:02:27.907490 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.907948 kubelet[2758]: E1216 13:02:27.907936 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.908228 kubelet[2758]: W1216 13:02:27.908212 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.908684 kubelet[2758]: E1216 13:02:27.908646 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.908945 kubelet[2758]: E1216 13:02:27.908922 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.909899 kubelet[2758]: W1216 13:02:27.909872 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.909899 kubelet[2758]: E1216 13:02:27.909895 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.910406 kubelet[2758]: E1216 13:02:27.910384 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.910406 kubelet[2758]: W1216 13:02:27.910402 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.910510 kubelet[2758]: E1216 13:02:27.910412 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.910535 kubelet[2758]: E1216 13:02:27.910530 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.910592 kubelet[2758]: W1216 13:02:27.910537 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.910592 kubelet[2758]: E1216 13:02:27.910544 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.910730 kubelet[2758]: E1216 13:02:27.910702 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.910730 kubelet[2758]: W1216 13:02:27.910711 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.910730 kubelet[2758]: E1216 13:02:27.910719 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.911765 kubelet[2758]: E1216 13:02:27.911746 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.911765 kubelet[2758]: W1216 13:02:27.911760 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.911878 kubelet[2758]: E1216 13:02:27.911780 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.912128 kubelet[2758]: E1216 13:02:27.911982 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.912128 kubelet[2758]: W1216 13:02:27.912001 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.912128 kubelet[2758]: E1216 13:02:27.912012 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.912300 kubelet[2758]: E1216 13:02:27.912288 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.912390 kubelet[2758]: W1216 13:02:27.912359 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.912390 kubelet[2758]: E1216 13:02:27.912376 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 13:02:27.912850 kubelet[2758]: E1216 13:02:27.912780 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 13:02:27.913725 kubelet[2758]: W1216 13:02:27.913687 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 13:02:27.913725 kubelet[2758]: E1216 13:02:27.913705 2758 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 13:02:27.915090 systemd[1]: cri-containerd-6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f.scope: Deactivated successfully. Dec 16 13:02:27.936078 containerd[1550]: time="2025-12-16T13:02:27.936026735Z" level=info msg="received container exit event container_id:\"6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f\" id:\"6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f\" pid:3469 exited_at:{seconds:1765890147 nanos:916375579}" Dec 16 13:02:27.964682 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6eca150e4caec5224fd716c474c23e15fff83a038a708c0e1c7dbc2f0f620c3f-rootfs.mount: Deactivated successfully. 
Dec 16 13:02:28.694373 kubelet[2758]: E1216 13:02:28.694299 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb"
Dec 16 13:02:28.825048 containerd[1550]: time="2025-12-16T13:02:28.824968293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\""
Dec 16 13:02:30.694927 kubelet[2758]: E1216 13:02:30.694868 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb"
Dec 16 13:02:32.694739 kubelet[2758]: E1216 13:02:32.694692 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb"
Dec 16 13:02:33.193421 kubelet[2758]: I1216 13:02:33.193390 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 16 13:02:33.333977 containerd[1550]: time="2025-12-16T13:02:33.333922401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:02:33.348333 containerd[1550]: time="2025-12-16T13:02:33.334751614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=70446859"
Dec 16 13:02:33.348333 containerd[1550]: time="2025-12-16T13:02:33.337142449Z" level=info msg="ImageCreate event name:\"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:02:33.348715 containerd[1550]: time="2025-12-16T13:02:33.339442371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"71941459\" in 4.514278389s"
Dec 16 13:02:33.348715 containerd[1550]: time="2025-12-16T13:02:33.348626273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:24e1e7377c738d4080eb462a29e2c6756d383d8d25ad87b7f49165581f20c3cd\""
Dec 16 13:02:33.348871 containerd[1550]: time="2025-12-16T13:02:33.348839585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 13:02:33.354237 containerd[1550]: time="2025-12-16T13:02:33.354192471Z" level=info msg="CreateContainer within sandbox \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Dec 16 13:02:33.364837 containerd[1550]: time="2025-12-16T13:02:33.364678035Z" level=info msg="Container 067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:02:33.382194 containerd[1550]: time="2025-12-16T13:02:33.382160803Z" level=info msg="CreateContainer within sandbox \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a\""
Dec 16 13:02:33.383262 containerd[1550]: time="2025-12-16T13:02:33.383218475Z" level=info msg="StartContainer for \"067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a\""
Dec 16 13:02:33.385317 containerd[1550]: time="2025-12-16T13:02:33.385280650Z" level=info msg="connecting to shim 067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a" address="unix:///run/containerd/s/ceae76b395b14e1d49485b3d4fc37e96059eadedfe65859d0f2d8d6f9eec47ab" protocol=ttrpc version=3
Dec 16 13:02:33.410019 systemd[1]: Started cri-containerd-067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a.scope - libcontainer container 067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a.
Dec 16 13:02:33.461165 containerd[1550]: time="2025-12-16T13:02:33.461027067Z" level=info msg="StartContainer for \"067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a\" returns successfully"
Dec 16 13:02:33.869207 systemd[1]: cri-containerd-067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a.scope: Deactivated successfully.
Dec 16 13:02:33.869406 systemd[1]: cri-containerd-067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a.scope: Consumed 400ms CPU time, 165.5M memory peak, 6.2M read from disk, 171.3M written to disk.
Dec 16 13:02:33.947561 containerd[1550]: time="2025-12-16T13:02:33.947530116Z" level=info msg="received container exit event container_id:\"067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a\" id:\"067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a\" pid:3561 exited_at:{seconds:1765890153 nanos:947294682}"
Dec 16 13:02:33.948539 kubelet[2758]: I1216 13:02:33.948519 2758 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Dec 16 13:02:33.995502 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-067377f49a84704a52f036fac9663c1445a9dfa93f53ed06250b28428ce1662a-rootfs.mount: Deactivated successfully.
Dec 16 13:02:34.008425 systemd[1]: Created slice kubepods-burstable-podf77669b7_3f9d_436f_969d_f70eca9611c4.slice - libcontainer container kubepods-burstable-podf77669b7_3f9d_436f_969d_f70eca9611c4.slice.
Dec 16 13:02:34.024497 systemd[1]: Created slice kubepods-besteffort-podfc7f9104_6597_40ee_97eb_e334f4aba79b.slice - libcontainer container kubepods-besteffort-podfc7f9104_6597_40ee_97eb_e334f4aba79b.slice.
Dec 16 13:02:34.039851 kubelet[2758]: I1216 13:02:34.039541 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f77669b7-3f9d-436f-969d-f70eca9611c4-config-volume\") pod \"coredns-66bc5c9577-lbczh\" (UID: \"f77669b7-3f9d-436f-969d-f70eca9611c4\") " pod="kube-system/coredns-66bc5c9577-lbczh"
Dec 16 13:02:34.041125 kubelet[2758]: I1216 13:02:34.040788 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4rg\" (UniqueName: \"kubernetes.io/projected/f77669b7-3f9d-436f-969d-f70eca9611c4-kube-api-access-df4rg\") pod \"coredns-66bc5c9577-lbczh\" (UID: \"f77669b7-3f9d-436f-969d-f70eca9611c4\") " pod="kube-system/coredns-66bc5c9577-lbczh"
Dec 16 13:02:34.042115 systemd[1]: Created slice kubepods-burstable-pod436357c5_9bd9_4f17_88f8_cc94151199a4.slice - libcontainer container kubepods-burstable-pod436357c5_9bd9_4f17_88f8_cc94151199a4.slice.
Dec 16 13:02:34.048909 systemd[1]: Created slice kubepods-besteffort-pod71bd25bd_53fb_4224_a114_8010f8dec502.slice - libcontainer container kubepods-besteffort-pod71bd25bd_53fb_4224_a114_8010f8dec502.slice.
Dec 16 13:02:34.055320 systemd[1]: Created slice kubepods-besteffort-pod012a187d_90e1_44f7_a642_f3d8eed099d6.slice - libcontainer container kubepods-besteffort-pod012a187d_90e1_44f7_a642_f3d8eed099d6.slice.
Dec 16 13:02:34.061520 systemd[1]: Created slice kubepods-besteffort-pode556d54d_017a_4277_b8ea_32b10093992b.slice - libcontainer container kubepods-besteffort-pode556d54d_017a_4277_b8ea_32b10093992b.slice.
Dec 16 13:02:34.067791 systemd[1]: Created slice kubepods-besteffort-podc38f2008_1a82_45d6_8890_7ece4b855117.slice - libcontainer container kubepods-besteffort-podc38f2008_1a82_45d6_8890_7ece4b855117.slice.
Dec 16 13:02:34.142058 kubelet[2758]: I1216 13:02:34.141934 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c38f2008-1a82-45d6-8890-7ece4b855117-config\") pod \"goldmane-7c778bb748-2pnsr\" (UID: \"c38f2008-1a82-45d6-8890-7ece4b855117\") " pod="calico-system/goldmane-7c778bb748-2pnsr"
Dec 16 13:02:34.142058 kubelet[2758]: I1216 13:02:34.141973 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71bd25bd-53fb-4224-a114-8010f8dec502-tigera-ca-bundle\") pod \"calico-kube-controllers-5f8d9d9fb6-xwnn4\" (UID: \"71bd25bd-53fb-4224-a114-8010f8dec502\") " pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4"
Dec 16 13:02:34.142058 kubelet[2758]: I1216 13:02:34.141992 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlfr\" (UniqueName: \"kubernetes.io/projected/71bd25bd-53fb-4224-a114-8010f8dec502-kube-api-access-lqlfr\") pod \"calico-kube-controllers-5f8d9d9fb6-xwnn4\" (UID: \"71bd25bd-53fb-4224-a114-8010f8dec502\") " pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4"
Dec 16 13:02:34.142058 kubelet[2758]: I1216 13:02:34.142012 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrpn\" (UniqueName: \"kubernetes.io/projected/c38f2008-1a82-45d6-8890-7ece4b855117-kube-api-access-ltrpn\") pod \"goldmane-7c778bb748-2pnsr\" (UID: \"c38f2008-1a82-45d6-8890-7ece4b855117\") " pod="calico-system/goldmane-7c778bb748-2pnsr"
Dec 16 13:02:34.142058 kubelet[2758]: I1216 13:02:34.142030 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e556d54d-017a-4277-b8ea-32b10093992b-calico-apiserver-certs\") pod \"calico-apiserver-57d769ccb6-5hqtj\" (UID: \"e556d54d-017a-4277-b8ea-32b10093992b\") " pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj"
Dec 16 13:02:34.143838 kubelet[2758]: I1216 13:02:34.142075 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxsrx\" (UniqueName: \"kubernetes.io/projected/e556d54d-017a-4277-b8ea-32b10093992b-kube-api-access-rxsrx\") pod \"calico-apiserver-57d769ccb6-5hqtj\" (UID: \"e556d54d-017a-4277-b8ea-32b10093992b\") " pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj"
Dec 16 13:02:34.143838 kubelet[2758]: I1216 13:02:34.142090 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fc7f9104-6597-40ee-97eb-e334f4aba79b-calico-apiserver-certs\") pod \"calico-apiserver-57d769ccb6-w5zs6\" (UID: \"fc7f9104-6597-40ee-97eb-e334f4aba79b\") " pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6"
Dec 16 13:02:34.143838 kubelet[2758]: I1216 13:02:34.142115 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c38f2008-1a82-45d6-8890-7ece4b855117-goldmane-key-pair\") pod \"goldmane-7c778bb748-2pnsr\" (UID: \"c38f2008-1a82-45d6-8890-7ece4b855117\") " pod="calico-system/goldmane-7c778bb748-2pnsr"
Dec 16 13:02:34.143838 kubelet[2758]: I1216 13:02:34.142128 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgxxd\" (UniqueName: \"kubernetes.io/projected/012a187d-90e1-44f7-a642-f3d8eed099d6-kube-api-access-fgxxd\") pod \"whisker-7f4cd685db-kv8rm\" (UID: \"012a187d-90e1-44f7-a642-f3d8eed099d6\") " pod="calico-system/whisker-7f4cd685db-kv8rm"
Dec 16 13:02:34.143838 kubelet[2758]: I1216 13:02:34.142150 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/436357c5-9bd9-4f17-88f8-cc94151199a4-config-volume\") pod \"coredns-66bc5c9577-vhvvf\" (UID: \"436357c5-9bd9-4f17-88f8-cc94151199a4\") " pod="kube-system/coredns-66bc5c9577-vhvvf"
Dec 16 13:02:34.143955 kubelet[2758]: I1216 13:02:34.142176 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c38f2008-1a82-45d6-8890-7ece4b855117-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-2pnsr\" (UID: \"c38f2008-1a82-45d6-8890-7ece4b855117\") " pod="calico-system/goldmane-7c778bb748-2pnsr"
Dec 16 13:02:34.143955 kubelet[2758]: I1216 13:02:34.142190 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwj8\" (UniqueName: \"kubernetes.io/projected/fc7f9104-6597-40ee-97eb-e334f4aba79b-kube-api-access-xdwj8\") pod \"calico-apiserver-57d769ccb6-w5zs6\" (UID: \"fc7f9104-6597-40ee-97eb-e334f4aba79b\") " pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6"
Dec 16 13:02:34.143955 kubelet[2758]: I1216 13:02:34.142213 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrhns\" (UniqueName: \"kubernetes.io/projected/436357c5-9bd9-4f17-88f8-cc94151199a4-kube-api-access-jrhns\") pod \"coredns-66bc5c9577-vhvvf\" (UID: \"436357c5-9bd9-4f17-88f8-cc94151199a4\") " pod="kube-system/coredns-66bc5c9577-vhvvf"
Dec 16 13:02:34.143955 kubelet[2758]: I1216 13:02:34.142226 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-backend-key-pair\") pod \"whisker-7f4cd685db-kv8rm\" (UID: \"012a187d-90e1-44f7-a642-f3d8eed099d6\") " pod="calico-system/whisker-7f4cd685db-kv8rm"
Dec 16 13:02:34.143955 kubelet[2758]: I1216 13:02:34.142240 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-ca-bundle\") pod \"whisker-7f4cd685db-kv8rm\" (UID: \"012a187d-90e1-44f7-a642-f3d8eed099d6\") " pod="calico-system/whisker-7f4cd685db-kv8rm"
Dec 16 13:02:34.334380 containerd[1550]: time="2025-12-16T13:02:34.334314682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-lbczh,Uid:f77669b7-3f9d-436f-969d-f70eca9611c4,Namespace:kube-system,Attempt:0,}"
Dec 16 13:02:34.335273 containerd[1550]: time="2025-12-16T13:02:34.335237471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-w5zs6,Uid:fc7f9104-6597-40ee-97eb-e334f4aba79b,Namespace:calico-apiserver,Attempt:0,}"
Dec 16 13:02:34.354336 containerd[1550]: time="2025-12-16T13:02:34.354144059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f8d9d9fb6-xwnn4,Uid:71bd25bd-53fb-4224-a114-8010f8dec502,Namespace:calico-system,Attempt:0,}"
Dec 16 13:02:34.355275 containerd[1550]: time="2025-12-16T13:02:34.355235325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vhvvf,Uid:436357c5-9bd9-4f17-88f8-cc94151199a4,Namespace:kube-system,Attempt:0,}"
Dec 16 13:02:34.361250 containerd[1550]: time="2025-12-16T13:02:34.361119848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f4cd685db-kv8rm,Uid:012a187d-90e1-44f7-a642-f3d8eed099d6,Namespace:calico-system,Attempt:0,}"
Dec 16 13:02:34.395005 containerd[1550]: time="2025-12-16T13:02:34.394915330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-2pnsr,Uid:c38f2008-1a82-45d6-8890-7ece4b855117,Namespace:calico-system,Attempt:0,}"
Dec 16 13:02:34.409428 containerd[1550]: time="2025-12-16T13:02:34.407263427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-5hqtj,Uid:e556d54d-017a-4277-b8ea-32b10093992b,Namespace:calico-apiserver,Attempt:0,}"
Dec 16 13:02:34.542898 containerd[1550]: time="2025-12-16T13:02:34.542859715Z" level=error msg="Failed to destroy network for sandbox \"bde3853d5d9f76a67469cc41180d81ae56ca4eccaf120cdb635c7cd04b095314\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.546258 systemd[1]: run-netns-cni\x2d8ea10fba\x2de194\x2d7170\x2d2a5f\x2d2d929315d1b9.mount: Deactivated successfully.
Dec 16 13:02:34.553375 containerd[1550]: time="2025-12-16T13:02:34.553333902Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-w5zs6,Uid:fc7f9104-6597-40ee-97eb-e334f4aba79b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde3853d5d9f76a67469cc41180d81ae56ca4eccaf120cdb635c7cd04b095314\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.556725 kubelet[2758]: E1216 13:02:34.556332 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde3853d5d9f76a67469cc41180d81ae56ca4eccaf120cdb635c7cd04b095314\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.556725 kubelet[2758]: E1216 13:02:34.556395 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde3853d5d9f76a67469cc41180d81ae56ca4eccaf120cdb635c7cd04b095314\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6"
Dec 16 13:02:34.556725 kubelet[2758]: E1216 13:02:34.556416 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bde3853d5d9f76a67469cc41180d81ae56ca4eccaf120cdb635c7cd04b095314\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6"
Dec 16 13:02:34.556890 kubelet[2758]: E1216 13:02:34.556469 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57d769ccb6-w5zs6_calico-apiserver(fc7f9104-6597-40ee-97eb-e334f4aba79b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57d769ccb6-w5zs6_calico-apiserver(fc7f9104-6597-40ee-97eb-e334f4aba79b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bde3853d5d9f76a67469cc41180d81ae56ca4eccaf120cdb635c7cd04b095314\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b"
Dec 16 13:02:34.558129 containerd[1550]: time="2025-12-16T13:02:34.558091002Z" level=error msg="Failed to destroy network for sandbox \"1398c6dbb88830ae8b14f86fb16c65dd55d6d18693a0da453c8f95880b51204d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.560955 systemd[1]: run-netns-cni\x2d0975d8a0\x2d2372\x2df7ab\x2d15a3\x2d8b8a155af9e9.mount: Deactivated successfully.
Dec 16 13:02:34.568179 containerd[1550]: time="2025-12-16T13:02:34.568147072Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f8d9d9fb6-xwnn4,Uid:71bd25bd-53fb-4224-a114-8010f8dec502,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1398c6dbb88830ae8b14f86fb16c65dd55d6d18693a0da453c8f95880b51204d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.568514 kubelet[2758]: E1216 13:02:34.568486 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1398c6dbb88830ae8b14f86fb16c65dd55d6d18693a0da453c8f95880b51204d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.568626 kubelet[2758]: E1216 13:02:34.568602 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1398c6dbb88830ae8b14f86fb16c65dd55d6d18693a0da453c8f95880b51204d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4"
Dec 16 13:02:34.568689 kubelet[2758]: E1216 13:02:34.568676 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1398c6dbb88830ae8b14f86fb16c65dd55d6d18693a0da453c8f95880b51204d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4"
Dec 16 13:02:34.568807 kubelet[2758]: E1216 13:02:34.568786 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5f8d9d9fb6-xwnn4_calico-system(71bd25bd-53fb-4224-a114-8010f8dec502)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5f8d9d9fb6-xwnn4_calico-system(71bd25bd-53fb-4224-a114-8010f8dec502)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1398c6dbb88830ae8b14f86fb16c65dd55d6d18693a0da453c8f95880b51204d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502"
Dec 16 13:02:34.583477 containerd[1550]: time="2025-12-16T13:02:34.583408286Z" level=error msg="Failed to destroy network for sandbox \"ae772302dcf03d8eae34343619c9ba6294b0f1ccfd38421794989d4b47819e18\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.585243 containerd[1550]: time="2025-12-16T13:02:34.585212513Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-2pnsr,Uid:c38f2008-1a82-45d6-8890-7ece4b855117,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae772302dcf03d8eae34343619c9ba6294b0f1ccfd38421794989d4b47819e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.585832 kubelet[2758]: E1216 13:02:34.585744 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae772302dcf03d8eae34343619c9ba6294b0f1ccfd38421794989d4b47819e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.585891 kubelet[2758]: E1216 13:02:34.585855 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae772302dcf03d8eae34343619c9ba6294b0f1ccfd38421794989d4b47819e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-2pnsr"
Dec 16 13:02:34.585891 kubelet[2758]: E1216 13:02:34.585876 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae772302dcf03d8eae34343619c9ba6294b0f1ccfd38421794989d4b47819e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-2pnsr"
Dec 16 13:02:34.586057 kubelet[2758]: E1216 13:02:34.586031 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-2pnsr_calico-system(c38f2008-1a82-45d6-8890-7ece4b855117)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-2pnsr_calico-system(c38f2008-1a82-45d6-8890-7ece4b855117)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae772302dcf03d8eae34343619c9ba6294b0f1ccfd38421794989d4b47819e18\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117"
Dec 16 13:02:34.586933 containerd[1550]: time="2025-12-16T13:02:34.586909509Z" level=error msg="Failed to destroy network for sandbox \"8ffebbd58ae9d92bcfcc71d2f7b46c156628e1695208e7f77bb437bf5e60cf4e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.589178 containerd[1550]: time="2025-12-16T13:02:34.589138298Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vhvvf,Uid:436357c5-9bd9-4f17-88f8-cc94151199a4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ffebbd58ae9d92bcfcc71d2f7b46c156628e1695208e7f77bb437bf5e60cf4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.589377 kubelet[2758]: E1216 13:02:34.589350 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ffebbd58ae9d92bcfcc71d2f7b46c156628e1695208e7f77bb437bf5e60cf4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.589475 kubelet[2758]: E1216 13:02:34.589461 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ffebbd58ae9d92bcfcc71d2f7b46c156628e1695208e7f77bb437bf5e60cf4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-vhvvf"
Dec 16 13:02:34.589569 kubelet[2758]: E1216 13:02:34.589537 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8ffebbd58ae9d92bcfcc71d2f7b46c156628e1695208e7f77bb437bf5e60cf4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-vhvvf"
Dec 16 13:02:34.589802 kubelet[2758]: E1216 13:02:34.589651 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-vhvvf_kube-system(436357c5-9bd9-4f17-88f8-cc94151199a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-vhvvf_kube-system(436357c5-9bd9-4f17-88f8-cc94151199a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8ffebbd58ae9d92bcfcc71d2f7b46c156628e1695208e7f77bb437bf5e60cf4e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-vhvvf" podUID="436357c5-9bd9-4f17-88f8-cc94151199a4"
Dec 16 13:02:34.593391 containerd[1550]: time="2025-12-16T13:02:34.593349499Z" level=error msg="Failed to destroy network for sandbox \"ba8ccc975f70b638817804fcc242923780118840647f762096f4e33123828d9c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.593572 containerd[1550]: time="2025-12-16T13:02:34.593450770Z" level=error msg="Failed to destroy network for sandbox \"b679282cdef2f57a51df74d7a3131cc019634544e76f5ee697f355c0c040a4aa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.594352 containerd[1550]: time="2025-12-16T13:02:34.594326058Z" level=error msg="Failed to destroy network for sandbox \"24aa2374f81530f83f529933ad6f5dc80a46c60bf6d0e370313e1da2e4b6476c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.594688 containerd[1550]: time="2025-12-16T13:02:34.594621875Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-5hqtj,Uid:e556d54d-017a-4277-b8ea-32b10093992b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8ccc975f70b638817804fcc242923780118840647f762096f4e33123828d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.594853 kubelet[2758]: E1216 13:02:34.594766 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8ccc975f70b638817804fcc242923780118840647f762096f4e33123828d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.594908 kubelet[2758]: E1216 13:02:34.594890 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8ccc975f70b638817804fcc242923780118840647f762096f4e33123828d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj"
Dec 16 13:02:34.594992 kubelet[2758]: E1216 13:02:34.594908 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8ccc975f70b638817804fcc242923780118840647f762096f4e33123828d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj"
Dec 16 13:02:34.595085 kubelet[2758]: E1216 13:02:34.595055 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57d769ccb6-5hqtj_calico-apiserver(e556d54d-017a-4277-b8ea-32b10093992b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57d769ccb6-5hqtj_calico-apiserver(e556d54d-017a-4277-b8ea-32b10093992b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba8ccc975f70b638817804fcc242923780118840647f762096f4e33123828d9c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b"
Dec 16 13:02:34.595394 containerd[1550]: time="2025-12-16T13:02:34.595373311Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-lbczh,Uid:f77669b7-3f9d-436f-969d-f70eca9611c4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b679282cdef2f57a51df74d7a3131cc019634544e76f5ee697f355c0c040a4aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.595698 kubelet[2758]: E1216 13:02:34.595677 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b679282cdef2f57a51df74d7a3131cc019634544e76f5ee697f355c0c040a4aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.595749 kubelet[2758]: E1216 13:02:34.595732 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b679282cdef2f57a51df74d7a3131cc019634544e76f5ee697f355c0c040a4aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-lbczh"
Dec 16 13:02:34.595749 kubelet[2758]: E1216 13:02:34.595745 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b679282cdef2f57a51df74d7a3131cc019634544e76f5ee697f355c0c040a4aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-lbczh"
Dec 16 13:02:34.595971 kubelet[2758]: E1216 13:02:34.595947 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-lbczh_kube-system(f77669b7-3f9d-436f-969d-f70eca9611c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-lbczh_kube-system(f77669b7-3f9d-436f-969d-f70eca9611c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b679282cdef2f57a51df74d7a3131cc019634544e76f5ee697f355c0c040a4aa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-lbczh" podUID="f77669b7-3f9d-436f-969d-f70eca9611c4"
Dec 16 13:02:34.596338 containerd[1550]: time="2025-12-16T13:02:34.596303933Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f4cd685db-kv8rm,Uid:012a187d-90e1-44f7-a642-f3d8eed099d6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"24aa2374f81530f83f529933ad6f5dc80a46c60bf6d0e370313e1da2e4b6476c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.596480 kubelet[2758]: E1216 13:02:34.596464 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24aa2374f81530f83f529933ad6f5dc80a46c60bf6d0e370313e1da2e4b6476c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 13:02:34.596538 kubelet[2758]: E1216 13:02:34.596487 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24aa2374f81530f83f529933ad6f5dc80a46c60bf6d0e370313e1da2e4b6476c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f4cd685db-kv8rm"
Dec 16 13:02:34.596538 kubelet[2758]: E1216 13:02:34.596498 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24aa2374f81530f83f529933ad6f5dc80a46c60bf6d0e370313e1da2e4b6476c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f4cd685db-kv8rm"
Dec 16 13:02:34.596538 kubelet[2758]: E1216 13:02:34.596524 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7f4cd685db-kv8rm_calico-system(012a187d-90e1-44f7-a642-f3d8eed099d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7f4cd685db-kv8rm_calico-system(012a187d-90e1-44f7-a642-f3d8eed099d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"24aa2374f81530f83f529933ad6f5dc80a46c60bf6d0e370313e1da2e4b6476c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7f4cd685db-kv8rm" podUID="012a187d-90e1-44f7-a642-f3d8eed099d6"
Dec 16 13:02:34.699584 systemd[1]: Created slice kubepods-besteffort-pode441e8ab_0689_4cb9_bcbd_f66c0c3dbffb.slice - libcontainer container kubepods-besteffort-pode441e8ab_0689_4cb9_bcbd_f66c0c3dbffb.slice.
Dec 16 13:02:34.703746 containerd[1550]: time="2025-12-16T13:02:34.703708925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ffzww,Uid:e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb,Namespace:calico-system,Attempt:0,}" Dec 16 13:02:34.758378 containerd[1550]: time="2025-12-16T13:02:34.758303119Z" level=error msg="Failed to destroy network for sandbox \"ce9bc6876fecaccab3a0e6907b41a691a1879ff18a0875a39c72f6499eef1d24\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:02:34.759561 containerd[1550]: time="2025-12-16T13:02:34.759515693Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ffzww,Uid:e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce9bc6876fecaccab3a0e6907b41a691a1879ff18a0875a39c72f6499eef1d24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:02:34.759875 kubelet[2758]: E1216 13:02:34.759748 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce9bc6876fecaccab3a0e6907b41a691a1879ff18a0875a39c72f6499eef1d24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 13:02:34.759875 kubelet[2758]: E1216 13:02:34.759804 2758 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce9bc6876fecaccab3a0e6907b41a691a1879ff18a0875a39c72f6499eef1d24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:34.759996 kubelet[2758]: E1216 13:02:34.759965 2758 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce9bc6876fecaccab3a0e6907b41a691a1879ff18a0875a39c72f6499eef1d24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ffzww" Dec 16 13:02:34.760072 kubelet[2758]: E1216 13:02:34.760036 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ce9bc6876fecaccab3a0e6907b41a691a1879ff18a0875a39c72f6499eef1d24\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:02:34.848411 containerd[1550]: time="2025-12-16T13:02:34.848307618Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 13:02:35.366803 systemd[1]: run-netns-cni\x2dd4dd2f7f\x2d07c8\x2dbbd4\x2d59d5\x2dc20cc7f33919.mount: Deactivated successfully. Dec 16 13:02:35.366916 systemd[1]: run-netns-cni\x2d85d08304\x2dfe2b\x2dedcc\x2d28c0\x2d003627026a95.mount: Deactivated successfully. Dec 16 13:02:35.366969 systemd[1]: run-netns-cni\x2dbcf7121c\x2d91a5\x2db018\x2d8ab0\x2dfdb8f8d6da58.mount: Deactivated successfully. 
Dec 16 13:02:35.367076 systemd[1]: run-netns-cni\x2d449f56d3\x2d9a83\x2dee7a\x2d9f23\x2d9dcda2d939c2.mount: Deactivated successfully. Dec 16 13:02:35.367148 systemd[1]: run-netns-cni\x2de74de910\x2db017\x2dfe8b\x2d6e74\x2d3fa5df691637.mount: Deactivated successfully. Dec 16 13:02:42.013862 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3290940761.mount: Deactivated successfully. Dec 16 13:02:42.039382 containerd[1550]: time="2025-12-16T13:02:42.033376615Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:42.040278 containerd[1550]: time="2025-12-16T13:02:42.035157162Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=156883675" Dec 16 13:02:42.051171 containerd[1550]: time="2025-12-16T13:02:42.050346530Z" level=info msg="ImageCreate event name:\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:42.052073 containerd[1550]: time="2025-12-16T13:02:42.052047769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 13:02:42.052471 containerd[1550]: time="2025-12-16T13:02:42.052452199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"156883537\" in 7.204089828s" Dec 16 13:02:42.052539 containerd[1550]: time="2025-12-16T13:02:42.052527951Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference 
\"sha256:833e8e11d9dc187377eab6f31e275114a6b0f8f0afc3bf578a2a00507e85afc9\"" Dec 16 13:02:42.079079 containerd[1550]: time="2025-12-16T13:02:42.079043121Z" level=info msg="CreateContainer within sandbox \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 13:02:42.099854 containerd[1550]: time="2025-12-16T13:02:42.099628484Z" level=info msg="Container 484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:02:42.101713 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3509967300.mount: Deactivated successfully. Dec 16 13:02:42.112668 containerd[1550]: time="2025-12-16T13:02:42.112624438Z" level=info msg="CreateContainer within sandbox \"2cc15975c36dff346558121152edc1ca9339d3a05c744bd68a4fa829b7a15121\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385\"" Dec 16 13:02:42.113456 containerd[1550]: time="2025-12-16T13:02:42.113349891Z" level=info msg="StartContainer for \"484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385\"" Dec 16 13:02:42.117288 containerd[1550]: time="2025-12-16T13:02:42.117267417Z" level=info msg="connecting to shim 484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385" address="unix:///run/containerd/s/ceae76b395b14e1d49485b3d4fc37e96059eadedfe65859d0f2d8d6f9eec47ab" protocol=ttrpc version=3 Dec 16 13:02:42.186038 systemd[1]: Started cri-containerd-484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385.scope - libcontainer container 484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385. Dec 16 13:02:42.286444 containerd[1550]: time="2025-12-16T13:02:42.285081179Z" level=info msg="StartContainer for \"484503cabb05e32371ee4bc9081e9e402ec5dc8abc8cf222175b4d5ba9c4c385\" returns successfully" Dec 16 13:02:42.482681 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. Dec 16 13:02:42.486917 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 13:02:42.803331 kubelet[2758]: I1216 13:02:42.803282 2758 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgxxd\" (UniqueName: \"kubernetes.io/projected/012a187d-90e1-44f7-a642-f3d8eed099d6-kube-api-access-fgxxd\") pod \"012a187d-90e1-44f7-a642-f3d8eed099d6\" (UID: \"012a187d-90e1-44f7-a642-f3d8eed099d6\") " Dec 16 13:02:42.803331 kubelet[2758]: I1216 13:02:42.803319 2758 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-backend-key-pair\") pod \"012a187d-90e1-44f7-a642-f3d8eed099d6\" (UID: \"012a187d-90e1-44f7-a642-f3d8eed099d6\") " Dec 16 13:02:42.805022 kubelet[2758]: I1216 13:02:42.803439 2758 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-ca-bundle\") pod \"012a187d-90e1-44f7-a642-f3d8eed099d6\" (UID: \"012a187d-90e1-44f7-a642-f3d8eed099d6\") " Dec 16 13:02:42.805022 kubelet[2758]: I1216 13:02:42.803867 2758 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "012a187d-90e1-44f7-a642-f3d8eed099d6" (UID: "012a187d-90e1-44f7-a642-f3d8eed099d6"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 13:02:42.810079 kubelet[2758]: I1216 13:02:42.810040 2758 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "012a187d-90e1-44f7-a642-f3d8eed099d6" (UID: "012a187d-90e1-44f7-a642-f3d8eed099d6"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 13:02:42.810800 kubelet[2758]: I1216 13:02:42.810283 2758 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/012a187d-90e1-44f7-a642-f3d8eed099d6-kube-api-access-fgxxd" (OuterVolumeSpecName: "kube-api-access-fgxxd") pod "012a187d-90e1-44f7-a642-f3d8eed099d6" (UID: "012a187d-90e1-44f7-a642-f3d8eed099d6"). InnerVolumeSpecName "kube-api-access-fgxxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 13:02:42.881884 systemd[1]: Removed slice kubepods-besteffort-pod012a187d_90e1_44f7_a642_f3d8eed099d6.slice - libcontainer container kubepods-besteffort-pod012a187d_90e1_44f7_a642_f3d8eed099d6.slice. 
Dec 16 13:02:42.905836 kubelet[2758]: I1216 13:02:42.904528 2758 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fgxxd\" (UniqueName: \"kubernetes.io/projected/012a187d-90e1-44f7-a642-f3d8eed099d6-kube-api-access-fgxxd\") on node \"ci-4459-2-2-4-07f930e259\" DevicePath \"\"" Dec 16 13:02:42.905984 kubelet[2758]: I1216 13:02:42.905971 2758 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-backend-key-pair\") on node \"ci-4459-2-2-4-07f930e259\" DevicePath \"\"" Dec 16 13:02:42.906047 kubelet[2758]: I1216 13:02:42.906038 2758 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/012a187d-90e1-44f7-a642-f3d8eed099d6-whisker-ca-bundle\") on node \"ci-4459-2-2-4-07f930e259\" DevicePath \"\"" Dec 16 13:02:42.922474 kubelet[2758]: I1216 13:02:42.922413 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-96pm6" podStartSLOduration=2.033626903 podStartE2EDuration="20.922399377s" podCreationTimestamp="2025-12-16 13:02:22 +0000 UTC" firstStartedPulling="2025-12-16 13:02:23.164395551 +0000 UTC m=+21.594155257" lastFinishedPulling="2025-12-16 13:02:42.053168034 +0000 UTC m=+40.482927731" observedRunningTime="2025-12-16 13:02:42.921021587 +0000 UTC m=+41.350781303" watchObservedRunningTime="2025-12-16 13:02:42.922399377 +0000 UTC m=+41.352159084" Dec 16 13:02:42.997995 systemd[1]: Created slice kubepods-besteffort-pode5c12796_2fe4_496b_bdb1_f696ef061ec9.slice - libcontainer container kubepods-besteffort-pode5c12796_2fe4_496b_bdb1_f696ef061ec9.slice. Dec 16 13:02:43.009904 systemd[1]: var-lib-kubelet-pods-012a187d\x2d90e1\x2d44f7\x2da642\x2df3d8eed099d6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dfgxxd.mount: Deactivated successfully. 
Dec 16 13:02:43.009985 systemd[1]: var-lib-kubelet-pods-012a187d\x2d90e1\x2d44f7\x2da642\x2df3d8eed099d6-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 13:02:43.109424 kubelet[2758]: I1216 13:02:43.108951 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e5c12796-2fe4-496b-bdb1-f696ef061ec9-whisker-backend-key-pair\") pod \"whisker-57f7c6c79b-6gqd8\" (UID: \"e5c12796-2fe4-496b-bdb1-f696ef061ec9\") " pod="calico-system/whisker-57f7c6c79b-6gqd8" Dec 16 13:02:43.109424 kubelet[2758]: I1216 13:02:43.109006 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gsnv\" (UniqueName: \"kubernetes.io/projected/e5c12796-2fe4-496b-bdb1-f696ef061ec9-kube-api-access-8gsnv\") pod \"whisker-57f7c6c79b-6gqd8\" (UID: \"e5c12796-2fe4-496b-bdb1-f696ef061ec9\") " pod="calico-system/whisker-57f7c6c79b-6gqd8" Dec 16 13:02:43.109424 kubelet[2758]: I1216 13:02:43.109025 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5c12796-2fe4-496b-bdb1-f696ef061ec9-whisker-ca-bundle\") pod \"whisker-57f7c6c79b-6gqd8\" (UID: \"e5c12796-2fe4-496b-bdb1-f696ef061ec9\") " pod="calico-system/whisker-57f7c6c79b-6gqd8" Dec 16 13:02:43.304376 containerd[1550]: time="2025-12-16T13:02:43.304318610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57f7c6c79b-6gqd8,Uid:e5c12796-2fe4-496b-bdb1-f696ef061ec9,Namespace:calico-system,Attempt:0,}" Dec 16 13:02:43.657989 systemd-networkd[1467]: cali666e0c1da5b: Link UP Dec 16 13:02:43.658506 systemd-networkd[1467]: cali666e0c1da5b: Gained carrier Dec 16 13:02:43.681577 containerd[1550]: 2025-12-16 13:02:43.354 [INFO][3896] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 
13:02:43.681577 containerd[1550]: 2025-12-16 13:02:43.423 [INFO][3896] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0 whisker-57f7c6c79b- calico-system e5c12796-2fe4-496b-bdb1-f696ef061ec9 895 0 2025-12-16 13:02:42 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:57f7c6c79b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 whisker-57f7c6c79b-6gqd8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali666e0c1da5b [] [] }} ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-" Dec 16 13:02:43.681577 containerd[1550]: 2025-12-16 13:02:43.423 [INFO][3896] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.681577 containerd[1550]: 2025-12-16 13:02:43.598 [INFO][3905] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" HandleID="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Workload="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.600 [INFO][3905] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" HandleID="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" 
Workload="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318430), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-07f930e259", "pod":"whisker-57f7c6c79b-6gqd8", "timestamp":"2025-12-16 13:02:43.598201241 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.600 [INFO][3905] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.601 [INFO][3905] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.601 [INFO][3905] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.616 [INFO][3905] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.625 [INFO][3905] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.630 [INFO][3905] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.631 [INFO][3905] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683422 containerd[1550]: 2025-12-16 13:02:43.633 [INFO][3905] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683809 
containerd[1550]: 2025-12-16 13:02:43.633 [INFO][3905] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683809 containerd[1550]: 2025-12-16 13:02:43.635 [INFO][3905] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb Dec 16 13:02:43.683809 containerd[1550]: 2025-12-16 13:02:43.639 [INFO][3905] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683809 containerd[1550]: 2025-12-16 13:02:43.644 [INFO][3905] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.59.65/26] block=192.168.59.64/26 handle="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683809 containerd[1550]: 2025-12-16 13:02:43.644 [INFO][3905] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.65/26] handle="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:43.683809 containerd[1550]: 2025-12-16 13:02:43.644 [INFO][3905] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:02:43.683809 containerd[1550]: 2025-12-16 13:02:43.644 [INFO][3905] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.65/26] IPv6=[] ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" HandleID="k8s-pod-network.0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Workload="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.684114 containerd[1550]: 2025-12-16 13:02:43.646 [INFO][3896] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0", GenerateName:"whisker-57f7c6c79b-", Namespace:"calico-system", SelfLink:"", UID:"e5c12796-2fe4-496b-bdb1-f696ef061ec9", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"57f7c6c79b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"whisker-57f7c6c79b-6gqd8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.59.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali666e0c1da5b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:43.684114 containerd[1550]: 2025-12-16 13:02:43.646 [INFO][3896] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.65/32] ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.684229 containerd[1550]: 2025-12-16 13:02:43.646 [INFO][3896] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali666e0c1da5b ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.684229 containerd[1550]: 2025-12-16 13:02:43.658 [INFO][3896] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.684297 containerd[1550]: 2025-12-16 13:02:43.659 [INFO][3896] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0", GenerateName:"whisker-57f7c6c79b-", Namespace:"calico-system", SelfLink:"", 
UID:"e5c12796-2fe4-496b-bdb1-f696ef061ec9", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"57f7c6c79b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb", Pod:"whisker-57f7c6c79b-6gqd8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.59.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali666e0c1da5b", MAC:"8a:09:eb:74:f5:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:43.684382 containerd[1550]: 2025-12-16 13:02:43.673 [INFO][3896] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" Namespace="calico-system" Pod="whisker-57f7c6c79b-6gqd8" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-whisker--57f7c6c79b--6gqd8-eth0" Dec 16 13:02:43.698085 kubelet[2758]: I1216 13:02:43.698049 2758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="012a187d-90e1-44f7-a642-f3d8eed099d6" path="/var/lib/kubelet/pods/012a187d-90e1-44f7-a642-f3d8eed099d6/volumes" Dec 16 13:02:43.864004 containerd[1550]: time="2025-12-16T13:02:43.863951290Z" level=info msg="connecting to shim 
0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb" address="unix:///run/containerd/s/e1bb676ec8a3864c665edc2572fc99b091f7549a91a9f91bfed24679d6d844db" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:43.878789 kubelet[2758]: I1216 13:02:43.878534 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:02:43.893010 systemd[1]: Started cri-containerd-0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb.scope - libcontainer container 0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb. Dec 16 13:02:43.943069 containerd[1550]: time="2025-12-16T13:02:43.942977340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57f7c6c79b-6gqd8,Uid:e5c12796-2fe4-496b-bdb1-f696ef061ec9,Namespace:calico-system,Attempt:0,} returns sandbox id \"0f3bc497c8ce7aefed93e52544d7595842e220c680133ef2d2fa1fd32a85bbbb\"" Dec 16 13:02:43.950269 containerd[1550]: time="2025-12-16T13:02:43.950235864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:02:44.417013 containerd[1550]: time="2025-12-16T13:02:44.416959194Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:44.418219 containerd[1550]: time="2025-12-16T13:02:44.418183415Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:02:44.418286 containerd[1550]: time="2025-12-16T13:02:44.418264859Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:02:44.418531 kubelet[2758]: E1216 13:02:44.418432 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:02:44.419832 kubelet[2758]: E1216 13:02:44.419081 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:02:44.420056 kubelet[2758]: E1216 13:02:44.420003 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:44.423058 containerd[1550]: time="2025-12-16T13:02:44.423014415Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:02:44.566726 systemd-networkd[1467]: vxlan.calico: Link UP Dec 16 13:02:44.566737 systemd-networkd[1467]: vxlan.calico: Gained carrier Dec 16 13:02:44.839023 containerd[1550]: time="2025-12-16T13:02:44.838914910Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:44.839880 containerd[1550]: time="2025-12-16T13:02:44.839847894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" 
Dec 16 13:02:44.839983 containerd[1550]: time="2025-12-16T13:02:44.839917544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:02:44.840461 kubelet[2758]: E1216 13:02:44.840050 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:02:44.840461 kubelet[2758]: E1216 13:02:44.840088 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:02:44.840461 kubelet[2758]: E1216 13:02:44.840149 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:44.840867 kubelet[2758]: E1216 13:02:44.840188 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:02:44.881523 kubelet[2758]: E1216 13:02:44.881471 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:02:44.978072 systemd-networkd[1467]: cali666e0c1da5b: Gained IPv6LL Dec 16 13:02:45.698784 containerd[1550]: time="2025-12-16T13:02:45.698385071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-lbczh,Uid:f77669b7-3f9d-436f-969d-f70eca9611c4,Namespace:kube-system,Attempt:0,}" Dec 16 13:02:45.702326 containerd[1550]: time="2025-12-16T13:02:45.702238042Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-w5zs6,Uid:fc7f9104-6597-40ee-97eb-e334f4aba79b,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:02:45.888171 kubelet[2758]: E1216 13:02:45.887848 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:02:45.932805 systemd-networkd[1467]: cali0c6fb565b32: Link UP Dec 16 13:02:45.933890 systemd-networkd[1467]: cali0c6fb565b32: Gained carrier Dec 16 13:02:45.939302 systemd-networkd[1467]: vxlan.calico: Gained IPv6LL Dec 16 13:02:45.965758 containerd[1550]: 2025-12-16 13:02:45.797 [INFO][4163] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0 calico-apiserver-57d769ccb6- calico-apiserver fc7f9104-6597-40ee-97eb-e334f4aba79b 815 0 2025-12-16 13:02:17 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57d769ccb6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 calico-apiserver-57d769ccb6-w5zs6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0c6fb565b32 [] [] }} ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-" Dec 16 13:02:45.965758 containerd[1550]: 2025-12-16 13:02:45.798 [INFO][4163] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:45.965758 containerd[1550]: 2025-12-16 13:02:45.860 [INFO][4180] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" HandleID="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.861 [INFO][4180] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" HandleID="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5670), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-4-07f930e259", "pod":"calico-apiserver-57d769ccb6-w5zs6", "timestamp":"2025-12-16 13:02:45.860977475 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.861 [INFO][4180] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.861 [INFO][4180] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.861 [INFO][4180] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.870 [INFO][4180] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.874 [INFO][4180] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.878 [INFO][4180] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.881 [INFO][4180] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.968514 containerd[1550]: 2025-12-16 13:02:45.886 [INFO][4180] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.886 [INFO][4180] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.890 [INFO][4180] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212 Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.904 [INFO][4180] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.914 [INFO][4180] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.59.66/26] block=192.168.59.64/26 handle="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.914 [INFO][4180] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.66/26] handle="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.914 [INFO][4180] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:02:45.982251 containerd[1550]: 2025-12-16 13:02:45.915 [INFO][4180] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.66/26] IPv6=[] ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" HandleID="k8s-pod-network.6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:45.986477 containerd[1550]: 2025-12-16 13:02:45.922 [INFO][4163] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0", GenerateName:"calico-apiserver-57d769ccb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"fc7f9104-6597-40ee-97eb-e334f4aba79b", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57d769ccb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"calico-apiserver-57d769ccb6-w5zs6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.59.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0c6fb565b32", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:45.986537 containerd[1550]: 2025-12-16 13:02:45.924 [INFO][4163] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.66/32] ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:45.986537 containerd[1550]: 2025-12-16 13:02:45.924 [INFO][4163] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c6fb565b32 ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:45.986537 containerd[1550]: 2025-12-16 13:02:45.934 [INFO][4163] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:45.986713 containerd[1550]: 2025-12-16 13:02:45.935 [INFO][4163] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0", GenerateName:"calico-apiserver-57d769ccb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"fc7f9104-6597-40ee-97eb-e334f4aba79b", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57d769ccb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212", Pod:"calico-apiserver-57d769ccb6-w5zs6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0c6fb565b32", MAC:"6a:30:a8:64:30:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:45.986783 containerd[1550]: 2025-12-16 13:02:45.954 [INFO][4163] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-w5zs6" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--w5zs6-eth0" Dec 16 13:02:46.012427 containerd[1550]: time="2025-12-16T13:02:46.012368988Z" 
level=info msg="connecting to shim 6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212" address="unix:///run/containerd/s/28aa5fb5b053adbc218e7cb952c7c48a5435352935629508e0bbfcd5778f4cef" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:46.018898 systemd-networkd[1467]: calid6b8efcb604: Link UP Dec 16 13:02:46.023707 systemd-networkd[1467]: calid6b8efcb604: Gained carrier Dec 16 13:02:46.048193 containerd[1550]: 2025-12-16 13:02:45.813 [INFO][4157] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0 coredns-66bc5c9577- kube-system f77669b7-3f9d-436f-969d-f70eca9611c4 810 0 2025-12-16 13:02:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 coredns-66bc5c9577-lbczh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid6b8efcb604 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-" Dec 16 13:02:46.048193 containerd[1550]: 2025-12-16 13:02:45.814 [INFO][4157] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.048193 containerd[1550]: 2025-12-16 13:02:45.870 [INFO][4186] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" 
HandleID="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Workload="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.870 [INFO][4186] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" HandleID="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Workload="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ac7b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-4-07f930e259", "pod":"coredns-66bc5c9577-lbczh", "timestamp":"2025-12-16 13:02:45.870190899 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.870 [INFO][4186] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.915 [INFO][4186] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.915 [INFO][4186] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.973 [INFO][4186] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.986 [INFO][4186] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.992 [INFO][4186] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.993 [INFO][4186] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.048894 containerd[1550]: 2025-12-16 13:02:45.996 [INFO][4186] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:45.996 [INFO][4186] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:45.997 [INFO][4186] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484 Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:46.006 [INFO][4186] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:46.012 [INFO][4186] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.59.67/26] block=192.168.59.64/26 handle="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:46.012 [INFO][4186] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.67/26] handle="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:46.012 [INFO][4186] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:02:46.049087 containerd[1550]: 2025-12-16 13:02:46.012 [INFO][4186] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.67/26] IPv6=[] ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" HandleID="k8s-pod-network.8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Workload="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.049249 containerd[1550]: 2025-12-16 13:02:46.015 [INFO][4157] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f77669b7-3f9d-436f-969d-f70eca9611c4", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"coredns-66bc5c9577-lbczh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid6b8efcb604", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:46.049249 containerd[1550]: 2025-12-16 13:02:46.015 [INFO][4157] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.67/32] ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.049249 containerd[1550]: 2025-12-16 13:02:46.015 [INFO][4157] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6b8efcb604 
ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.049249 containerd[1550]: 2025-12-16 13:02:46.026 [INFO][4157] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.049249 containerd[1550]: 2025-12-16 13:02:46.028 [INFO][4157] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f77669b7-3f9d-436f-969d-f70eca9611c4", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", 
ContainerID:"8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484", Pod:"coredns-66bc5c9577-lbczh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid6b8efcb604", MAC:"f6:df:b7:37:c5:5b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:46.049409 containerd[1550]: 2025-12-16 13:02:46.041 [INFO][4157] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" Namespace="kube-system" Pod="coredns-66bc5c9577-lbczh" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--lbczh-eth0" Dec 16 13:02:46.058929 systemd[1]: Started cri-containerd-6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212.scope - libcontainer container 6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212. 
Dec 16 13:02:46.082644 containerd[1550]: time="2025-12-16T13:02:46.082264056Z" level=info msg="connecting to shim 8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484" address="unix:///run/containerd/s/36bc5a4367b5374f818ade167d810dc0b19aa41141408d56348a363e7f0bda34" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:46.112931 systemd[1]: Started cri-containerd-8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484.scope - libcontainer container 8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484. Dec 16 13:02:46.149677 containerd[1550]: time="2025-12-16T13:02:46.149646365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-w5zs6,Uid:fc7f9104-6597-40ee-97eb-e334f4aba79b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6ce0bc7d94075c7f73b7995d2361fa3aafded08c73ebc98c8d5e2aed6941f212\"" Dec 16 13:02:46.152541 containerd[1550]: time="2025-12-16T13:02:46.152111196Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:02:46.168517 containerd[1550]: time="2025-12-16T13:02:46.168430900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-lbczh,Uid:f77669b7-3f9d-436f-969d-f70eca9611c4,Namespace:kube-system,Attempt:0,} returns sandbox id \"8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484\"" Dec 16 13:02:46.172315 containerd[1550]: time="2025-12-16T13:02:46.172288367Z" level=info msg="CreateContainer within sandbox \"8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:02:46.185050 containerd[1550]: time="2025-12-16T13:02:46.185013998Z" level=info msg="Container 0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:02:46.191073 containerd[1550]: time="2025-12-16T13:02:46.191045341Z" level=info msg="CreateContainer within sandbox 
\"8130b99173b45dd769bee0d83d3b52dfd9e3d97a13e32a93dea047d3c1072484\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af\"" Dec 16 13:02:46.192060 containerd[1550]: time="2025-12-16T13:02:46.191604391Z" level=info msg="StartContainer for \"0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af\"" Dec 16 13:02:46.192255 containerd[1550]: time="2025-12-16T13:02:46.192229145Z" level=info msg="connecting to shim 0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af" address="unix:///run/containerd/s/36bc5a4367b5374f818ade167d810dc0b19aa41141408d56348a363e7f0bda34" protocol=ttrpc version=3 Dec 16 13:02:46.207957 systemd[1]: Started cri-containerd-0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af.scope - libcontainer container 0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af. Dec 16 13:02:46.235488 containerd[1550]: time="2025-12-16T13:02:46.235295875Z" level=info msg="StartContainer for \"0a0febaa2e55d4ae825fa987293664d4cb4cf8d14681c0a9083932ba6af896af\" returns successfully" Dec 16 13:02:46.626946 containerd[1550]: time="2025-12-16T13:02:46.626766667Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:46.629908 containerd[1550]: time="2025-12-16T13:02:46.629574213Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:02:46.629908 containerd[1550]: time="2025-12-16T13:02:46.629689730Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:02:46.631953 kubelet[2758]: E1216 13:02:46.630286 2758 log.go:32] "PullImage from image service failed" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:02:46.631953 kubelet[2758]: E1216 13:02:46.630339 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:02:46.631953 kubelet[2758]: E1216 13:02:46.630435 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-w5zs6_calico-apiserver(fc7f9104-6597-40ee-97eb-e334f4aba79b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:46.631953 kubelet[2758]: E1216 13:02:46.630479 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:02:46.643706 kubelet[2758]: I1216 13:02:46.641897 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 13:02:46.701049 containerd[1550]: time="2025-12-16T13:02:46.701002543Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f8d9d9fb6-xwnn4,Uid:71bd25bd-53fb-4224-a114-8010f8dec502,Namespace:calico-system,Attempt:0,}" Dec 16 13:02:46.722918 containerd[1550]: time="2025-12-16T13:02:46.702099945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vhvvf,Uid:436357c5-9bd9-4f17-88f8-cc94151199a4,Namespace:kube-system,Attempt:0,}" Dec 16 13:02:46.908451 kubelet[2758]: E1216 13:02:46.908350 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:02:46.962034 systemd-networkd[1467]: cali0c6fb565b32: Gained IPv6LL Dec 16 13:02:47.000207 kubelet[2758]: I1216 13:02:46.999744 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-lbczh" podStartSLOduration=39.994461366 podStartE2EDuration="39.994461366s" podCreationTimestamp="2025-12-16 13:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:02:46.993966365 +0000 UTC m=+45.423726072" watchObservedRunningTime="2025-12-16 13:02:46.994461366 +0000 UTC m=+45.424221102" Dec 16 13:02:47.014513 systemd-networkd[1467]: calic6423da84f2: Link UP Dec 16 13:02:47.015697 systemd-networkd[1467]: calic6423da84f2: Gained carrier Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.835 [INFO][4339] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0 calico-kube-controllers-5f8d9d9fb6- calico-system 71bd25bd-53fb-4224-a114-8010f8dec502 819 0 2025-12-16 13:02:23 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5f8d9d9fb6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 calico-kube-controllers-5f8d9d9fb6-xwnn4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic6423da84f2 [] [] }} ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.836 [INFO][4339] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.912 [INFO][4393] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" HandleID="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.913 [INFO][4393] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" 
HandleID="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fa10), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-07f930e259", "pod":"calico-kube-controllers-5f8d9d9fb6-xwnn4", "timestamp":"2025-12-16 13:02:46.912981469 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.913 [INFO][4393] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.913 [INFO][4393] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.913 [INFO][4393] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.925 [INFO][4393] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.933 [INFO][4393] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.939 [INFO][4393] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.941 [INFO][4393] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.946 [INFO][4393] ipam/ipam.go 235: 
Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.946 [INFO][4393] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.954 [INFO][4393] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628 Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.959 [INFO][4393] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.990 [INFO][4393] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.59.68/26] block=192.168.59.64/26 handle="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.990 [INFO][4393] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.68/26] handle="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.990 [INFO][4393] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:02:47.034959 containerd[1550]: 2025-12-16 13:02:46.994 [INFO][4393] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.68/26] IPv6=[] ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" HandleID="k8s-pod-network.2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.036520 containerd[1550]: 2025-12-16 13:02:46.996 [INFO][4339] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0", GenerateName:"calico-kube-controllers-5f8d9d9fb6-", Namespace:"calico-system", SelfLink:"", UID:"71bd25bd-53fb-4224-a114-8010f8dec502", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f8d9d9fb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"calico-kube-controllers-5f8d9d9fb6-xwnn4", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.59.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic6423da84f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:47.036520 containerd[1550]: 2025-12-16 13:02:46.996 [INFO][4339] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.68/32] ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.036520 containerd[1550]: 2025-12-16 13:02:46.996 [INFO][4339] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic6423da84f2 ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.036520 containerd[1550]: 2025-12-16 13:02:47.016 [INFO][4339] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.036520 containerd[1550]: 2025-12-16 13:02:47.018 [INFO][4339] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" 
WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0", GenerateName:"calico-kube-controllers-5f8d9d9fb6-", Namespace:"calico-system", SelfLink:"", UID:"71bd25bd-53fb-4224-a114-8010f8dec502", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5f8d9d9fb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628", Pod:"calico-kube-controllers-5f8d9d9fb6-xwnn4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.59.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic6423da84f2", MAC:"a6:c3:87:dc:5b:23", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:47.036520 containerd[1550]: 2025-12-16 13:02:47.029 [INFO][4339] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" Namespace="calico-system" 
Pod="calico-kube-controllers-5f8d9d9fb6-xwnn4" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--kube--controllers--5f8d9d9fb6--xwnn4-eth0" Dec 16 13:02:47.076841 containerd[1550]: time="2025-12-16T13:02:47.076638905Z" level=info msg="connecting to shim 2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628" address="unix:///run/containerd/s/f0f9b3597b2f9f9e0b5dd913e93c2b84d854ab830ee815e0eba84089d0b80d36" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:47.091304 systemd-networkd[1467]: caliee96d8020b5: Link UP Dec 16 13:02:47.093367 systemd-networkd[1467]: caliee96d8020b5: Gained carrier Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.835 [INFO][4357] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0 coredns-66bc5c9577- kube-system 436357c5-9bd9-4f17-88f8-cc94151199a4 818 0 2025-12-16 13:02:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 coredns-66bc5c9577-vhvvf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliee96d8020b5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.835 [INFO][4357] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 
13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.927 [INFO][4391] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" HandleID="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Workload="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.927 [INFO][4391] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" HandleID="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Workload="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c55a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-2-4-07f930e259", "pod":"coredns-66bc5c9577-vhvvf", "timestamp":"2025-12-16 13:02:46.927217548 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.927 [INFO][4391] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.990 [INFO][4391] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:46.991 [INFO][4391] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.026 [INFO][4391] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.037 [INFO][4391] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.045 [INFO][4391] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.048 [INFO][4391] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.055 [INFO][4391] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.056 [INFO][4391] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.059 [INFO][4391] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.069 [INFO][4391] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.079 [INFO][4391] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.59.69/26] block=192.168.59.64/26 handle="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.079 [INFO][4391] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.69/26] handle="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.079 [INFO][4391] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:02:47.114007 containerd[1550]: 2025-12-16 13:02:47.079 [INFO][4391] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.69/26] IPv6=[] ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" HandleID="k8s-pod-network.36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Workload="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 13:02:47.117184 containerd[1550]: 2025-12-16 13:02:47.088 [INFO][4357] cni-plugin/k8s.go 418: Populated endpoint ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"436357c5-9bd9-4f17-88f8-cc94151199a4", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"coredns-66bc5c9577-vhvvf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliee96d8020b5", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:47.117184 containerd[1550]: 2025-12-16 13:02:47.088 [INFO][4357] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.69/32] ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 13:02:47.117184 containerd[1550]: 2025-12-16 13:02:47.088 [INFO][4357] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliee96d8020b5 
ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 13:02:47.117184 containerd[1550]: 2025-12-16 13:02:47.093 [INFO][4357] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 13:02:47.117184 containerd[1550]: 2025-12-16 13:02:47.093 [INFO][4357] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"436357c5-9bd9-4f17-88f8-cc94151199a4", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", 
ContainerID:"36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba", Pod:"coredns-66bc5c9577-vhvvf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.59.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliee96d8020b5", MAC:"ea:94:c8:51:bd:7d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:47.117340 containerd[1550]: 2025-12-16 13:02:47.108 [INFO][4357] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" Namespace="kube-system" Pod="coredns-66bc5c9577-vhvvf" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-coredns--66bc5c9577--vhvvf-eth0" Dec 16 13:02:47.117933 systemd[1]: Started cri-containerd-2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628.scope - libcontainer container 2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628. 
Dec 16 13:02:47.138887 containerd[1550]: time="2025-12-16T13:02:47.138839662Z" level=info msg="connecting to shim 36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba" address="unix:///run/containerd/s/350458031198a7434302d8d86a0a6bd2a2d6fa73122f1e007f4e62286fa566ad" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:47.162072 systemd[1]: Started cri-containerd-36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba.scope - libcontainer container 36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba. Dec 16 13:02:47.188225 containerd[1550]: time="2025-12-16T13:02:47.188192791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5f8d9d9fb6-xwnn4,Uid:71bd25bd-53fb-4224-a114-8010f8dec502,Namespace:calico-system,Attempt:0,} returns sandbox id \"2f1ac68017210382a9b52f754f527152f86640c2cf7b95bfba3b686b41e4d628\"" Dec 16 13:02:47.190901 containerd[1550]: time="2025-12-16T13:02:47.190864581Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:02:47.214937 containerd[1550]: time="2025-12-16T13:02:47.214900196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-vhvvf,Uid:436357c5-9bd9-4f17-88f8-cc94151199a4,Namespace:kube-system,Attempt:0,} returns sandbox id \"36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba\"" Dec 16 13:02:47.218289 containerd[1550]: time="2025-12-16T13:02:47.218249188Z" level=info msg="CreateContainer within sandbox \"36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 13:02:47.224304 containerd[1550]: time="2025-12-16T13:02:47.224272494Z" level=info msg="Container 7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f: CDI devices from CRI Config.CDIDevices: []" Dec 16 13:02:47.229211 containerd[1550]: time="2025-12-16T13:02:47.229175276Z" level=info msg="CreateContainer within sandbox 
\"36a38b8ce6a620e608e982577786940112c7fd705d9ad1995323078cf0e04bba\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f\"" Dec 16 13:02:47.229835 containerd[1550]: time="2025-12-16T13:02:47.229716322Z" level=info msg="StartContainer for \"7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f\"" Dec 16 13:02:47.230776 containerd[1550]: time="2025-12-16T13:02:47.230426747Z" level=info msg="connecting to shim 7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f" address="unix:///run/containerd/s/350458031198a7434302d8d86a0a6bd2a2d6fa73122f1e007f4e62286fa566ad" protocol=ttrpc version=3 Dec 16 13:02:47.245950 systemd[1]: Started cri-containerd-7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f.scope - libcontainer container 7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f. Dec 16 13:02:47.273293 containerd[1550]: time="2025-12-16T13:02:47.273256782Z" level=info msg="StartContainer for \"7b33f31523db92c9a88f34319aeba8906b765096a1833892eea510eb72aae57f\" returns successfully" Dec 16 13:02:47.657660 containerd[1550]: time="2025-12-16T13:02:47.657590562Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:47.659212 containerd[1550]: time="2025-12-16T13:02:47.659000841Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:02:47.659212 containerd[1550]: time="2025-12-16T13:02:47.659056055Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:02:47.659582 kubelet[2758]: E1216 13:02:47.659469 2758 log.go:32] "PullImage from image 
service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:02:47.659582 kubelet[2758]: E1216 13:02:47.659540 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:02:47.659777 kubelet[2758]: E1216 13:02:47.659661 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5f8d9d9fb6-xwnn4_calico-system(71bd25bd-53fb-4224-a114-8010f8dec502): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:47.659777 kubelet[2758]: E1216 13:02:47.659708 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:02:47.698492 containerd[1550]: time="2025-12-16T13:02:47.698236693Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ffzww,Uid:e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb,Namespace:calico-system,Attempt:0,}" Dec 16 13:02:47.699976 containerd[1550]: time="2025-12-16T13:02:47.699802696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-5hqtj,Uid:e556d54d-017a-4277-b8ea-32b10093992b,Namespace:calico-apiserver,Attempt:0,}" Dec 16 13:02:47.730773 systemd-networkd[1467]: calid6b8efcb604: Gained IPv6LL Dec 16 13:02:47.889234 systemd-networkd[1467]: calif57ddfbdaff: Link UP Dec 16 13:02:47.891254 systemd-networkd[1467]: calif57ddfbdaff: Gained carrier Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.802 [INFO][4577] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0 csi-node-driver- calico-system e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb 760 0 2025-12-16 13:02:22 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 csi-node-driver-ffzww eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif57ddfbdaff [] [] }} ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.802 [INFO][4577] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" 
WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.844 [INFO][4607] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" HandleID="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Workload="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.844 [INFO][4607] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" HandleID="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Workload="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d55a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-07f930e259", "pod":"csi-node-driver-ffzww", "timestamp":"2025-12-16 13:02:47.844175449 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.844 [INFO][4607] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.844 [INFO][4607] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.844 [INFO][4607] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.855 [INFO][4607] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.859 [INFO][4607] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.864 [INFO][4607] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.867 [INFO][4607] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.869 [INFO][4607] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.869 [INFO][4607] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.871 [INFO][4607] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.875 [INFO][4607] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.880 [INFO][4607] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.59.70/26] block=192.168.59.64/26 handle="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.880 [INFO][4607] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.70/26] handle="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.880 [INFO][4607] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:02:47.908900 containerd[1550]: 2025-12-16 13:02:47.880 [INFO][4607] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.70/26] IPv6=[] ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" HandleID="k8s-pod-network.3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Workload="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.909568 containerd[1550]: 2025-12-16 13:02:47.885 [INFO][4577] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"csi-node-driver-ffzww", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.59.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif57ddfbdaff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:47.909568 containerd[1550]: 2025-12-16 13:02:47.887 [INFO][4577] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.70/32] ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.909568 containerd[1550]: 2025-12-16 13:02:47.887 [INFO][4577] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif57ddfbdaff ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.909568 containerd[1550]: 2025-12-16 13:02:47.892 [INFO][4577] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.909568 
containerd[1550]: 2025-12-16 13:02:47.893 [INFO][4577] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a", Pod:"csi-node-driver-ffzww", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.59.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif57ddfbdaff", MAC:"fa:b6:63:c2:cb:95", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:47.909568 containerd[1550]: 
2025-12-16 13:02:47.903 [INFO][4577] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" Namespace="calico-system" Pod="csi-node-driver-ffzww" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-csi--node--driver--ffzww-eth0" Dec 16 13:02:47.934484 kubelet[2758]: E1216 13:02:47.934431 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:02:47.935747 kubelet[2758]: E1216 13:02:47.935570 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:02:47.942531 containerd[1550]: time="2025-12-16T13:02:47.942396008Z" level=info msg="connecting to shim 3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a" address="unix:///run/containerd/s/35c055c76d2b74542efa8704a22eaf7f636e2564abef9df5c425dabaf4abb065" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:47.979028 systemd[1]: Started 
cri-containerd-3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a.scope - libcontainer container 3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a. Dec 16 13:02:47.979898 kubelet[2758]: I1216 13:02:47.979402 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-vhvvf" podStartSLOduration=40.979385379 podStartE2EDuration="40.979385379s" podCreationTimestamp="2025-12-16 13:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 13:02:47.958788644 +0000 UTC m=+46.388548360" watchObservedRunningTime="2025-12-16 13:02:47.979385379 +0000 UTC m=+46.409145085" Dec 16 13:02:48.025445 systemd-networkd[1467]: calicb29e45ac69: Link UP Dec 16 13:02:48.027477 systemd-networkd[1467]: calicb29e45ac69: Gained carrier Dec 16 13:02:48.030063 containerd[1550]: time="2025-12-16T13:02:48.029978031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ffzww,Uid:e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb,Namespace:calico-system,Attempt:0,} returns sandbox id \"3676f789ba171d0b3050c01008dcd201f8a9d108e9aa2fed7d19f2c0e6d7926a\"" Dec 16 13:02:48.033543 containerd[1550]: time="2025-12-16T13:02:48.033526797Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.798 [INFO][4584] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0 calico-apiserver-57d769ccb6- calico-apiserver e556d54d-017a-4277-b8ea-32b10093992b 821 0 2025-12-16 13:02:17 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57d769ccb6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} 
{k8s ci-4459-2-2-4-07f930e259 calico-apiserver-57d769ccb6-5hqtj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicb29e45ac69 [] [] }} ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.799 [INFO][4584] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.848 [INFO][4601] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" HandleID="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.848 [INFO][4601] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" HandleID="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032a150), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-2-2-4-07f930e259", "pod":"calico-apiserver-57d769ccb6-5hqtj", "timestamp":"2025-12-16 13:02:47.848021716 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.849 [INFO][4601] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.880 [INFO][4601] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.880 [INFO][4601] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.964 [INFO][4601] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.975 [INFO][4601] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.984 [INFO][4601] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.987 [INFO][4601] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.989 [INFO][4601] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.989 [INFO][4601] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.991 [INFO][4601] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:47.996 [INFO][4601] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:48.010 [INFO][4601] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.59.71/26] block=192.168.59.64/26 handle="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:48.011 [INFO][4601] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.71/26] handle="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:48.011 [INFO][4601] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 13:02:48.047255 containerd[1550]: 2025-12-16 13:02:48.011 [INFO][4601] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.71/26] IPv6=[] ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" HandleID="k8s-pod-network.196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Workload="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.047969 containerd[1550]: 2025-12-16 13:02:48.017 [INFO][4584] cni-plugin/k8s.go 418: Populated endpoint ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0", GenerateName:"calico-apiserver-57d769ccb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"e556d54d-017a-4277-b8ea-32b10093992b", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57d769ccb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"calico-apiserver-57d769ccb6-5hqtj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.59.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicb29e45ac69", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:48.047969 containerd[1550]: 2025-12-16 13:02:48.017 [INFO][4584] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.71/32] ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.047969 containerd[1550]: 2025-12-16 13:02:48.017 [INFO][4584] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicb29e45ac69 ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.047969 containerd[1550]: 2025-12-16 13:02:48.033 [INFO][4584] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.047969 containerd[1550]: 2025-12-16 13:02:48.034 [INFO][4584] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0", GenerateName:"calico-apiserver-57d769ccb6-", Namespace:"calico-apiserver", SelfLink:"", UID:"e556d54d-017a-4277-b8ea-32b10093992b", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57d769ccb6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f", Pod:"calico-apiserver-57d769ccb6-5hqtj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.59.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicb29e45ac69", MAC:"42:fd:04:9d:80:3e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:48.047969 containerd[1550]: 2025-12-16 13:02:48.044 [INFO][4584] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" Namespace="calico-apiserver" Pod="calico-apiserver-57d769ccb6-5hqtj" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-calico--apiserver--57d769ccb6--5hqtj-eth0" Dec 16 13:02:48.077846 containerd[1550]: time="2025-12-16T13:02:48.077744930Z" 
level=info msg="connecting to shim 196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f" address="unix:///run/containerd/s/7f5e7f6e729de421373f44692729ff8ae57125db304a2ebc84bce6be0c809a30" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:48.114018 systemd[1]: Started cri-containerd-196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f.scope - libcontainer container 196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f. Dec 16 13:02:48.200918 containerd[1550]: time="2025-12-16T13:02:48.200624708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57d769ccb6-5hqtj,Uid:e556d54d-017a-4277-b8ea-32b10093992b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"196eaa19009679489be41e2cd025065d9e3da22d112676114e5215982604f48f\"" Dec 16 13:02:48.242603 systemd-networkd[1467]: caliee96d8020b5: Gained IPv6LL Dec 16 13:02:48.463286 containerd[1550]: time="2025-12-16T13:02:48.463037685Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:48.464167 containerd[1550]: time="2025-12-16T13:02:48.464124247Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:02:48.464242 containerd[1550]: time="2025-12-16T13:02:48.464218443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:02:48.464773 kubelet[2758]: E1216 13:02:48.464493 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 
13:02:48.464773 kubelet[2758]: E1216 13:02:48.464581 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:02:48.466006 kubelet[2758]: E1216 13:02:48.465954 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:48.467327 containerd[1550]: time="2025-12-16T13:02:48.467105397Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:02:48.562980 systemd-networkd[1467]: calic6423da84f2: Gained IPv6LL Dec 16 13:02:48.912155 containerd[1550]: time="2025-12-16T13:02:48.912019569Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:48.913502 containerd[1550]: time="2025-12-16T13:02:48.913384503Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:02:48.913502 containerd[1550]: time="2025-12-16T13:02:48.913477127Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:02:48.914345 kubelet[2758]: E1216 13:02:48.913708 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:02:48.914345 kubelet[2758]: E1216 13:02:48.913765 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:02:48.914345 kubelet[2758]: E1216 13:02:48.913957 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-5hqtj_calico-apiserver(e556d54d-017a-4277-b8ea-32b10093992b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:48.914345 kubelet[2758]: E1216 13:02:48.913990 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:02:48.914988 containerd[1550]: time="2025-12-16T13:02:48.914971915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:02:48.939136 kubelet[2758]: E1216 13:02:48.938746 2758 pod_workers.go:1324] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:02:48.941163 kubelet[2758]: E1216 13:02:48.941033 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:02:49.363592 containerd[1550]: time="2025-12-16T13:02:49.363467748Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:49.364503 containerd[1550]: time="2025-12-16T13:02:49.364446718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:02:49.364784 containerd[1550]: time="2025-12-16T13:02:49.364486412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
active requests=0, bytes read=93" Dec 16 13:02:49.364929 kubelet[2758]: E1216 13:02:49.364899 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:02:49.365037 kubelet[2758]: E1216 13:02:49.365014 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:02:49.365153 kubelet[2758]: E1216 13:02:49.365137 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:49.371841 kubelet[2758]: E1216 13:02:49.371023 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:02:49.587013 systemd-networkd[1467]: calicb29e45ac69: Gained IPv6LL Dec 16 13:02:49.651552 systemd-networkd[1467]: calif57ddfbdaff: Gained IPv6LL Dec 16 13:02:49.720377 containerd[1550]: time="2025-12-16T13:02:49.720019199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-2pnsr,Uid:c38f2008-1a82-45d6-8890-7ece4b855117,Namespace:calico-system,Attempt:0,}" Dec 16 13:02:49.856202 systemd-networkd[1467]: cali528132addd7: Link UP Dec 16 13:02:49.857218 systemd-networkd[1467]: cali528132addd7: Gained carrier Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.770 [INFO][4734] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0 goldmane-7c778bb748- calico-system c38f2008-1a82-45d6-8890-7ece4b855117 817 0 2025-12-16 13:02:20 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-2-4-07f930e259 goldmane-7c778bb748-2pnsr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali528132addd7 [] [] }} ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.770 [INFO][4734] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.803 [INFO][4746] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" HandleID="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Workload="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.803 [INFO][4746] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" HandleID="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Workload="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb6f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-2-4-07f930e259", "pod":"goldmane-7c778bb748-2pnsr", "timestamp":"2025-12-16 13:02:49.803336792 +0000 UTC"}, Hostname:"ci-4459-2-2-4-07f930e259", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.803 [INFO][4746] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.803 [INFO][4746] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.803 [INFO][4746] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-2-4-07f930e259' Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.811 [INFO][4746] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.815 [INFO][4746] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.821 [INFO][4746] ipam/ipam.go 511: Trying affinity for 192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.824 [INFO][4746] ipam/ipam.go 158: Attempting to load block cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.827 [INFO][4746] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.59.64/26 host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.827 [INFO][4746] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.59.64/26 handle="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.828 [INFO][4746] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7 Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.834 [INFO][4746] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.59.64/26 handle="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.848 [INFO][4746] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.59.72/26] block=192.168.59.64/26 handle="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.848 [INFO][4746] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.59.72/26] handle="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" host="ci-4459-2-2-4-07f930e259" Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.848 [INFO][4746] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 13:02:49.883385 containerd[1550]: 2025-12-16 13:02:49.848 [INFO][4746] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.59.72/26] IPv6=[] ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" HandleID="k8s-pod-network.481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Workload="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.883920 containerd[1550]: 2025-12-16 13:02:49.849 [INFO][4734] cni-plugin/k8s.go 418: Populated endpoint ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"c38f2008-1a82-45d6-8890-7ece4b855117", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"", Pod:"goldmane-7c778bb748-2pnsr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.59.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali528132addd7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:49.883920 containerd[1550]: 2025-12-16 13:02:49.850 [INFO][4734] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.59.72/32] ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.883920 containerd[1550]: 2025-12-16 13:02:49.851 [INFO][4734] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali528132addd7 ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.883920 containerd[1550]: 2025-12-16 13:02:49.858 [INFO][4734] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.883920 containerd[1550]: 2025-12-16 13:02:49.858 [INFO][4734] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"c38f2008-1a82-45d6-8890-7ece4b855117", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 13, 2, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-2-4-07f930e259", ContainerID:"481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7", Pod:"goldmane-7c778bb748-2pnsr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.59.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali528132addd7", MAC:"da:89:db:6b:b6:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 13:02:49.883920 containerd[1550]: 2025-12-16 13:02:49.880 [INFO][4734] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" Namespace="calico-system" Pod="goldmane-7c778bb748-2pnsr" WorkloadEndpoint="ci--4459--2--2--4--07f930e259-k8s-goldmane--7c778bb748--2pnsr-eth0" Dec 16 13:02:49.925451 containerd[1550]: time="2025-12-16T13:02:49.923774024Z" level=info msg="connecting to shim 481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7" address="unix:///run/containerd/s/89049d699e4a957c79f321148d09691a2bdd1ee6e916040f73eda78e287a4441" namespace=k8s.io protocol=ttrpc version=3 Dec 16 13:02:49.943424 kubelet[2758]: E1216 13:02:49.943390 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:02:49.944422 kubelet[2758]: E1216 13:02:49.944381 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:02:49.979889 systemd[1]: Started cri-containerd-481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7.scope - libcontainer container 481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7. Dec 16 13:02:50.186453 containerd[1550]: time="2025-12-16T13:02:50.186351870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-2pnsr,Uid:c38f2008-1a82-45d6-8890-7ece4b855117,Namespace:calico-system,Attempt:0,} returns sandbox id \"481b35e2f3eef921bd62cf3ed0791f46e7e541a015a2cc2aecc2daf46d1382e7\"" Dec 16 13:02:50.188830 containerd[1550]: time="2025-12-16T13:02:50.188773167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:02:50.641101 containerd[1550]: time="2025-12-16T13:02:50.640763467Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:50.643450 containerd[1550]: time="2025-12-16T13:02:50.643235140Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:02:50.643450 containerd[1550]: time="2025-12-16T13:02:50.643399137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:02:50.644908 kubelet[2758]: E1216 13:02:50.643864 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:02:50.644908 kubelet[2758]: E1216 13:02:50.643928 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:02:50.644908 kubelet[2758]: E1216 13:02:50.644318 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-2pnsr_calico-system(c38f2008-1a82-45d6-8890-7ece4b855117): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:50.644908 kubelet[2758]: E1216 13:02:50.644377 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:02:50.949954 kubelet[2758]: E1216 13:02:50.949673 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to 
resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:02:51.889957 systemd-networkd[1467]: cali528132addd7: Gained IPv6LL Dec 16 13:02:51.950497 kubelet[2758]: E1216 13:02:51.950311 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:02:58.696563 containerd[1550]: time="2025-12-16T13:02:58.696100595Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:02:59.164087 containerd[1550]: time="2025-12-16T13:02:59.163894126Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:59.165030 containerd[1550]: time="2025-12-16T13:02:59.164925693Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:02:59.165030 containerd[1550]: time="2025-12-16T13:02:59.164948245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:02:59.166461 kubelet[2758]: E1216 13:02:59.166415 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:02:59.167198 kubelet[2758]: E1216 13:02:59.166802 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:02:59.167198 kubelet[2758]: E1216 13:02:59.166908 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:59.168055 containerd[1550]: time="2025-12-16T13:02:59.167923689Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:02:59.588714 containerd[1550]: time="2025-12-16T13:02:59.588563140Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:02:59.590335 containerd[1550]: time="2025-12-16T13:02:59.590202377Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:02:59.590778 containerd[1550]: time="2025-12-16T13:02:59.590433479Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:02:59.591114 kubelet[2758]: E1216 13:02:59.590989 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:02:59.591114 kubelet[2758]: E1216 13:02:59.591067 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:02:59.591592 kubelet[2758]: E1216 13:02:59.591429 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:02:59.591976 kubelet[2758]: E1216 13:02:59.591938 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code 
= NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:03:00.699168 containerd[1550]: time="2025-12-16T13:03:00.699076358Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:03:01.173835 containerd[1550]: time="2025-12-16T13:03:01.173512055Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:01.174850 containerd[1550]: time="2025-12-16T13:03:01.174792798Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:03:01.174942 containerd[1550]: time="2025-12-16T13:03:01.174915779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:03:01.175173 kubelet[2758]: E1216 13:03:01.175132 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:01.175436 kubelet[2758]: E1216 13:03:01.175183 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:01.175436 kubelet[2758]: E1216 13:03:01.175358 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-w5zs6_calico-apiserver(fc7f9104-6597-40ee-97eb-e334f4aba79b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:01.175436 kubelet[2758]: E1216 13:03:01.175393 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:03:01.177242 containerd[1550]: time="2025-12-16T13:03:01.177207891Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:03:01.606207 containerd[1550]: time="2025-12-16T13:03:01.605996840Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:01.607196 containerd[1550]: time="2025-12-16T13:03:01.607107904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:03:01.607196 containerd[1550]: time="2025-12-16T13:03:01.607171053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 
16 13:03:01.607395 kubelet[2758]: E1216 13:03:01.607354 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:03:01.607437 kubelet[2758]: E1216 13:03:01.607420 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:03:01.607526 kubelet[2758]: E1216 13:03:01.607508 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:01.608504 containerd[1550]: time="2025-12-16T13:03:01.608484918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:03:02.047506 containerd[1550]: time="2025-12-16T13:03:02.047446884Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:02.048664 containerd[1550]: time="2025-12-16T13:03:02.048563709Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:03:02.048764 containerd[1550]: time="2025-12-16T13:03:02.048673515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:03:02.048962 kubelet[2758]: E1216 13:03:02.048865 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:03:02.048962 kubelet[2758]: E1216 13:03:02.048918 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:03:02.049660 kubelet[2758]: E1216 13:03:02.049018 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:02.049660 kubelet[2758]: E1216 13:03:02.049076 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:03:02.697310 containerd[1550]: time="2025-12-16T13:03:02.697255761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:03:03.146532 containerd[1550]: time="2025-12-16T13:03:03.146483903Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:03.147515 containerd[1550]: time="2025-12-16T13:03:03.147484941Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:03:03.147603 containerd[1550]: time="2025-12-16T13:03:03.147589747Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:03:03.147782 kubelet[2758]: E1216 13:03:03.147748 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:03.148181 kubelet[2758]: E1216 13:03:03.148012 2758 
kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:03.148181 kubelet[2758]: E1216 13:03:03.148113 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-5hqtj_calico-apiserver(e556d54d-017a-4277-b8ea-32b10093992b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:03.148181 kubelet[2758]: E1216 13:03:03.148157 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:03:03.698719 containerd[1550]: time="2025-12-16T13:03:03.698648585Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:03:04.129371 containerd[1550]: time="2025-12-16T13:03:04.129314230Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:04.130656 containerd[1550]: time="2025-12-16T13:03:04.130625440Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:03:04.130969 containerd[1550]: 
time="2025-12-16T13:03:04.130637262Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:03:04.131072 kubelet[2758]: E1216 13:03:04.131026 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:03:04.131072 kubelet[2758]: E1216 13:03:04.131066 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:03:04.132896 kubelet[2758]: E1216 13:03:04.131167 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5f8d9d9fb6-xwnn4_calico-system(71bd25bd-53fb-4224-a114-8010f8dec502): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:04.132896 kubelet[2758]: E1216 13:03:04.132853 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:03:06.695086 containerd[1550]: time="2025-12-16T13:03:06.695040227Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:03:07.143262 containerd[1550]: time="2025-12-16T13:03:07.143203694Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:07.144373 containerd[1550]: time="2025-12-16T13:03:07.144308377Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:03:07.144439 containerd[1550]: time="2025-12-16T13:03:07.144402303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:03:07.144658 kubelet[2758]: E1216 13:03:07.144609 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:03:07.145044 kubelet[2758]: E1216 13:03:07.144683 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:03:07.145044 kubelet[2758]: E1216 13:03:07.144756 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-2pnsr_calico-system(c38f2008-1a82-45d6-8890-7ece4b855117): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:07.145044 kubelet[2758]: E1216 13:03:07.144791 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:03:13.701067 kubelet[2758]: E1216 13:03:13.699971 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:03:14.697665 kubelet[2758]: E1216 13:03:14.696645 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:03:15.699518 kubelet[2758]: E1216 13:03:15.698992 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" 
podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:03:15.700030 kubelet[2758]: E1216 13:03:15.699190 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:03:18.696303 kubelet[2758]: E1216 13:03:18.696266 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:03:18.696885 kubelet[2758]: E1216 13:03:18.696855 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:03:25.697360 containerd[1550]: 
time="2025-12-16T13:03:25.696687136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:03:26.131781 containerd[1550]: time="2025-12-16T13:03:26.131730649Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:26.132911 containerd[1550]: time="2025-12-16T13:03:26.132869472Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:03:26.133804 containerd[1550]: time="2025-12-16T13:03:26.132946587Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:03:26.134272 kubelet[2758]: E1216 13:03:26.133154 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:03:26.134272 kubelet[2758]: E1216 13:03:26.133218 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:03:26.134272 kubelet[2758]: E1216 13:03:26.133304 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:26.135147 containerd[1550]: time="2025-12-16T13:03:26.135062256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:03:26.568187 containerd[1550]: time="2025-12-16T13:03:26.568068110Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:26.569021 containerd[1550]: time="2025-12-16T13:03:26.568978511Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:03:26.569078 containerd[1550]: time="2025-12-16T13:03:26.569048634Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:03:26.569443 kubelet[2758]: E1216 13:03:26.569371 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:03:26.569443 kubelet[2758]: E1216 13:03:26.569420 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 
13:03:26.569719 kubelet[2758]: E1216 13:03:26.569653 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:26.570026 kubelet[2758]: E1216 13:03:26.570001 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:03:27.696108 containerd[1550]: time="2025-12-16T13:03:27.696056578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:03:28.152290 containerd[1550]: time="2025-12-16T13:03:28.152236930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:28.153312 containerd[1550]: time="2025-12-16T13:03:28.153281835Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:03:28.153394 containerd[1550]: time="2025-12-16T13:03:28.153329495Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:03:28.153973 kubelet[2758]: E1216 13:03:28.153933 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:03:28.154523 kubelet[2758]: E1216 13:03:28.154299 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:03:28.154682 kubelet[2758]: E1216 13:03:28.154607 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:28.157207 containerd[1550]: time="2025-12-16T13:03:28.157013886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:03:28.625286 containerd[1550]: time="2025-12-16T13:03:28.625238783Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:28.626606 
containerd[1550]: time="2025-12-16T13:03:28.626561252Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:03:28.626659 containerd[1550]: time="2025-12-16T13:03:28.626626815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:03:28.626849 kubelet[2758]: E1216 13:03:28.626792 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:03:28.626915 kubelet[2758]: E1216 13:03:28.626854 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:03:28.626964 kubelet[2758]: E1216 13:03:28.626945 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" 
Dec 16 13:03:28.627802 kubelet[2758]: E1216 13:03:28.626984 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:03:29.696728 containerd[1550]: time="2025-12-16T13:03:29.696667475Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:03:30.122555 containerd[1550]: time="2025-12-16T13:03:30.122504755Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:30.123892 containerd[1550]: time="2025-12-16T13:03:30.123808378Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:03:30.124300 containerd[1550]: time="2025-12-16T13:03:30.123937241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:03:30.124333 kubelet[2758]: E1216 13:03:30.124133 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:30.124333 kubelet[2758]: E1216 13:03:30.124177 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:30.124333 kubelet[2758]: E1216 13:03:30.124243 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-w5zs6_calico-apiserver(fc7f9104-6597-40ee-97eb-e334f4aba79b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:30.124333 kubelet[2758]: E1216 13:03:30.124276 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:03:30.698083 containerd[1550]: time="2025-12-16T13:03:30.698003682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:03:31.134029 containerd[1550]: time="2025-12-16T13:03:31.133985812Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:31.135261 
containerd[1550]: time="2025-12-16T13:03:31.135190297Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:03:31.135438 containerd[1550]: time="2025-12-16T13:03:31.135310824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:03:31.135511 kubelet[2758]: E1216 13:03:31.135453 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:03:31.135511 kubelet[2758]: E1216 13:03:31.135492 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:03:31.136125 kubelet[2758]: E1216 13:03:31.135579 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5f8d9d9fb6-xwnn4_calico-system(71bd25bd-53fb-4224-a114-8010f8dec502): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:31.136125 kubelet[2758]: E1216 13:03:31.135610 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:03:31.697000 containerd[1550]: time="2025-12-16T13:03:31.696594906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:03:32.115356 containerd[1550]: time="2025-12-16T13:03:32.115225176Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:32.116676 containerd[1550]: time="2025-12-16T13:03:32.116625619Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:03:32.116736 containerd[1550]: time="2025-12-16T13:03:32.116711841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:03:32.116949 kubelet[2758]: E1216 13:03:32.116901 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:32.117006 
kubelet[2758]: E1216 13:03:32.116956 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:03:32.117081 kubelet[2758]: E1216 13:03:32.117047 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-5hqtj_calico-apiserver(e556d54d-017a-4277-b8ea-32b10093992b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:32.117115 kubelet[2758]: E1216 13:03:32.117096 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:03:32.696386 containerd[1550]: time="2025-12-16T13:03:32.696288674Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:03:33.178758 containerd[1550]: time="2025-12-16T13:03:33.178705977Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:03:33.179926 containerd[1550]: time="2025-12-16T13:03:33.179844086Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:03:33.179926 containerd[1550]: 
time="2025-12-16T13:03:33.179866327Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:03:33.180252 kubelet[2758]: E1216 13:03:33.180207 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:03:33.180480 kubelet[2758]: E1216 13:03:33.180258 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:03:33.180480 kubelet[2758]: E1216 13:03:33.180405 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-2pnsr_calico-system(c38f2008-1a82-45d6-8890-7ece4b855117): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:03:33.180665 kubelet[2758]: E1216 13:03:33.180629 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve 
reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:03:37.699652 kubelet[2758]: E1216 13:03:37.699531 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:03:39.697794 kubelet[2758]: E1216 13:03:39.697753 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:03:42.696261 kubelet[2758]: E1216 13:03:42.696214 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:03:44.695711 kubelet[2758]: E1216 13:03:44.695527 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:03:46.695994 kubelet[2758]: E1216 13:03:46.695758 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:03:48.696603 kubelet[2758]: E1216 13:03:48.696302 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:03:49.196072 systemd[1]: Started sshd@7-77.42.23.34:22-139.178.89.65:52412.service - OpenSSH per-connection server daemon (139.178.89.65:52412). Dec 16 13:03:50.216032 sshd[4910]: Accepted publickey for core from 139.178.89.65 port 52412 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:03:50.218636 sshd-session[4910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:03:50.229843 systemd-logind[1534]: New session 8 of user core. Dec 16 13:03:50.235147 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 13:03:52.121533 sshd[4913]: Connection closed by 139.178.89.65 port 52412 Dec 16 13:03:52.123004 sshd-session[4910]: pam_unix(sshd:session): session closed for user core Dec 16 13:03:52.132007 systemd[1]: sshd@7-77.42.23.34:22-139.178.89.65:52412.service: Deactivated successfully. Dec 16 13:03:52.138165 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 13:03:52.141048 systemd-logind[1534]: Session 8 logged out. Waiting for processes to exit. 
Dec 16 13:03:52.143689 systemd-logind[1534]: Removed session 8. Dec 16 13:03:52.696876 kubelet[2758]: E1216 13:03:52.696835 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:03:52.697251 kubelet[2758]: E1216 13:03:52.696925 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:03:54.700402 kubelet[2758]: E1216 13:03:54.700331 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:03:56.583793 systemd[1]: Started sshd@8-77.42.23.34:22-139.178.89.65:37196.service - OpenSSH per-connection server daemon (139.178.89.65:37196). Dec 16 13:03:57.586568 sshd[4927]: Accepted publickey for core from 139.178.89.65 port 37196 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:03:57.587980 sshd-session[4927]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:03:57.592857 systemd-logind[1534]: New session 9 of user core. Dec 16 13:03:57.599946 systemd[1]: Started session-9.scope - Session 9 of User core. 
Dec 16 13:03:57.697082 kubelet[2758]: E1216 13:03:57.697040 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:03:58.341674 sshd[4930]: Connection closed by 139.178.89.65 port 37196 Dec 16 13:03:58.342226 sshd-session[4927]: pam_unix(sshd:session): session closed for user core Dec 16 13:03:58.345999 systemd[1]: sshd@8-77.42.23.34:22-139.178.89.65:37196.service: Deactivated successfully. Dec 16 13:03:58.348286 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 13:03:58.349900 systemd-logind[1534]: Session 9 logged out. Waiting for processes to exit. Dec 16 13:03:58.351511 systemd-logind[1534]: Removed session 9. Dec 16 13:03:58.545014 systemd[1]: Started sshd@9-77.42.23.34:22-139.178.89.65:37200.service - OpenSSH per-connection server daemon (139.178.89.65:37200). Dec 16 13:03:59.632074 sshd[4943]: Accepted publickey for core from 139.178.89.65 port 37200 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:03:59.633271 sshd-session[4943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:03:59.640796 systemd-logind[1534]: New session 10 of user core. Dec 16 13:03:59.646047 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 16 13:03:59.695335 kubelet[2758]: E1216 13:03:59.695259 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:04:00.496862 sshd[4946]: Connection closed by 139.178.89.65 port 37200 Dec 16 13:04:00.498922 sshd-session[4943]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:00.505009 systemd[1]: sshd@9-77.42.23.34:22-139.178.89.65:37200.service: Deactivated successfully. Dec 16 13:04:00.509166 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 13:04:00.513608 systemd-logind[1534]: Session 10 logged out. Waiting for processes to exit. Dec 16 13:04:00.517139 systemd-logind[1534]: Removed session 10. Dec 16 13:04:00.643895 systemd[1]: Started sshd@10-77.42.23.34:22-139.178.89.65:55660.service - OpenSSH per-connection server daemon (139.178.89.65:55660). 
Dec 16 13:04:00.699035 kubelet[2758]: E1216 13:04:00.699004 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:04:01.646895 sshd[4956]: Accepted publickey for core from 139.178.89.65 port 55660 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:01.650629 sshd-session[4956]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:01.664883 systemd-logind[1534]: New session 11 of user core. Dec 16 13:04:01.667929 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 13:04:02.416737 sshd[4963]: Connection closed by 139.178.89.65 port 55660 Dec 16 13:04:02.420128 sshd-session[4956]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:02.426639 systemd[1]: sshd@10-77.42.23.34:22-139.178.89.65:55660.service: Deactivated successfully. Dec 16 13:04:02.429058 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 13:04:02.431257 systemd-logind[1534]: Session 11 logged out. Waiting for processes to exit. Dec 16 13:04:02.432525 systemd-logind[1534]: Removed session 11. 
Dec 16 13:04:06.696088 kubelet[2758]: E1216 13:04:06.695731 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:04:06.697338 kubelet[2758]: E1216 13:04:06.697303 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:04:07.594593 systemd[1]: Started sshd@11-77.42.23.34:22-139.178.89.65:55670.service - OpenSSH per-connection server daemon (139.178.89.65:55670). 
Dec 16 13:04:07.698980 containerd[1550]: time="2025-12-16T13:04:07.698947520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 13:04:08.151278 containerd[1550]: time="2025-12-16T13:04:08.151029507Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:08.152552 containerd[1550]: time="2025-12-16T13:04:08.152465938Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 13:04:08.152907 containerd[1550]: time="2025-12-16T13:04:08.152720787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=69" Dec 16 13:04:08.153969 kubelet[2758]: E1216 13:04:08.153312 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:04:08.153969 kubelet[2758]: E1216 13:04:08.153374 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 13:04:08.153969 kubelet[2758]: E1216 13:04:08.153466 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve 
reference \"ghcr.io/flatcar/calico/csi:v3.30.4\": ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:08.155682 containerd[1550]: time="2025-12-16T13:04:08.155650246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 13:04:08.583754 containerd[1550]: time="2025-12-16T13:04:08.583634683Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:08.584900 containerd[1550]: time="2025-12-16T13:04:08.584770760Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 13:04:08.585048 containerd[1550]: time="2025-12-16T13:04:08.585015580Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=93" Dec 16 13:04:08.585259 kubelet[2758]: E1216 13:04:08.585178 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 13:04:08.585259 kubelet[2758]: E1216 13:04:08.585215 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" 
Dec 16 13:04:08.585398 kubelet[2758]: E1216 13:04:08.585277 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-ffzww_calico-system(e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:08.585398 kubelet[2758]: E1216 13:04:08.585310 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:04:08.620242 sshd[4983]: Accepted publickey for core from 139.178.89.65 port 55670 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:08.622582 sshd-session[4983]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:08.631686 systemd-logind[1534]: New session 12 of user core. Dec 16 13:04:08.637984 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 13:04:08.695742 kubelet[2758]: E1216 13:04:08.695584 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:04:09.391805 sshd[4986]: Connection closed by 139.178.89.65 port 55670 Dec 16 13:04:09.393362 sshd-session[4983]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:09.398462 systemd[1]: sshd@11-77.42.23.34:22-139.178.89.65:55670.service: Deactivated successfully. Dec 16 13:04:09.401220 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 13:04:09.402947 systemd-logind[1534]: Session 12 logged out. Waiting for processes to exit. Dec 16 13:04:09.405152 systemd-logind[1534]: Removed session 12. 
Dec 16 13:04:13.696220 containerd[1550]: time="2025-12-16T13:04:13.696095817Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 13:04:14.144102 containerd[1550]: time="2025-12-16T13:04:14.141637326Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:14.145341 containerd[1550]: time="2025-12-16T13:04:14.145253273Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 13:04:14.145341 containerd[1550]: time="2025-12-16T13:04:14.145322604Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=77" Dec 16 13:04:14.145638 kubelet[2758]: E1216 13:04:14.145568 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:04:14.145638 kubelet[2758]: E1216 13:04:14.145606 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 13:04:14.147834 kubelet[2758]: E1216 13:04:14.146033 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-2pnsr_calico-system(c38f2008-1a82-45d6-8890-7ece4b855117): ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:14.147834 kubelet[2758]: E1216 13:04:14.146351 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:04:14.565002 systemd[1]: Started sshd@12-77.42.23.34:22-139.178.89.65:34416.service - OpenSSH per-connection server daemon (139.178.89.65:34416). Dec 16 13:04:14.707395 containerd[1550]: time="2025-12-16T13:04:14.707335794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 13:04:15.146870 containerd[1550]: time="2025-12-16T13:04:15.146748782Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:15.149652 containerd[1550]: time="2025-12-16T13:04:15.149413780Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 13:04:15.149652 containerd[1550]: time="2025-12-16T13:04:15.149493751Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=85" Dec 16 13:04:15.149810 kubelet[2758]: E1216 13:04:15.149706 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:04:15.149810 kubelet[2758]: E1216 13:04:15.149750 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 13:04:15.149810 kubelet[2758]: E1216 13:04:15.149809 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-5f8d9d9fb6-xwnn4_calico-system(71bd25bd-53fb-4224-a114-8010f8dec502): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:15.149810 kubelet[2758]: E1216 13:04:15.149862 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:04:15.572463 sshd[5002]: Accepted publickey for core from 139.178.89.65 port 34416 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 
16 13:04:15.572249 sshd-session[5002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:15.577996 systemd-logind[1534]: New session 13 of user core. Dec 16 13:04:15.582961 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 13:04:16.328382 sshd[5005]: Connection closed by 139.178.89.65 port 34416 Dec 16 13:04:16.330111 sshd-session[5002]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:16.333971 systemd[1]: sshd@12-77.42.23.34:22-139.178.89.65:34416.service: Deactivated successfully. Dec 16 13:04:16.335698 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 13:04:16.339500 systemd-logind[1534]: Session 13 logged out. Waiting for processes to exit. Dec 16 13:04:16.340560 systemd-logind[1534]: Removed session 13. Dec 16 13:04:17.699008 containerd[1550]: time="2025-12-16T13:04:17.698809351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:04:18.135980 containerd[1550]: time="2025-12-16T13:04:18.135942632Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:18.137520 containerd[1550]: time="2025-12-16T13:04:18.137374202Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:04:18.137520 containerd[1550]: time="2025-12-16T13:04:18.137420218Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 13:04:18.137640 kubelet[2758]: E1216 13:04:18.137604 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:04:18.138152 kubelet[2758]: E1216 13:04:18.137643 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:04:18.138152 kubelet[2758]: E1216 13:04:18.137704 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-5hqtj_calico-apiserver(e556d54d-017a-4277-b8ea-32b10093992b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:18.138152 kubelet[2758]: E1216 13:04:18.137754 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:04:18.697364 containerd[1550]: time="2025-12-16T13:04:18.697283109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 13:04:19.148855 containerd[1550]: time="2025-12-16T13:04:19.148763301Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:19.150433 containerd[1550]: 
time="2025-12-16T13:04:19.150276986Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 13:04:19.150433 containerd[1550]: time="2025-12-16T13:04:19.150294228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=73" Dec 16 13:04:19.150943 kubelet[2758]: E1216 13:04:19.150773 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:04:19.150943 kubelet[2758]: E1216 13:04:19.150891 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 13:04:19.151326 kubelet[2758]: E1216 13:04:19.150994 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.4\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:19.153009 containerd[1550]: time="2025-12-16T13:04:19.152950670Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 13:04:19.591999 containerd[1550]: time="2025-12-16T13:04:19.591897482Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:19.592941 containerd[1550]: time="2025-12-16T13:04:19.592894576Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 13:04:19.593011 containerd[1550]: time="2025-12-16T13:04:19.592971741Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=85" Dec 16 13:04:19.593164 kubelet[2758]: E1216 13:04:19.593099 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:04:19.593164 kubelet[2758]: E1216 13:04:19.593145 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 13:04:19.593358 kubelet[2758]: E1216 13:04:19.593206 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-57f7c6c79b-6gqd8_calico-system(e5c12796-2fe4-496b-bdb1-f696ef061ec9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:19.593358 kubelet[2758]: E1216 13:04:19.593239 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:04:20.695208 containerd[1550]: time="2025-12-16T13:04:20.695175328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 13:04:21.169777 containerd[1550]: time="2025-12-16T13:04:21.169655658Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 13:04:21.171289 containerd[1550]: time="2025-12-16T13:04:21.171158662Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 13:04:21.171289 containerd[1550]: time="2025-12-16T13:04:21.171250896Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=77" Dec 16 
13:04:21.171527 kubelet[2758]: E1216 13:04:21.171425 2758 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:04:21.171527 kubelet[2758]: E1216 13:04:21.171515 2758 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 13:04:21.172372 kubelet[2758]: E1216 13:04:21.171608 2758 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-57d769ccb6-w5zs6_calico-apiserver(fc7f9104-6597-40ee-97eb-e334f4aba79b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve reference \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 13:04:21.172372 kubelet[2758]: E1216 13:04:21.171650 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:04:21.501680 systemd[1]: Started sshd@13-77.42.23.34:22-139.178.89.65:33894.service - OpenSSH 
per-connection server daemon (139.178.89.65:33894). Dec 16 13:04:22.513710 sshd[5049]: Accepted publickey for core from 139.178.89.65 port 33894 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:22.515551 sshd-session[5049]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:22.519844 systemd-logind[1534]: New session 14 of user core. Dec 16 13:04:22.525072 systemd[1]: Started session-14.scope - Session 14 of User core. Dec 16 13:04:22.697461 kubelet[2758]: E1216 13:04:22.697409 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:04:23.277563 sshd[5066]: Connection closed by 139.178.89.65 port 33894 Dec 16 13:04:23.280186 sshd-session[5049]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:23.285365 systemd[1]: sshd@13-77.42.23.34:22-139.178.89.65:33894.service: Deactivated successfully. Dec 16 13:04:23.286994 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 13:04:23.287793 systemd-logind[1534]: Session 14 logged out. 
Waiting for processes to exit. Dec 16 13:04:23.289573 systemd-logind[1534]: Removed session 14. Dec 16 13:04:23.444494 systemd[1]: Started sshd@14-77.42.23.34:22-139.178.89.65:33908.service - OpenSSH per-connection server daemon (139.178.89.65:33908). Dec 16 13:04:24.447737 sshd[5078]: Accepted publickey for core from 139.178.89.65 port 33908 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:24.448426 sshd-session[5078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:24.453271 systemd-logind[1534]: New session 15 of user core. Dec 16 13:04:24.458084 systemd[1]: Started session-15.scope - Session 15 of User core. Dec 16 13:04:25.424651 sshd[5081]: Connection closed by 139.178.89.65 port 33908 Dec 16 13:04:25.425634 sshd-session[5078]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:25.431534 systemd-logind[1534]: Session 15 logged out. Waiting for processes to exit. Dec 16 13:04:25.431969 systemd[1]: sshd@14-77.42.23.34:22-139.178.89.65:33908.service: Deactivated successfully. Dec 16 13:04:25.433567 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 13:04:25.435514 systemd-logind[1534]: Removed session 15. Dec 16 13:04:25.630733 systemd[1]: Started sshd@15-77.42.23.34:22-139.178.89.65:33920.service - OpenSSH per-connection server daemon (139.178.89.65:33920). 
Dec 16 13:04:26.695939 kubelet[2758]: E1216 13:04:26.695140 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:04:26.695939 kubelet[2758]: E1216 13:04:26.695186 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:04:26.750525 sshd[5091]: Accepted publickey for core from 139.178.89.65 port 33920 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:26.752146 sshd-session[5091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:26.760298 systemd-logind[1534]: New session 16 of user core. Dec 16 13:04:26.766979 systemd[1]: Started session-16.scope - Session 16 of User core. 
Dec 16 13:04:28.214840 sshd[5094]: Connection closed by 139.178.89.65 port 33920 Dec 16 13:04:28.218488 sshd-session[5091]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:28.224671 systemd[1]: sshd@15-77.42.23.34:22-139.178.89.65:33920.service: Deactivated successfully. Dec 16 13:04:28.227177 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 13:04:28.228333 systemd-logind[1534]: Session 16 logged out. Waiting for processes to exit. Dec 16 13:04:28.230970 systemd-logind[1534]: Removed session 16. Dec 16 13:04:28.367120 systemd[1]: Started sshd@16-77.42.23.34:22-139.178.89.65:33934.service - OpenSSH per-connection server daemon (139.178.89.65:33934). Dec 16 13:04:29.366325 sshd[5109]: Accepted publickey for core from 139.178.89.65 port 33934 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:29.368581 sshd-session[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:29.373501 systemd-logind[1534]: New session 17 of user core. Dec 16 13:04:29.380077 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 13:04:30.397673 sshd[5114]: Connection closed by 139.178.89.65 port 33934 Dec 16 13:04:30.400052 sshd-session[5109]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:30.407314 systemd-logind[1534]: Session 17 logged out. Waiting for processes to exit. Dec 16 13:04:30.408245 systemd[1]: sshd@16-77.42.23.34:22-139.178.89.65:33934.service: Deactivated successfully. Dec 16 13:04:30.411265 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 13:04:30.418266 systemd-logind[1534]: Removed session 17. Dec 16 13:04:30.609255 systemd[1]: Started sshd@17-77.42.23.34:22-139.178.89.65:40180.service - OpenSSH per-connection server daemon (139.178.89.65:40180). 
Dec 16 13:04:31.727227 kubelet[2758]: E1216 13:04:31.725224 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9" Dec 16 13:04:31.735702 sshd[5124]: Accepted publickey for core from 139.178.89.65 port 40180 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:31.737190 sshd-session[5124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:31.755554 systemd-logind[1534]: New session 18 of user core. Dec 16 13:04:31.760924 systemd[1]: Started session-18.scope - Session 18 of User core. Dec 16 13:04:32.575599 sshd[5127]: Connection closed by 139.178.89.65 port 40180 Dec 16 13:04:32.576343 sshd-session[5124]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:32.581523 systemd[1]: sshd@17-77.42.23.34:22-139.178.89.65:40180.service: Deactivated successfully. Dec 16 13:04:32.584595 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 13:04:32.586516 systemd-logind[1534]: Session 18 logged out. Waiting for processes to exit. 
Dec 16 13:04:32.588576 systemd-logind[1534]: Removed session 18. Dec 16 13:04:33.697868 kubelet[2758]: E1216 13:04:33.697310 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:04:33.697868 kubelet[2758]: E1216 13:04:33.697788 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b" Dec 16 13:04:36.697398 kubelet[2758]: E1216 13:04:36.697317 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb" Dec 16 13:04:37.696552 kubelet[2758]: E1216 13:04:37.696491 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502" Dec 16 13:04:37.760858 systemd[1]: Started sshd@18-77.42.23.34:22-139.178.89.65:40196.service - OpenSSH per-connection server daemon (139.178.89.65:40196). Dec 16 13:04:38.874162 sshd[5141]: Accepted publickey for core from 139.178.89.65 port 40196 ssh2: RSA SHA256:ZUC5+jwMPGmdjOY75CPCzVYpIXnBtNPXtAIGEYlroCc Dec 16 13:04:38.875779 sshd-session[5141]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 13:04:38.881437 systemd-logind[1534]: New session 19 of user core. Dec 16 13:04:38.887992 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 13:04:39.700976 sshd[5146]: Connection closed by 139.178.89.65 port 40196 Dec 16 13:04:39.701914 sshd-session[5141]: pam_unix(sshd:session): session closed for user core Dec 16 13:04:39.706251 systemd-logind[1534]: Session 19 logged out. Waiting for processes to exit. 
Dec 16 13:04:39.707057 systemd[1]: sshd@18-77.42.23.34:22-139.178.89.65:40196.service: Deactivated successfully. Dec 16 13:04:39.710663 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 13:04:39.715732 systemd-logind[1534]: Removed session 19. Dec 16 13:04:41.699208 kubelet[2758]: E1216 13:04:41.698597 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117" Dec 16 13:04:44.696213 kubelet[2758]: E1216 13:04:44.696125 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b" Dec 16 13:04:46.696764 kubelet[2758]: E1216 13:04:46.696680 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-57f7c6c79b-6gqd8" podUID="e5c12796-2fe4-496b-bdb1-f696ef061ec9"
Dec 16 13:04:48.695008 kubelet[2758]: E1216 13:04:48.694937 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-w5zs6" podUID="fc7f9104-6597-40ee-97eb-e334f4aba79b"
Dec 16 13:04:48.695008 kubelet[2758]: E1216 13:04:48.694989 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5f8d9d9fb6-xwnn4" podUID="71bd25bd-53fb-4224-a114-8010f8dec502"
Dec 16 13:04:50.696338 kubelet[2758]: E1216 13:04:50.696231 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-ffzww" podUID="e441e8ab-0689-4cb9-bcbd-f66c0c3dbffb"
Dec 16 13:04:55.157650 kubelet[2758]: E1216 13:04:55.157594 2758 controller.go:195] "Failed to update lease" err="Put \"https://77.42.23.34:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-2-4-07f930e259?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 16 13:04:55.573880 systemd[1]: cri-containerd-dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab.scope: Deactivated successfully.
Dec 16 13:04:55.576111 systemd[1]: cri-containerd-dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab.scope: Consumed 23.788s CPU time, 118M memory peak, 41.8M read from disk.
Dec 16 13:04:55.667003 kubelet[2758]: E1216 13:04:55.666797 2758 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:50664->10.0.0.2:2379: read: connection timed out"
Dec 16 13:04:55.744748 containerd[1550]: time="2025-12-16T13:04:55.744641369Z" level=info msg="received container exit event container_id:\"dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab\" id:\"dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab\" pid:3080 exit_status:1 exited_at:{seconds:1765890295 nanos:582498973}"
Dec 16 13:04:55.821128 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab-rootfs.mount: Deactivated successfully.
Dec 16 13:04:56.205101 systemd[1]: cri-containerd-f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25.scope: Deactivated successfully.
Dec 16 13:04:56.205510 systemd[1]: cri-containerd-f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25.scope: Consumed 3.048s CPU time, 84M memory peak, 45.3M read from disk.
Dec 16 13:04:56.211948 containerd[1550]: time="2025-12-16T13:04:56.211814052Z" level=info msg="received container exit event container_id:\"f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25\" id:\"f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25\" pid:2613 exit_status:1 exited_at:{seconds:1765890296 nanos:211468282}"
Dec 16 13:04:56.250497 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25-rootfs.mount: Deactivated successfully.
Dec 16 13:04:56.323206 kubelet[2758]: I1216 13:04:56.323155 2758 scope.go:117] "RemoveContainer" containerID="f284aaea70f2a1195ebff833c0a2c77e97febf5f381f4eb6f3316c8693c21b25"
Dec 16 13:04:56.324111 kubelet[2758]: I1216 13:04:56.324067 2758 scope.go:117] "RemoveContainer" containerID="dbf1edeb4fdfd8df528d8c2d4172d9e3613d3928bdb99e9fa0e1a226391e24ab"
Dec 16 13:04:56.355915 containerd[1550]: time="2025-12-16T13:04:56.354138858Z" level=info msg="CreateContainer within sandbox \"d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Dec 16 13:04:56.356184 containerd[1550]: time="2025-12-16T13:04:56.356005764Z" level=info msg="CreateContainer within sandbox \"641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 16 13:04:56.414452 containerd[1550]: time="2025-12-16T13:04:56.414387958Z" level=info msg="Container 705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:04:56.417293 containerd[1550]: time="2025-12-16T13:04:56.417110821Z" level=info msg="Container 06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929: CDI devices from CRI Config.CDIDevices: []"
Dec 16 13:04:56.429882 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4047049229.mount: Deactivated successfully.
Dec 16 13:04:56.506141 containerd[1550]: time="2025-12-16T13:04:56.506008128Z" level=info msg="CreateContainer within sandbox \"641d1fcc7550314836d0946b4fa97503641f81d017b919dd18d36b130cf99310\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929\""
Dec 16 13:04:56.508480 containerd[1550]: time="2025-12-16T13:04:56.508331571Z" level=info msg="StartContainer for \"06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929\""
Dec 16 13:04:56.511536 containerd[1550]: time="2025-12-16T13:04:56.510422838Z" level=info msg="CreateContainer within sandbox \"d64d5f398304db34c8b57a051323ae10fa12f8290c5d47a60f4b9efc1716af68\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b\""
Dec 16 13:04:56.512872 containerd[1550]: time="2025-12-16T13:04:56.512804731Z" level=info msg="StartContainer for \"705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b\""
Dec 16 13:04:56.517120 containerd[1550]: time="2025-12-16T13:04:56.517061144Z" level=info msg="connecting to shim 705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b" address="unix:///run/containerd/s/7a1c025d8923a345aafadf21651f8cd25461f7d8d814c2df344f5c34b4e2ecaf" protocol=ttrpc version=3
Dec 16 13:04:56.530942 containerd[1550]: time="2025-12-16T13:04:56.530867631Z" level=info msg="connecting to shim 06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929" address="unix:///run/containerd/s/0a38877b26eeb0159b007e767dd0115ffbb359d281c7a398f58441ce21b13cf8" protocol=ttrpc version=3
Dec 16 13:04:56.562293 systemd[1]: Started cri-containerd-705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b.scope - libcontainer container 705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b.
Dec 16 13:04:56.579337 systemd[1]: Started cri-containerd-06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929.scope - libcontainer container 06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929.
Dec 16 13:04:56.639830 containerd[1550]: time="2025-12-16T13:04:56.639735980Z" level=info msg="StartContainer for \"705ac18a15cba55fffad3eb8f59e6f810db7c04a64344f0f1fa51ca7b15b9e8b\" returns successfully"
Dec 16 13:04:56.649027 containerd[1550]: time="2025-12-16T13:04:56.648961264Z" level=info msg="StartContainer for \"06e298e400e5ef350ab5abc7b75986678970113bfb369826b73ca1b0a5c8b929\" returns successfully"
Dec 16 13:04:56.696919 kubelet[2758]: E1216 13:04:56.696862 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-2pnsr" podUID="c38f2008-1a82-45d6-8890-7ece4b855117"
Dec 16 13:04:57.696101 kubelet[2758]: E1216 13:04:57.695792 2758 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-57d769ccb6-5hqtj" podUID="e556d54d-017a-4277-b8ea-32b10093992b"
Dec 16 13:05:00.661143 systemd[1]: cri-containerd-9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419.scope: Deactivated successfully.
Dec 16 13:05:00.661412 systemd[1]: cri-containerd-9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419.scope: Consumed 1.721s CPU time, 38.8M memory peak, 27.7M read from disk.
Dec 16 13:05:00.664655 containerd[1550]: time="2025-12-16T13:05:00.664614429Z" level=info msg="received container exit event container_id:\"9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419\" id:\"9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419\" pid:2600 exit_status:1 exited_at:{seconds:1765890300 nanos:664230117}"
Dec 16 13:05:00.684722 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9cf70de4ca81a87b886b07eef471fb2edd9fb4fd00c1e6d497e275c6e1659419-rootfs.mount: Deactivated successfully.